An EEG-based robot arm control to express human emotions

Mikito Ogino, Yasue Mitsukura

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)


Prosthetic limbs are designed to move naturally like human limbs. To reduce the sense of unnaturalness when a prosthetic limb moves, previous studies have altered the appearance of robotic limbs and the speed of the motors inside them. Using the wearer's emotions to control prosthetic limbs would allow robotic arms and legs to move more naturally. Recently, electroencephalograms (EEGs) have been used to estimate human emotions. In this study, we developed an emotion analyzer and applied it in a robotic arm system that changes its movement based on the user's emotions. In the proposed system, EEG data are measured by a simplified EEG headset and transferred to an iPad. An iPad application then transforms the signal into a value of the 'wakuwaku' (excitement) feeling using band-pass filtering, a fast Fourier transform, and a support vector machine. The algorithm estimated the wakuwaku feeling with an accuracy of 75.10%, and the robot arm moved correctly based on the user's emotion estimated from the EEG data. Our findings are expected to open a new field of study focused on controlling robot arms using human emotions.
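The abstract describes a three-stage signal pipeline: band-pass filtering, FFT-based feature extraction, and SVM classification. A minimal sketch of such a pipeline is shown below; the sampling rate, filter cutoffs, frequency bands, and SVM kernel are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 256  # sampling rate in Hz -- assumed; the paper's headset rate is not given here


def bandpass(epoch, low=1.0, high=40.0, fs=FS, order=4):
    """Zero-phase band-pass filter for one EEG epoch (cutoffs are illustrative)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, epoch)


def band_powers(epoch, fs=FS):
    """Mean FFT power in standard EEG bands, used as classifier features."""
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return np.array(
        [spectrum[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands.values()]
    )


def train_classifier(epochs, labels):
    """Fit an SVM on labeled epochs (label 1 = 'wakuwaku', 0 = neutral)."""
    X = np.array([band_powers(bandpass(e)) for e in epochs])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    return clf
```

In a setup like the one described, the trained classifier's output for each incoming epoch would then be forwarded to the robot arm controller to modulate its movement.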

Original language: English
Title of host publication: Proceedings - 2018 IEEE 15th International Workshop on Advanced Motion Control, AMC 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781538619469
Publication status: Published - 2018 Jun 1
Event: 15th IEEE International Workshop on Advanced Motion Control, AMC 2018 - Tokyo, Japan
Duration: 2018 Mar 9 - 2018 Mar 11




Keywords

  • Brain
  • Electroencephalogram
  • Emotion
  • Robot arm

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Mechanical Engineering
  • Control and Optimization


