TY - GEN
T1 - An EEG-based robot arm control to express human emotions
AU - Ogino, Mikito
AU - Mitsukura, Yasue
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/6/1
Y1 - 2018/6/1
N2 - Prosthetic limbs are designed to move naturally, like human limbs. To reduce the sense of abnormality when moving prosthetic limbs, previous studies have altered the appearance of robotic limbs and the speed of the motors inside them. Using the wearer's emotions to control prosthetic limbs would allow robotic arms and legs to move more naturally. Recently, electroencephalograms (EEGs) have been used to estimate human emotions. In this study, we developed an emotion analyzer and applied it in a robotic arm system that changes its movement based on the user's emotions. In the proposed system, EEG data are measured by a simplified EEG headset and transferred to an iPad. An iPad application then transforms the signal into a value of the 'wakuwaku' (excited) feeling using band-pass filtering, a fast Fourier transform, and a support vector machine. The algorithm estimated the wakuwaku feeling with an accuracy of 75.10%, and the robot arm moved accurately based on the user's emotion estimated from the EEG data. Our findings are expected to lead to a new field of study focused on controlling robot arms using human emotions.
KW - Brain
KW - Electroencephalogram
KW - Emotion
KW - Robot arm
UR - http://www.scopus.com/inward/record.url?scp=85048775176&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85048775176&partnerID=8YFLogxK
U2 - 10.1109/AMC.2019.8371111
DO - 10.1109/AMC.2019.8371111
M3 - Conference contribution
AN - SCOPUS:85048775176
T3 - Proceedings - 2018 IEEE 15th International Workshop on Advanced Motion Control, AMC 2018
SP - 322
EP - 327
BT - Proceedings - 2018 IEEE 15th International Workshop on Advanced Motion Control, AMC 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 15th IEEE International Workshop on Advanced Motion Control, AMC 2018
Y2 - 9 March 2018 through 11 March 2018
ER -