An EEG-based robot arm control to express human emotions

Mikito Ogino, Yasue Mitsukura

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Prosthetic limbs are designed to move naturally like human limbs. To reduce the feeling of abnormality when moving prosthetic limbs, previous studies have altered the appearance of robotic limbs along with the speed of the motors inside them. Using the wearer's emotions to control prosthetic limbs would allow robotic arms and legs to move more naturally. Recently, electroencephalograms (EEGs) have been used to estimate human emotions. In this study, we developed an emotion analyzer and applied it in a robotic arm system that changes its movement based on the user's emotions. In the proposed system, EEG data are measured by a simplified EEG headset and transferred to an iPad. An iPad application then transforms the signal into a 'wakuwaku' (excitement) score using band-pass filtering, the fast Fourier transform, and a support vector machine. The accuracy of the algorithm in estimating the wakuwaku feeling was 75.10%, and the robot arm moved accurately based on the user's emotion as estimated from the EEG data. Our findings are expected to lead to a new field of study focused on controlling robot arms using human emotions.
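The abstract describes a band-pass filter → FFT → support vector machine pipeline for scoring the 'wakuwaku' feeling. A minimal sketch of such a pipeline is shown below; the sampling rate, filter band, EEG band-power features, synthetic data, and classifier settings are all illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 256  # sampling rate in Hz (assumed; the headset's actual rate is not given here)

def bandpass(x, lo=1.0, hi=40.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter (illustrative 1-40 Hz band)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def band_powers(x, fs=FS):
    """FFT-based power in the classic EEG bands: delta, theta, alpha, beta."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
    return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

def features(epoch):
    return band_powers(bandpass(epoch))

# Synthetic demo data: "wakuwaku" epochs carry extra beta-band power
# (a pure assumption made so the toy classifier has something to learn).
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS  # 2-second epochs

def make_epoch(wakuwaku):
    x = rng.normal(0, 1, t.size)              # background noise
    x += 2.0 * np.sin(2 * np.pi * 10 * t)     # alpha component in every epoch
    if wakuwaku:
        x += 2.0 * np.sin(2 * np.pi * 20 * t) # extra beta component when excited
    return x

X = np.array([features(make_epoch(i % 2 == 1)) for i in range(40)])
y = np.array([i % 2 for i in range(40)])  # 0 = neutral, 1 = wakuwaku

# SVC stands in for the paper's support vector machine step.
clf = SVC(kernel="rbf", gamma="scale").fit(X[:30], y[:30])
acc = clf.score(X[30:], y[30:])
print(f"held-out accuracy: {acc:.2f}")
```

On this cleanly separable toy data the classifier scores near 100%; the paper's reported 75.10% reflects real EEG, which is far noisier than this sketch.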

Original language: English
Title of host publication: Proceedings - 2018 IEEE 15th International Workshop on Advanced Motion Control, AMC 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 322-327
Number of pages: 6
ISBN (Electronic): 9781538619469
DOI: 10.1109/AMC.2019.8371111
Publication status: Published - 2018 Jun 1
Event: 15th IEEE International Workshop on Advanced Motion Control, AMC 2018 - Tokyo, Japan
Duration: 2018 Mar 9 - 2018 Mar 11



Keywords

  • Brain
  • Electroencephalogram
  • Emotion
  • Robot arm

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Mechanical Engineering
  • Control and Optimization

Cite this

Ogino, M., & Mitsukura, Y. (2018). An EEG-based robot arm control to express human emotions. In Proceedings - 2018 IEEE 15th International Workshop on Advanced Motion Control, AMC 2018 (pp. 322-327). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/AMC.2019.8371111
