Prosthetic limbs are designed to move as naturally as human limbs. To reduce the sense of abnormality when moving prosthetic limbs, previous studies have altered the appearance of robotic limbs and the speed of the motors inside them. Using the wearer's emotions to control prosthetic limbs would allow robotic arms and legs to move more naturally. Recently, electroencephalograms (EEGs) have been used to estimate human emotions. In this study, we developed an emotion analyzer and applied it to a robotic arm system that changes its movement based on the user's emotions. In the proposed system, EEG data are measured with a simplified EEG headset and transferred to an iPad. An iPad application then converts the signal into a score for the 'wakuwaku' (excited) feeling using band-pass filtering, the fast Fourier transform, and a support vector machine. The algorithm estimated the wakuwaku feeling with an accuracy of 75.10%, and the robotic arm moved correctly based on the user's emotion estimated from the EEG data. Our findings are expected to open a new field of study focused on controlling robotic arms with human emotions.
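The pipeline described above (band-pass filtering, FFT-based feature extraction, and SVM classification) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, frequency bands, epoch length, and synthetic training data are all assumptions for the example, and the band-pass step is done by zeroing FFT bins rather than by whatever filter the original system used.

```python
import numpy as np
from sklearn.svm import SVC

FS = 256  # assumed sampling rate in Hz (not stated in the abstract)

def bandpass_fft(signal, low=4.0, high=40.0, fs=FS):
    """Band-pass filter by zeroing FFT bins outside [low, high] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

def band_power_features(signal, fs=FS):
    """Mean spectral power in standard EEG bands (theta, alpha, beta)."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    bands = [(4, 8), (8, 13), (13, 30)]  # assumed band boundaries in Hz
    return np.array([power[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands])

# Synthetic 2-second epochs standing in for real EEG recordings:
# label 1 ("wakuwaku") is beta-dominant, label 0 is alpha-dominant.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
alpha = np.sin(2 * np.pi * 10 * t)   # 10 Hz component
beta = np.sin(2 * np.pi * 20 * t)    # 20 Hz component
X, y = [], []
for _ in range(40):
    for label, sig in ((0, 3 * alpha + beta), (1, alpha + 3 * beta)):
        epoch = sig + 0.5 * rng.standard_normal(len(t))
        X.append(band_power_features(bandpass_fft(epoch)))
        y.append(label)

# Train the SVM on the band-power features and report training accuracy.
clf = SVC(kernel="linear").fit(X, y)
print(clf.score(X, y))
```

In a deployed system the predicted label (or a decision-function score) would be forwarded to the robot-arm controller, which selects a movement profile accordingly; the thresholds and motion mapping here are left out because the abstract does not specify them.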