TY - GEN
T1 - Eye-based Interaction Using Embedded Optical Sensors on an Eyewear Device for Facial Expression Recognition
AU - Masai, Katsutoshi
AU - Kunze, Kai
AU - Sugimoto, Maki
N1 - Funding Information:
This research was partially supported by JST CREST JPMJCR14E1 and a Keio KLL research grant.
Publisher Copyright:
© 2020 ACM.
PY - 2020/3/16
Y1 - 2020/3/16
AB - Non-verbal information is essential for understanding intentions and emotions and for facilitating social interaction between humans, and between humans and computers. One reliable source of such information is the eyes. We investigated eye-based interaction (recognizing eye gestures and eye movements) using an eyewear device designed for facial expression recognition. The device incorporates 16 low-cost optical sensors and allows hands-free interaction in many situations. Using the device, we evaluated three eye-based interactions. First, we evaluated the accuracy of eye gesture detection with nine participants, using dynamic time warping (DTW) for gesture recognition; the average accuracy for seven different eye gestures was 89.1% with user-dependent training. Second, we evaluated the accuracy of eye gaze position estimation with five users holding a neutral face; the system showed potential to track the approximate direction of the eyes, with higher accuracy for the y position than the x position. Finally, we conducted a feasibility study in which one user read jokes while wearing the device. The system was capable of analyzing facial expressions and eye movements in daily contexts.
KW - eyewear computing
KW - gaze gesture
KW - wearable computing
UR - http://www.scopus.com/inward/record.url?scp=85123042770&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85123042770&partnerID=8YFLogxK
U2 - 10.1145/3384657.3384787
DO - 10.1145/3384657.3384787
M3 - Conference contribution
AN - SCOPUS:85123042770
T3 - ACM International Conference Proceeding Series
BT - Proceedings of the Augmented Humans International Conference, AHs 2020
PB - Association for Computing Machinery
T2 - 2020 Augmented Humans International Conference, AHs 2020
Y2 - 16 March 2020 through 17 March 2020
ER -