Face Commands - User-Defined Facial Gestures for Smart Glasses

Katsutoshi Masai, Kai Kunze, Daisuke Sakamoto, Yuta Sugiura, Maki Sugimoto

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose the use of face-related gestures involving the movement of the face, eyes, and head for augmented reality (AR). This technique allows us to use computer systems via hands-free, discreet interactions. In this paper, we present an elicitation study to explore the proper use of facial gestures for daily tasks in the context of a smart home. We used Amazon Mechanical Turk to conduct this study (N=37). Based on the proposed gestures, we report usage scenarios and complexity, proposed associations between gestures/tasks, a user-defined gesture set, and insights from the participants. We also conducted a technical feasibility study (N=13) with participants using smart eyewear to consider their uses in daily life. The device has 16 optical sensors and an inertial measurement unit (IMU). We can potentially integrate the system into optical see-through displays or other smart glasses. The results demonstrate that the device can detect eight temporal face-related gestures with a mean F1 score of 0.911 using a convolutional neural network (CNN). We also report the results of user-independent training and a one-hour recording of the experimenter testing two of the gestures.
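The abstract reports recognition accuracy as a mean F1 score (0.911) over eight temporal face-related gestures. As a minimal sketch of how such a per-class mean F1 can be computed in plain Python (the abstract does not state whether macro or micro averaging was used, so macro averaging over classes is an assumption here, and `macro_f1` is an illustrative helper, not the authors' code):

```python
def macro_f1(y_true, y_pred, labels):
    """Macro-averaged F1: compute precision/recall/F1 per gesture
    class, then take the unweighted mean across classes."""
    scores = []
    for c in labels:
        # Count true positives, false positives, false negatives for class c.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        scores.append(f1)
    return sum(scores) / len(scores)
```

In the paper's setting, `y_true` and `y_pred` would be the ground-truth and CNN-predicted labels over the eight gesture classes.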

Original language: English
Title of host publication: Proceedings - 2020 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 374-386
Number of pages: 13
ISBN (Electronic): 9781728185088
DOIs
Publication status: Published - Nov 2020
Event: 19th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2020 - Virtual, Recife/Porto de Galinhas, Brazil
Duration: 9 Nov 2020 to 13 Nov 2020

Publication series

Name: Proceedings - 2020 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2020

Conference

Conference: 19th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2020
Country: Brazil
City: Virtual, Recife/Porto de Galinhas
Period: 9 Nov 2020 to 13 Nov 2020

Keywords

  • Human-centered computing
  • Interaction techniques
  • Ubiquitous and mobile computing design and evaluation methods

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Software

