Recognition and mapping of facial expressions to avatar by embedded photo reflective sensors in head mounted display

Katsuhiro Suzuki, Fumihiko Nakamura, Jiu Otsuka, Katsutoshi Masai, Yuta Itoh, Yuta Sugiura, Maki Sugimoto

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Citations (Scopus)

Abstract

We propose a facial expression mapping technology between virtual avatars and Head-Mounted Display (HMD) users. HMDs allow people to enjoy an immersive Virtual Reality (VR) experience, and a virtual avatar can represent the user in the virtual environment. However, synchronization of the virtual avatar's expressions with those of the HMD user is limited. The major problem with wearing an HMD is that a large portion of the user's face is occluded, making facial recognition difficult in an HMD-based virtual environment. To overcome this problem, we propose a facial expression mapping technology using retro-reflective photoelectric sensors. The sensors, attached inside the HMD, measure the distance between the sensors and the user's face. The distance values of five basic facial expressions (Neutral, Happy, Angry, Surprised, and Sad) are used to train a neural network that estimates the facial expression of a user. We achieved an overall accuracy of 88% in recognizing the facial expressions. Our system can also reproduce facial expression changes in real time on an existing avatar using regression. Consequently, our system enables estimation and reconstruction of facial expressions that correspond to the user's emotional changes.
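The recognition pipeline the abstract describes (per-sensor distance readings → small neural-network classifier over the five expression labels) can be sketched as follows. The sensor count, the synthetic distance data, and the network size are illustrative assumptions for this sketch, not the paper's actual hardware layout or architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
EXPRESSIONS = ["Neutral", "Happy", "Angry", "Surprised", "Sad"]
N_SENSORS, N_CLASSES = 16, len(EXPRESSIONS)  # sensor count is an assumption

# Synthetic stand-in for sensor data: each expression gets a characteristic
# mean sensor-to-skin distance profile (mm), perturbed by reading noise.
prototypes = rng.uniform(5.0, 15.0, size=(N_CLASSES, N_SENSORS))

def sample(n_per_class):
    X = np.vstack([p + rng.normal(0.0, 0.5, (n_per_class, N_SENSORS))
                   for p in prototypes])
    y = np.repeat(np.arange(N_CLASSES), n_per_class)
    return X, y

X_train, y_train = sample(200)
X_test, y_test = sample(50)

# Standardize inputs, then train a one-hidden-layer softmax network
# with plain batch gradient descent.
mu, sd = X_train.mean(0), X_train.std(0)
Xn = (X_train - mu) / sd
Y = np.eye(N_CLASSES)[y_train]          # one-hot targets

H, lr = 32, 0.5
W1 = rng.normal(0, 0.1, (N_SENSORS, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, N_CLASSES)); b2 = np.zeros(N_CLASSES)

for _ in range(300):
    h = np.tanh(Xn @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    g = (p - Y) / len(Xn)               # softmax cross-entropy gradient
    gh = (g @ W2.T) * (1 - h ** 2)      # backprop through tanh
    W2 -= lr * (h.T @ g);  b2 -= lr * g.sum(0)
    W1 -= lr * (Xn.T @ gh); b1 -= lr * gh.sum(0)

# Classify held-out readings.
h = np.tanh(((X_test - mu) / sd) @ W1 + b1)
pred = (h @ W2 + b2).argmax(axis=1)
accuracy = (pred == y_test).mean()
```

For the real-time avatar mapping the abstract also mentions, the same sensor inputs would feed a regression model whose continuous outputs drive avatar expression parameters, rather than a discrete classifier; the details of that regression are not given in this record.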

Original language: English
Title of host publication: 2017 IEEE Virtual Reality, VR 2017 - Proceedings
Publisher: IEEE Computer Society
Pages: 177-185
Number of pages: 9
ISBN (Electronic): 9781509066476
DOI: 10.1109/VR.2017.7892245
Publication status: Published - 2017 Apr 4
Event: 19th IEEE Virtual Reality, VR 2017 - Los Angeles, United States
Duration: 2017 Mar 18 - 2017 Mar 22



Keywords

  • H.5.m. [Information interfaces and presentation (e.g. HCI)]
  • Miscellaneous

ASJC Scopus subject areas

  • Engineering(all)

Cite this

Suzuki, K., Nakamura, F., Otsuka, J., Masai, K., Itoh, Y., Sugiura, Y., & Sugimoto, M. (2017). Recognition and mapping of facial expressions to avatar by embedded photo reflective sensors in head mounted display. In 2017 IEEE Virtual Reality, VR 2017 - Proceedings (pp. 177-185). [7892245] IEEE Computer Society. https://doi.org/10.1109/VR.2017.7892245
