Analysis of multiple users' experience in daily life using wearable device for facial expression recognition

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we present a wearable facial expression recognition system that can analyse and enhance daily experiences. Our aim is to create a mindful experience in daily life by connecting the device with everyday objects and services. To this end, we built two prototypes that support users in staying on the positive side of their emotions: 1) a text chatting system that automatically appends an emoticon to the end of a typed comment based on the user's facial expression, and 2) a plant interface controlled by facial expressions. We also analysed multiple users' facial expressions while they played video games. We confirmed that visualizing the sensor data from the device shows the potential for estimating transitions between different facial expressions.
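As a rough illustration of the first prototype's behaviour, the sketch below appends an emoticon to a typed chat comment based on a recognized expression label. This is a minimal Python sketch under stated assumptions, not the authors' implementation: the label set, the emoticon mapping, and the function names are hypothetical, and the wearable recognizer itself is not shown.

# Minimal sketch (hypothetical, not the authors' code): append an emoticon to a
# typed chat comment based on a facial-expression label such as one a wearable
# recognizer might output. Labels, mapping, and names are assumptions.

EMOTICONS = {
    "smile": ":)",
    "neutral": ":|",
    "frown": ":(",
}

def annotate_comment(comment: str, expression: str) -> str:
    """Return the comment with an emoticon matching the recognized expression."""
    emoticon = EMOTICONS.get(expression, "")  # unknown labels append nothing
    return f"{comment} {emoticon}".rstrip()

# Example: the (unshown) recognizer reports "smile" while the user types.
print(annotate_comment("That level was great", "smile"))  # That level was great :)

In the actual system the expression label would come from the device's sensor-based classifier rather than being passed in directly.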

Original language: English
Title of host publication: Proceedings - 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016
Publisher: Association for Computing Machinery
Volume: Part F125634
ISBN (Electronic): 9781450347730
DOIs: https://doi.org/10.1145/3001773.3014351
Publication status: Published - 2016 Nov 9
Event: 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016 - Osaka, Japan
Duration: 2016 Nov 9 – 2016 Nov 12

Other

Other: 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016
Country: Japan
City: Osaka
Period: 16/11/9 – 16/11/12

Fingerprint

  • Visualization
  • Sensors

Keywords

  • Affective computing
  • Facial expression
  • Social interaction
  • Wearables

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Masai, K., Itoh, Y., Sugiura, Y., & Sugimoto, M. (2016). Analysis of multiple users' experience in daily life using wearable device for facial expression recognition. In Proceedings - 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016 (Vol. Part F125634). [a52] Association for Computing Machinery. https://doi.org/10.1145/3001773.3014351

Analysis of multiple users' experience in daily life using wearable device for facial expression recognition. / Masai, Katsutoshi; Itoh, Yuta; Sugiura, Yuta; Sugimoto, Maki.

Proceedings - 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016. Vol. Part F125634 Association for Computing Machinery, 2016. a52.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Masai, K, Itoh, Y, Sugiura, Y & Sugimoto, M 2016, Analysis of multiple users' experience in daily life using wearable device for facial expression recognition. in Proceedings - 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016. vol. Part F125634, a52, Association for Computing Machinery, 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016, Osaka, Japan, 16/11/9. https://doi.org/10.1145/3001773.3014351
Masai K, Itoh Y, Sugiura Y, Sugimoto M. Analysis of multiple users' experience in daily life using wearable device for facial expression recognition. In Proceedings - 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016. Vol. Part F125634. Association for Computing Machinery. 2016. a52 https://doi.org/10.1145/3001773.3014351
Masai, Katsutoshi ; Itoh, Yuta ; Sugiura, Yuta ; Sugimoto, Maki. / Analysis of multiple users' experience in daily life using wearable device for facial expression recognition. Proceedings - 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016. Vol. Part F125634 Association for Computing Machinery, 2016.
@inproceedings{a5e5c238377641f7ba3c75f7591a287b,
title = "Analysis of multiple users' experience in daily life using wearable device for facial expression recognition",
abstract = "In this paper, we present a wearable facial expression recognition system that can analyse and enhance a daily experience. Our aim is to create a mindful experience in daily life by connecting the device with everyday objects and service. To this end, we made two prototypes that supports users to keep right side of emotions: 1) a text chatting system that automatically inserts an emoticon based on his/her facial expressions in the end of a comment a user typed, 2) a plant interface controlled by facial expressions. We also analysed multiple users' facial expressions while they played video games. We confirmed that visualization of sensor data from the device shows the possibility for estimating the transition of different facial expressions.",
keywords = "Affective computing, Facial expression, Social interaction, Wearables",
author = "Katsutoshi Masai and Yuta Itoh and Yuta Sugiura and Maki Sugimoto",
year = "2016",
month = "11",
day = "9",
doi = "10.1145/3001773.3014351",
language = "English",
volume = "Part F125634",
booktitle = "Proceedings - 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016",
publisher = "Association for Computing Machinery",

}

TY - GEN

T1 - Analysis of multiple users' experience in daily life using wearable device for facial expression recognition

AU - Masai, Katsutoshi

AU - Itoh, Yuta

AU - Sugiura, Yuta

AU - Sugimoto, Maki

PY - 2016/11/9

Y1 - 2016/11/9

N2 - In this paper, we present a wearable facial expression recognition system that can analyse and enhance daily experiences. Our aim is to create a mindful experience in daily life by connecting the device with everyday objects and services. To this end, we built two prototypes that support users in staying on the positive side of their emotions: 1) a text chatting system that automatically appends an emoticon to the end of a typed comment based on the user's facial expression, and 2) a plant interface controlled by facial expressions. We also analysed multiple users' facial expressions while they played video games. We confirmed that visualizing the sensor data from the device shows the potential for estimating transitions between different facial expressions.

AB - In this paper, we present a wearable facial expression recognition system that can analyse and enhance daily experiences. Our aim is to create a mindful experience in daily life by connecting the device with everyday objects and services. To this end, we built two prototypes that support users in staying on the positive side of their emotions: 1) a text chatting system that automatically appends an emoticon to the end of a typed comment based on the user's facial expression, and 2) a plant interface controlled by facial expressions. We also analysed multiple users' facial expressions while they played video games. We confirmed that visualizing the sensor data from the device shows the potential for estimating transitions between different facial expressions.

KW - Affective computing

KW - Facial expression

KW - Social interaction

KW - Wearables

UR - http://www.scopus.com/inward/record.url?scp=85014703485&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85014703485&partnerID=8YFLogxK

U2 - 10.1145/3001773.3014351

DO - 10.1145/3001773.3014351

M3 - Conference contribution

AN - SCOPUS:85014703485

VL - Part F125634

BT - Proceedings - 13th International Conference on Advances in Computer Entertainment Technology, ACE 2016

PB - Association for Computing Machinery

ER -