Empathy Glasses

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

In this paper, we describe Empathy Glasses, a head worn prototype designed to create an empathic connection between remote collaborators. The main novelty of our system is that it is the first to combine the following technologies together: (1) wearable facial expression capture hardware, (2) eye tracking, (3) a head worn camera, and (4) a see-through head mounted display, with a focus on remote collaboration. Using the system, a local user can send their information and a view of their environment to a remote helper who can send back visual cues on the local user's see-through display to help them perform a real world task. A pilot user study was conducted to explore how effective the Empathy Glasses were at supporting remote collaboration. We describe the implications that can be drawn from this user study.
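The abstract describes a capture-and-annotate loop: the local worker streams a first-person camera view plus gaze and facial-expression data to a remote helper, and the helper returns visual cues that are rendered on the local see-through display. As a rough illustration of that data flow only (a minimal sketch, not the authors' implementation; all type, field, and function names below are hypothetical), the exchanged messages might be modelled as follows:

from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical message types for an Empathy Glasses-style remote
# collaboration link; names and fields are illustrative, not from the paper.

@dataclass
class LocalUpdate:
    """Sent from the local worker's glasses to the remote helper."""
    frame_jpeg: bytes                 # head-worn camera view
    gaze_xy: Tuple[float, float]      # eye-tracker gaze point in frame coordinates
    expression: str                   # e.g. "neutral", "smile", "frown"

@dataclass
class HelperCue:
    """Sent back by the remote helper, drawn on the see-through display."""
    kind: str                         # e.g. "pointer", "circle", "text"
    position_xy: Tuple[float, float]  # where to render, in frame coordinates
    label: str = ""                   # optional text annotation

def helper_respond(update: LocalUpdate) -> List[HelperCue]:
    """Toy helper policy: point at whatever the local user is looking at,
    and add a short note if the worker appears frustrated."""
    cues = [HelperCue(kind="pointer", position_xy=update.gaze_xy)]
    if update.expression == "frown":
        cues.append(HelperCue(kind="text", position_xy=(0.5, 0.9),
                              label="Take your time - try the next step."))
    return cues

if __name__ == "__main__":
    update = LocalUpdate(frame_jpeg=b"", gaze_xy=(0.42, 0.61), expression="frown")
    for cue in helper_respond(update):
        print(cue)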

Original language: English
Title of host publication: CHI EA 2016: #chi4good - Extended Abstracts, 34th Annual CHI Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
Pages: 1257-1263
Number of pages: 7
Volume: 07-12-May-2016
ISBN (Electronic): 9781450340823
DOI: https://doi.org/10.1145/2851581.2892370
Publication status: Published - 2016 May 7
Event: 34th Annual CHI Conference on Human Factors in Computing Systems, CHI EA 2016 - San Jose, United States
Duration: 2016 May 7 – 2016 May 12



Keywords

  • Emotional interface
  • Facial expression
  • Remote collaboration
  • Wearables

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
  • Software

Cite this

Masai, K., Sugimoto, M., Kunze, K. S., & Billinghurst, M. (2016). Empathy Glasses. In CHI EA 2016: #chi4good - Extended Abstracts, 34th Annual CHI Conference on Human Factors in Computing Systems (Vol. 07-12-May-2016, pp. 1257-1263). Association for Computing Machinery. https://doi.org/10.1145/2851581.2892370
