Object-wise 3D gaze mapping in physical workspace

Kakeru Hagihara, Keiichiro Taniguchi, Irshad Abibouraguimane, Yuta Itoh, Keita Higuchi, Jiu Otsuka, Maki Sugimoto, Yoichi Sato

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

[...] skill in human communication. Eye behavior is an important, yet implicit, communication cue. In this work, we focus on enabling people to see a user's gaze associated with objects in 3D space; namely, we present users with the history of gaze linked to real 3D objects. Our 3D gaze visualization system automatically segments objects in the workspace and projects the user's gaze trajectory onto the objects in 3D to visualize the user's intention. By combining automated object segmentation with head tracking via the first-person video from a wearable eye tracker, our system can visualize the user's gaze behavior more intuitively and efficiently than 2D-based methods and 3D methods that require manual annotation. We evaluated the system to measure the accuracy of object-wise gaze mapping; the system achieved 94% accuracy in mapping gaze onto 40-, 30-, 20-, and 10-centimeter cubes. Through a case study in which a user looks at food products, we also showed that our system is able to predict which products the user is interested in.
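The pipeline the abstract describes, back-projecting a 2D gaze point from the wearable tracker's first-person view into the workspace and assigning it to a segmented 3D object, can be sketched roughly as a ray cast against per-object bounding volumes. This is a minimal illustrative sketch, not the paper's actual implementation: the function names, the pinhole-intrinsics model, and the axis-aligned-box representation of segmented objects are all assumptions.

```python
import numpy as np

def gaze_ray(gaze_px, K, cam_pose):
    """Back-project a 2D gaze point (pixels) into a world-space ray.

    K is a 3x3 pinhole intrinsics matrix; cam_pose is a 4x4
    camera-to-world transform from the head tracker.
    """
    u, v = gaze_px
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # viewing direction in camera frame
    d_world = cam_pose[:3, :3] @ d_cam                 # rotate into world frame
    origin = cam_pose[:3, 3]                           # camera center in world frame
    return origin, d_world / np.linalg.norm(d_world)

def ray_box_hit(origin, direction, box_min, box_max):
    """Slab test: distance along the ray to an axis-aligned box, or None."""
    with np.errstate(divide="ignore", invalid="ignore"):
        inv = 1.0 / direction
        t0 = (box_min - origin) * inv
        t1 = (box_max - origin) * inv
    t_near = np.max(np.minimum(t0, t1))
    t_far = np.min(np.maximum(t0, t1))
    if t_near <= t_far and t_far >= 0.0:
        return max(t_near, 0.0)
    return None

def map_gaze_to_object(gaze_px, K, cam_pose, objects):
    """Assign a gaze sample to the nearest segmented object its ray hits.

    `objects` maps an object label to an axis-aligned bounding box
    (min corner, max corner) produced by the segmentation step.
    """
    origin, direction = gaze_ray(gaze_px, K, cam_pose)
    best_label, best_t = None, np.inf
    for label, (bmin, bmax) in objects.items():
        t = ray_box_hit(origin, direction,
                        np.asarray(bmin, float), np.asarray(bmax, float))
        if t is not None and t < best_t:
            best_label, best_t = label, t
    return best_label
```

With an identity head pose and the gaze at the image center, the ray runs down the optical axis and hits a box placed a meter ahead; aggregating these per-frame assignments over time would yield the per-object gaze history the system visualizes.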

Original language: English
Title of host publication: Proceedings of the 9th Augmented Human International Conference, AH 2018
Publisher: Association for Computing Machinery
Volume: Part F134484
ISBN (Electronic): 9781450354158
DOI: 10.1145/3174910.3174921
Publication status: Published - 2018 Feb 6
Event: 9th Augmented Human International Conference, AH 2018 - Seoul, Korea, Republic of
Duration: 2018 Feb 7 - 2018 Feb 9

Other

Other: 9th Augmented Human International Conference, AH 2018
Country: Korea, Republic of
City: Seoul
Period: 18/2/7 - 18/2/9

Keywords

  • 3D gaze
  • 3D segmentation
  • Gaze mapping

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Hagihara, K., Taniguchi, K., Abibouraguimane, I., Itoh, Y., Higuchi, K., Otsuka, J., ... Sato, Y. (2018). Object-wise 3D gaze mapping in physical workspace. In Proceedings of the 9th Augmented Human International Conference, AH 2018 (Vol. Part F134484). [a25] Association for Computing Machinery. https://doi.org/10.1145/3174910.3174921

Object-wise 3D gaze mapping in physical workspace. / Hagihara, Kakeru; Taniguchi, Keiichiro; Abibouraguimane, Irshad; Itoh, Yuta; Higuchi, Keita; Otsuka, Jiu; Sugimoto, Maki; Sato, Yoichi.

Proceedings of the 9th Augmented Human International Conference, AH 2018. Vol. Part F134484 Association for Computing Machinery, 2018. a25.

Research output: Chapter in Book/Report/Conference proceedingConference contribution

Hagihara, K, Taniguchi, K, Abibouraguimane, I, Itoh, Y, Higuchi, K, Otsuka, J, Sugimoto, M & Sato, Y 2018, Object-wise 3D gaze mapping in physical workspace. in Proceedings of the 9th Augmented Human International Conference, AH 2018. vol. Part F134484, a25, Association for Computing Machinery, 9th Augmented Human International Conference, AH 2018, Seoul, Korea, Republic of, 18/2/7. https://doi.org/10.1145/3174910.3174921
Hagihara K, Taniguchi K, Abibouraguimane I, Itoh Y, Higuchi K, Otsuka J et al. Object-wise 3D gaze mapping in physical workspace. In Proceedings of the 9th Augmented Human International Conference, AH 2018. Vol. Part F134484. Association for Computing Machinery. 2018. a25 https://doi.org/10.1145/3174910.3174921
Hagihara, Kakeru ; Taniguchi, Keiichiro ; Abibouraguimane, Irshad ; Itoh, Yuta ; Higuchi, Keita ; Otsuka, Jiu ; Sugimoto, Maki ; Sato, Yoichi. / Object-wise 3D gaze mapping in physical workspace. Proceedings of the 9th Augmented Human International Conference, AH 2018. Vol. Part F134484 Association for Computing Machinery, 2018.
@inproceedings{e0fee9cc92674f42bedf401e5084a178,
title = "Object-wise 3D gaze mapping in physical workspace",
abstract = "[...] skill in human communication. Eye behavior is an important, yet implicit, communication cue. In this work, we focus on enabling people to see a user's gaze associated with objects in 3D space; namely, we present users with the history of gaze linked to real 3D objects. Our 3D gaze visualization system automatically segments objects in the workspace and projects the user's gaze trajectory onto the objects in 3D to visualize the user's intention. By combining automated object segmentation with head tracking via the first-person video from a wearable eye tracker, our system can visualize the user's gaze behavior more intuitively and efficiently than 2D-based methods and 3D methods that require manual annotation. We evaluated the system to measure the accuracy of object-wise gaze mapping; the system achieved 94{\%} accuracy in mapping gaze onto 40-, 30-, 20-, and 10-centimeter cubes. Through a case study in which a user looks at food products, we also showed that our system is able to predict which products the user is interested in.",
keywords = "3D gaze, 3D segmentation, Gaze mapping",
author = "Kakeru Hagihara and Keiichiro Taniguchi and Irshad Abibouraguimane and Yuta Itoh and Keita Higuchi and Jiu Otsuka and Maki Sugimoto and Yoichi Sato",
year = "2018",
month = "2",
day = "6",
doi = "10.1145/3174910.3174921",
language = "English",
volume = "Part F134484",
booktitle = "Proceedings of the 9th Augmented Human International Conference, AH 2018",
publisher = "Association for Computing Machinery",

}

TY - GEN

T1 - Object-wise 3D gaze mapping in physical workspace

AU - Hagihara, Kakeru

AU - Taniguchi, Keiichiro

AU - Abibouraguimane, Irshad

AU - Itoh, Yuta

AU - Higuchi, Keita

AU - Otsuka, Jiu

AU - Sugimoto, Maki

AU - Sato, Yoichi

PY - 2018/2/6

Y1 - 2018/2/6

N2 - [...] skill in human communication. Eye behavior is an important, yet implicit, communication cue. In this work, we focus on enabling people to see a user's gaze associated with objects in 3D space; namely, we present users with the history of gaze linked to real 3D objects. Our 3D gaze visualization system automatically segments objects in the workspace and projects the user's gaze trajectory onto the objects in 3D to visualize the user's intention. By combining automated object segmentation with head tracking via the first-person video from a wearable eye tracker, our system can visualize the user's gaze behavior more intuitively and efficiently than 2D-based methods and 3D methods that require manual annotation. We evaluated the system to measure the accuracy of object-wise gaze mapping; the system achieved 94% accuracy in mapping gaze onto 40-, 30-, 20-, and 10-centimeter cubes. Through a case study in which a user looks at food products, we also showed that our system is able to predict which products the user is interested in.

AB - [...] skill in human communication. Eye behavior is an important, yet implicit, communication cue. In this work, we focus on enabling people to see a user's gaze associated with objects in 3D space; namely, we present users with the history of gaze linked to real 3D objects. Our 3D gaze visualization system automatically segments objects in the workspace and projects the user's gaze trajectory onto the objects in 3D to visualize the user's intention. By combining automated object segmentation with head tracking via the first-person video from a wearable eye tracker, our system can visualize the user's gaze behavior more intuitively and efficiently than 2D-based methods and 3D methods that require manual annotation. We evaluated the system to measure the accuracy of object-wise gaze mapping; the system achieved 94% accuracy in mapping gaze onto 40-, 30-, 20-, and 10-centimeter cubes. Through a case study in which a user looks at food products, we also showed that our system is able to predict which products the user is interested in.

KW - 3D gaze

KW - 3D segmentation

KW - Gaze mapping

UR - http://www.scopus.com/inward/record.url?scp=85044317398&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85044317398&partnerID=8YFLogxK

U2 - 10.1145/3174910.3174921

DO - 10.1145/3174910.3174921

M3 - Conference contribution

AN - SCOPUS:85044317398

VL - Part F134484

BT - Proceedings of the 9th Augmented Human International Conference, AH 2018

PB - Association for Computing Machinery

ER -