GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze

Yun Suen Pai, Benjamin I. Outram, Benjamin Tag, Megumi Isogai, Daisuke Ochi, Kai Steven Kunze

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

Viewing 360-degree images and videos through head-mounted displays (HMDs) currently lacks a compelling interface for transitioning between them. We propose GazeSphere, a navigation system that provides seamless transitions between locations in 360-degree-video environments through orbit-like motion, driven by head rotation and eye-gaze tracking. The significance of this approach is threefold: 1) it allows navigation and transition through spatially continuous 360-degree-video environments; 2) it leverages the user's proprioceptive sense of rotation for locomotion that is intuitive and negates motion sickness; and 3) it uses eye tracking for completely seamless, hands-free, and unobtrusive interaction. The proposed method uses an orbital-motion technique for navigation in virtual space, which we demonstrate in applications such as navigation and interaction in computer-aided design (CAD), data visualization, a game mechanic, and virtual tours.
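The poster page includes no code, but the orbit-like motion the abstract describes can be illustrated with a small, hypothetical sketch: constrain the camera to a sphere around a gaze-selected pivot point and drive its position from head yaw and pitch, always aiming back at the pivot. The function name and parameters below are our own illustration, not the authors' implementation.

```python
import math

def orbit_camera(pivot, radius, yaw, pitch):
    """Hypothetical sketch: place a camera on a sphere of `radius`
    around a gaze-selected `pivot`, parameterized by head yaw and
    pitch (radians), and aim it back at the pivot."""
    x = pivot[0] + radius * math.cos(pitch) * math.sin(yaw)
    y = pivot[1] + radius * math.sin(pitch)
    z = pivot[2] + radius * math.cos(pitch) * math.cos(yaw)
    position = (x, y, z)
    # Look direction: unit vector from the camera toward the pivot,
    # so the viewpoint always faces the point being orbited.
    delta = tuple(p - c for p, c in zip(pivot, position))
    norm = math.sqrt(sum(v * v for v in delta))
    look = tuple(v / norm for v in delta)
    return position, look

# With zero yaw/pitch the camera sits on the +z axis, facing the pivot.
pos, look = orbit_camera((0.0, 0.0, 0.0), 2.0, 0.0, 0.0)
```

In a real system, eye tracking would select the pivot (the gazed-at location in the 360-degree video) and the HMD's head-rotation sensors would update `yaw` and `pitch` each frame, producing the orbital transition between viewpoints.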

Original language: English
Title of host publication: ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017
Publisher: Association for Computing Machinery, Inc
ISBN (Electronic): 9781450350150
DOIs: https://doi.org/10.1145/3102163.3102183
Publication status: Published - 2017 Jul 30
Event: 44th International Conference on Computer Graphics and Interactive Techniques, ACM SIGGRAPH 2017 - Los Angeles, United States
Duration: 2017 Jul 30 - 2017 Aug 3

Other

Other: 44th International Conference on Computer Graphics and Interactive Techniques, ACM SIGGRAPH 2017
Country: United States
City: Los Angeles
Period: 17/7/30 - 17/8/3

Fingerprint

  • Navigation
  • Data visualization
  • Navigation systems
  • Computer aided design
  • Mechanics
  • Orbits
  • Display devices

Keywords

  • 360-degree-video
  • Eye tracking
  • Orbital navigation
  • Virtual reality

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Software
  • Computer Graphics and Computer-Aided Design

Cite this

Pai, Y. S., Outram, B. I., Tag, B., Isogai, M., Ochi, D., & Kunze, K. S. (2017). GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze. In ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017 [a23]. Association for Computing Machinery, Inc. https://doi.org/10.1145/3102163.3102183

GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze. / Pai, Yun Suen; Outram, Benjamin I.; Tag, Benjamin; Isogai, Megumi; Ochi, Daisuke; Kunze, Kai Steven.

ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017. Association for Computing Machinery, Inc, 2017. a23.


Pai, YS, Outram, BI, Tag, B, Isogai, M, Ochi, D & Kunze, KS 2017, GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze. in ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017., a23, Association for Computing Machinery, Inc, 44th International Conference on Computer Graphics and Interactive Techniques, ACM SIGGRAPH 2017, Los Angeles, United States, 17/7/30. https://doi.org/10.1145/3102163.3102183
Pai YS, Outram BI, Tag B, Isogai M, Ochi D, Kunze KS. GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze. In ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017. Association for Computing Machinery, Inc. 2017. a23. https://doi.org/10.1145/3102163.3102183
Pai, Yun Suen; Outram, Benjamin I.; Tag, Benjamin; Isogai, Megumi; Ochi, Daisuke; Kunze, Kai Steven. / GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze. ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017. Association for Computing Machinery, Inc, 2017.
@inproceedings{bd44f75b593f4a2da8479ed0716c9f22,
title = "GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze",
abstract = "Viewing 360-degree images and videos through head-mounted displays (HMDs) currently lacks a compelling interface for transitioning between them. We propose GazeSphere, a navigation system that provides seamless transitions between locations in 360-degree-video environments through orbit-like motion, driven by head rotation and eye-gaze tracking. The significance of this approach is threefold: 1) it allows navigation and transition through spatially continuous 360-degree-video environments; 2) it leverages the user's proprioceptive sense of rotation for locomotion that is intuitive and negates motion sickness; and 3) it uses eye tracking for completely seamless, hands-free, and unobtrusive interaction. The proposed method uses an orbital-motion technique for navigation in virtual space, which we demonstrate in applications such as navigation and interaction in computer-aided design (CAD), data visualization, a game mechanic, and virtual tours.",
keywords = "360-degree-video, Eye tracking, Orbital navigation, Virtual reality",
author = "Pai, {Yun Suen} and Outram, {Benjamin I.} and Benjamin Tag and Megumi Isogai and Daisuke Ochi and Kunze, {Kai Steven}",
year = "2017",
month = "7",
day = "30",
doi = "10.1145/3102163.3102183",
language = "English",
booktitle = "ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017",
publisher = "Association for Computing Machinery, Inc",

}

TY - GEN

T1 - GazeSphere

T2 - Navigating 360-degree-video environments in VR using head rotation and eye gaze

AU - Pai, Yun Suen

AU - Outram, Benjamin I.

AU - Tag, Benjamin

AU - Isogai, Megumi

AU - Ochi, Daisuke

AU - Kunze, Kai Steven

PY - 2017/7/30

Y1 - 2017/7/30

N2 - Viewing 360-degree images and videos through head-mounted displays (HMDs) currently lacks a compelling interface for transitioning between them. We propose GazeSphere, a navigation system that provides seamless transitions between locations in 360-degree-video environments through orbit-like motion, driven by head rotation and eye-gaze tracking. The significance of this approach is threefold: 1) it allows navigation and transition through spatially continuous 360-degree-video environments; 2) it leverages the user's proprioceptive sense of rotation for locomotion that is intuitive and negates motion sickness; and 3) it uses eye tracking for completely seamless, hands-free, and unobtrusive interaction. The proposed method uses an orbital-motion technique for navigation in virtual space, which we demonstrate in applications such as navigation and interaction in computer-aided design (CAD), data visualization, a game mechanic, and virtual tours.

AB - Viewing 360-degree images and videos through head-mounted displays (HMDs) currently lacks a compelling interface for transitioning between them. We propose GazeSphere, a navigation system that provides seamless transitions between locations in 360-degree-video environments through orbit-like motion, driven by head rotation and eye-gaze tracking. The significance of this approach is threefold: 1) it allows navigation and transition through spatially continuous 360-degree-video environments; 2) it leverages the user's proprioceptive sense of rotation for locomotion that is intuitive and negates motion sickness; and 3) it uses eye tracking for completely seamless, hands-free, and unobtrusive interaction. The proposed method uses an orbital-motion technique for navigation in virtual space, which we demonstrate in applications such as navigation and interaction in computer-aided design (CAD), data visualization, a game mechanic, and virtual tours.

KW - 360-degree-video

KW - Eye tracking

KW - Orbital navigation

KW - Virtual reality

UR - http://www.scopus.com/inward/record.url?scp=85028587022&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85028587022&partnerID=8YFLogxK

U2 - 10.1145/3102163.3102183

DO - 10.1145/3102163.3102183

M3 - Conference contribution

AN - SCOPUS:85028587022

BT - ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017

PB - Association for Computing Machinery, Inc

ER -