Eye blink as an input modality for a responsive adaptable video system

Benjamin Tag, Junichi Shimizu, Chi Zhang, Naohisa Ohta, Kai Kunze, Kazunori Sugiura

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

9 Citations (Scopus)

Abstract

We propose a unique system that allows real-time adaptation of video settings to a viewer's physical state. A custom-made program toggles between videos according to the average eye blink frequency of each viewer. The physical data is harnessed with J!NS MEME smart glasses that utilize electrooculography (EOG). To the best of our knowledge, this is the first adaptable multimedia system that responds in real time to physical data and alters the technical settings of video content.
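To make the idea in the abstract concrete, the following is a minimal sketch of blink-rate-driven video switching, not the authors' implementation or the J!NS MEME SDK: the window length, the blink-rate thresholds, and the variant names are illustrative assumptions. It averages blink events over a sliding window and picks a video variant with simple hysteresis.

```python
from collections import deque

# Illustrative constants (assumptions, not values from the paper)
WINDOW_SECONDS = 60.0        # sliding window over which blinks are averaged
HIGH_BLINK_RATE = 20.0       # blinks/minute above which we switch variants
LOW_BLINK_RATE = 10.0        # blinks/minute below which we switch back

blink_times = deque()        # timestamps of blinks detected, e.g. via EOG


def register_blink(timestamp: float) -> None:
    """Record a blink event and drop events older than the window."""
    blink_times.append(timestamp)
    while blink_times and timestamp - blink_times[0] > WINDOW_SECONDS:
        blink_times.popleft()


def blinks_per_minute(now: float) -> float:
    """Average blink frequency over the sliding window."""
    while blink_times and now - blink_times[0] > WINDOW_SECONDS:
        blink_times.popleft()
    return len(blink_times) * 60.0 / WINDOW_SECONDS


def choose_variant(rate: float, current: str) -> str:
    """Pick a (hypothetical) video variant from the blink rate, with hysteresis."""
    if rate >= HIGH_BLINK_RATE:
        return "alternate"   # e.g. a differently encoded/edited version
    if rate <= LOW_BLINK_RATE:
        return "default"
    return current           # in-between: keep the current variant
```

A playback loop would call register_blink() for each detected blink, periodically evaluate choose_variant(), and hand the result to whatever player controls the video stream; how the switch is applied depends entirely on the playback system.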

Original language: English
Title of host publication: UbiComp 2016 Adjunct - Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing
Publisher: Association for Computing Machinery, Inc
Pages: 205-208
Number of pages: 4
ISBN (Electronic): 9781450344623
DOIs
Publication status: Published - 2016 Sept 12
Event: 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2016 - Heidelberg, Germany
Duration: 2016 Sept 12 - 2016 Sept 16

Other

Other: 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2016
Country/Territory: Germany
City: Heidelberg
Period: 16/9/12 - 16/9/16

Keywords

  • Activity Recognition
  • Adaptable Video
  • Eye Blink
  • Eyewear
  • Psychophysics
  • Smart Glasses

ASJC Scopus subject areas

  • Hardware and Architecture
  • Software
  • Information Systems
  • Computer Networks and Communications
  • Human-Computer Interaction
