Smooth eye movement interaction using EOG glasses

Murtaza Dhuliawala, Juyoung Lee, Junichi Shimizu, Andreas Bulling, Kai Steven Kunze, Thad Starner, Woontack Woo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Citations (Scopus)

Abstract

Orbits combines a visual display and an eye motion sensor to allow a user to select between options by tracking a cursor with the eyes as the cursor travels in a circular path around each option. Using an off-the-shelf Jins MEME pair of eyeglasses, we present a pilot study that suggests that the eye movement required for Orbits can be sensed using three electrodes: one in the nose bridge and one in each nose pad. For forced choice binary selection, we achieve a 2.6 bits per second (bps) input rate at 250 ms per input. We also introduce Head Orbits, where the user fixates the eyes on a target and moves the head in synchrony with the orbiting target. Measuring only the relative movement of the eyes in relation to the head, this method achieves a maximum rate of 2.0 bps at 500 ms per input. Finally, we combine the two techniques with a gyro to create an interface with a maximum input rate of 5.0 bps.
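The abstract's input rates can be sanity-checked with the Wolpaw information transfer rate (ITR) formula commonly used for selection interfaces; note this is an assumption for illustration, as the paper may compute its rates differently. A perfectly accurate binary choice every 250 ms would yield 4.0 bps, so the reported 2.6 bps implies roughly 0.65 bits per selection after accounting for errors:

```python
import math

def itr_bits_per_second(n_targets: int, accuracy: float, time_per_selection_s: float) -> float:
    """Wolpaw ITR for an n-way selection made every time_per_selection_s seconds."""
    p = accuracy
    if n_targets > 1 and 0 < p < 1:
        # Effective bits per selection, discounted for error rate.
        bits = (math.log2(n_targets)
                + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n_targets - 1)))
    else:
        # Perfect accuracy: full log2(n) bits per selection.
        bits = math.log2(n_targets)
    return bits / time_per_selection_s

# Error-free binary selection at 250 ms per input:
print(itr_bits_per_second(2, 1.0, 0.25))  # 4.0 bps
```

The gap between the theoretical 4.0 bps and the reported 2.6 bps is consistent with a recognition accuracy somewhat above 90% per selection, though the paper itself should be consulted for the actual figure.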

Original language: English
Title of host publication: ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction
Publisher: Association for Computing Machinery, Inc
Pages: 307-311
Number of pages: 5
ISBN (Electronic): 9781450345569
DOI: https://doi.org/10.1145/2993148.2993181
Publication status: Published - 2016 Oct 31
Event: 18th ACM International Conference on Multimodal Interaction, ICMI 2016 - Tokyo, Japan
Duration: 2016 Nov 12 → 2016 Nov 16

Other

Other: 18th ACM International Conference on Multimodal Interaction, ICMI 2016
Country: Japan
City: Tokyo
Period: 16/11/12 → 16/11/16

Fingerprint

Eye movements
Orbits
Glass
Eyeglasses
Display devices
Electrodes
Sensors

Keywords

  • Eye tracking
  • Gaze interaction
  • Wearable computing

ASJC Scopus subject areas

  • Computer Science Applications
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition

Cite this

Dhuliawala, M., Lee, J., Shimizu, J., Bulling, A., Kunze, K. S., Starner, T., & Woo, W. (2016). Smooth eye movement interaction using EOG glasses. In ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction (pp. 307-311). Association for Computing Machinery, Inc. https://doi.org/10.1145/2993148.2993181

Smooth eye movement interaction using EOG glasses. / Dhuliawala, Murtaza; Lee, Juyoung; Shimizu, Junichi; Bulling, Andreas; Kunze, Kai Steven; Starner, Thad; Woo, Woontack.

ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction. Association for Computing Machinery, Inc, 2016. p. 307-311.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Dhuliawala, M, Lee, J, Shimizu, J, Bulling, A, Kunze, KS, Starner, T & Woo, W 2016, Smooth eye movement interaction using EOG glasses. in ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction. Association for Computing Machinery, Inc, pp. 307-311, 18th ACM International Conference on Multimodal Interaction, ICMI 2016, Tokyo, Japan, 16/11/12. https://doi.org/10.1145/2993148.2993181
Dhuliawala M, Lee J, Shimizu J, Bulling A, Kunze KS, Starner T et al. Smooth eye movement interaction using EOG glasses. In ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction. Association for Computing Machinery, Inc. 2016. p. 307-311 https://doi.org/10.1145/2993148.2993181
Dhuliawala, Murtaza ; Lee, Juyoung ; Shimizu, Junichi ; Bulling, Andreas ; Kunze, Kai Steven ; Starner, Thad ; Woo, Woontack. / Smooth eye movement interaction using EOG glasses. ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction. Association for Computing Machinery, Inc, 2016. pp. 307-311
@inproceedings{67183acd0e8749f9ad7addda6b2e8b74,
title = "Smooth eye movement interaction using EOG glasses",
abstract = "Orbits combines a visual display and an eye motion sensor to allow a user to select between options by tracking a cursor with the eyes as the cursor travels in a circular path around each option. Using an off-the-shelf Jins MEME pair of eyeglasses, we present a pilot study that suggests that the eye movement required for Orbits can be sensed using three electrodes: one in the nose bridge and one in each nose pad. For forced choice binary selection, we achieve a 2.6 bits per second (bps) input rate at 250 ms per input. We also introduce Head Orbits, where the user fixates the eyes on a target and moves the head in synchrony with the orbiting target. Measuring only the relative movement of the eyes in relation to the head, this method achieves a maximum rate of 2.0 bps at 500 ms per input. Finally, we combine the two techniques with a gyro to create an interface with a maximum input rate of 5.0 bps.",
keywords = "Eye tracking, Gaze interaction, Wearable computing",
author = "Murtaza Dhuliawala and Juyoung Lee and Junichi Shimizu and Andreas Bulling and Kunze, {Kai Steven} and Thad Starner and Woontack Woo",
year = "2016",
month = "10",
day = "31",
doi = "10.1145/2993148.2993181",
language = "English",
pages = "307--311",
booktitle = "ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction",
publisher = "Association for Computing Machinery, Inc",

}

TY - GEN

T1 - Smooth eye movement interaction using EOG glasses

AU - Dhuliawala, Murtaza

AU - Lee, Juyoung

AU - Shimizu, Junichi

AU - Bulling, Andreas

AU - Kunze, Kai Steven

AU - Starner, Thad

AU - Woo, Woontack

PY - 2016/10/31

Y1 - 2016/10/31

N2 - Orbits combines a visual display and an eye motion sensor to allow a user to select between options by tracking a cursor with the eyes as the cursor travels in a circular path around each option. Using an off-the-shelf Jins MEME pair of eyeglasses, we present a pilot study that suggests that the eye movement required for Orbits can be sensed using three electrodes: one in the nose bridge and one in each nose pad. For forced choice binary selection, we achieve a 2.6 bits per second (bps) input rate at 250 ms per input. We also introduce Head Orbits, where the user fixates the eyes on a target and moves the head in synchrony with the orbiting target. Measuring only the relative movement of the eyes in relation to the head, this method achieves a maximum rate of 2.0 bps at 500 ms per input. Finally, we combine the two techniques with a gyro to create an interface with a maximum input rate of 5.0 bps.

AB - Orbits combines a visual display and an eye motion sensor to allow a user to select between options by tracking a cursor with the eyes as the cursor travels in a circular path around each option. Using an off-the-shelf Jins MEME pair of eyeglasses, we present a pilot study that suggests that the eye movement required for Orbits can be sensed using three electrodes: one in the nose bridge and one in each nose pad. For forced choice binary selection, we achieve a 2.6 bits per second (bps) input rate at 250 ms per input. We also introduce Head Orbits, where the user fixates the eyes on a target and moves the head in synchrony with the orbiting target. Measuring only the relative movement of the eyes in relation to the head, this method achieves a maximum rate of 2.0 bps at 500 ms per input. Finally, we combine the two techniques with a gyro to create an interface with a maximum input rate of 5.0 bps.

KW - Eye tracking

KW - Gaze interaction

KW - Wearable computing

UR - http://www.scopus.com/inward/record.url?scp=85016608897&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85016608897&partnerID=8YFLogxK

U2 - 10.1145/2993148.2993181

DO - 10.1145/2993148.2993181

M3 - Conference contribution

SP - 307

EP - 311

BT - ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction

PB - Association for Computing Machinery, Inc

ER -