Where are you looking at? - Feature-based eye tracking on unmodified tablets

Shoya Ishimaru, Kai Steven Kunze, Yuzuko Utsumi, Masakazu Iwamura, Koichi Kise

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

3 Citations (Scopus)

Abstract

This paper introduces our work towards implementing eye tracking on commodity devices. We describe our feature-based approach and an eye tracking system running on a commodity tablet. As a reference, we recorded data from 5 subjects following an animation on the screen. Under the assumption that the positions of the device and the user's head are stable, the average distance error between the estimated and the actual gaze point is around 12.23 mm with user-dependent training.
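
The accuracy figure above is the mean distance between estimated gaze points and the reference points given by the on-screen animation, reported in millimetres. As a rough illustration (not taken from the paper), the following Python sketch computes such an error; the pixel-to-millimetre conversion factor and the sample coordinates are assumptions for illustration only.

import numpy as np

def mean_gaze_error_mm(estimated_px, actual_px, px_per_mm):
    """Mean Euclidean distance (mm) between estimated and actual gaze points.

    estimated_px, actual_px: (N, 2) arrays of on-screen gaze coordinates in
    pixels; px_per_mm converts pixels to millimetres. The conversion factor
    depends on the tablet's display density and is an assumption here, not a
    value taken from the paper.
    """
    est = np.asarray(estimated_px, dtype=float)
    act = np.asarray(actual_px, dtype=float)
    distances_mm = np.linalg.norm(est - act, axis=1) / px_per_mm
    return float(distances_mm.mean())

# Illustrative values only (not from the paper): estimated gaze points from a
# user-dependent model vs. reference points taken from the on-screen animation.
estimated = [[512, 300], [520, 410], [498, 395]]
actual = [[500, 310], [515, 400], [505, 390]]
print(mean_gaze_error_mm(estimated, actual, px_per_mm=5.2))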

Original language: English
Title of host publication: Proceedings - 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013
Publisher: IEEE Computer Society
Pages: 738-739
Number of pages: 2
DOI: 10.1109/ACPR.2013.190
Publication status: Published - 2013
Externally published: Yes
Event: 2013 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013 - Naha, Okinawa, Japan
Duration: 2013 Nov 5 - 2013 Nov 8

Keywords

  • Eye gaze
  • Eyetracking
  • iOS
  • iPad
  • Reading
  • Tablet

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition

Cite this

Ishimaru, S., Kunze, K. S., Utsumi, Y., Iwamura, M., & Kise, K. (2013). Where are you looking at? - Feature-based eye tracking on unmodified tablets. In Proceedings - 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013 (pp. 738-739). [6778418] IEEE Computer Society. https://doi.org/10.1109/ACPR.2013.190
