Where are you looking at? - Feature-based eye tracking on unmodified tablets

Shoya Ishimaru, Kai Kunze, Yuzuko Utsumi, Masakazu Iwamura, Koichi Kise

Research output: Contribution to conference › Paper

5 Citations (Scopus)

Abstract

This paper introduces our work towards implementing eye tracking on commodity devices. We describe our feature-based approach and an eye tracking system running on a commodity tablet. As reference data, we recorded 5 subjects following an animation on screen. Assuming that the positions of the device and the user's head are stable, the average distance between the estimated and actual gaze points is around 12.23 mm using user-dependent training.
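The abstract describes user-dependent training that maps eye features to on-screen gaze points and reports a mean distance error in millimetres. The paper does not publish its model here; purely as an illustration, a minimal sketch of this kind of pipeline fits a linear map (a common baseline for feature-based gaze estimation, not necessarily the authors' method) and evaluates the average Euclidean error. All names and the synthetic data below are hypothetical.

```python
import numpy as np

def train_gaze_mapping(features, gaze_points):
    """Fit a linear map (with bias term) from eye-feature vectors to 2-D gaze points."""
    X = np.hstack([features, np.ones((len(features), 1))])  # append bias column
    W, *_ = np.linalg.lstsq(X, gaze_points, rcond=None)
    return W

def mean_distance_error_mm(features, gaze_points, W):
    """Average Euclidean distance between predicted and actual gaze points, in mm."""
    X = np.hstack([features, np.ones((len(features), 1))])
    pred = X @ W
    return float(np.mean(np.linalg.norm(pred - gaze_points, axis=1)))

# Synthetic stand-in for one subject's calibration recording: 200 frames of
# 4-D eye features with a known linear relationship plus noise; gaze targets
# are screen coordinates in millimetres.
rng = np.random.default_rng(0)
feats = rng.uniform(-1, 1, size=(200, 4))
true_W = rng.normal(size=(4, 2)) * 50
gaze = feats @ true_W + np.array([120.0, 80.0]) + rng.normal(scale=5, size=(200, 2))

W = train_gaze_mapping(feats, gaze)
err = mean_distance_error_mm(feats, gaze, W)
print(f"mean distance error: {err:.1f} mm")
```

Because training here is per-subject (fit on that subject's own calibration frames), this mirrors the "user-dependent training" setting the abstract evaluates; a user-independent model would be trained on other subjects' data instead.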

Original language: English
Pages: 738-739
Number of pages: 2
DOI: 10.1109/ACPR.2013.190
Publication status: Published - 2013 Jan 1
Externally published: Yes
Event: 2013 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013 - Naha, Okinawa, Japan
Duration: 2013 Nov 5 - 2013 Nov 8

Other

Other: 2013 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013
Country: Japan
City: Naha, Okinawa
Period: 13/11/5 - 13/11/8

Keywords

  • Eye gaze
  • Eyetracking
  • iOS
  • iPad
  • Reading
  • Tablet

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition


Cite this

Ishimaru, S., Kunze, K., Utsumi, Y., Iwamura, M., & Kise, K. (2013). Where are you looking at? - Feature-based eye tracking on unmodified tablets. 738-739. Paper presented at 2013 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013, Naha, Okinawa, Japan. https://doi.org/10.1109/ACPR.2013.190