Where are you looking at? - Feature-based eye tracking on unmodified tablets

Shoya Ishimaru, Kai Kunze, Yuzuko Utsumi, Masakazu Iwamura, Koichi Kise

Research output: Paper

5 Citations (Scopus)

Abstract

This paper introduces our work towards implementing eye tracking on commodity devices. We describe our feature-based approach and an eye tracking system running on a commodity tablet. As reference data, we recorded 5 subjects following an animation on screen. Under the assumption that the positions of the device and the user's head are stable, the average distance error between the estimated gaze point and the actual gaze point is around 12.23 mm using user-dependent training.
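The 12.23 mm figure reported above is a mean Euclidean distance between estimated and actual on-screen gaze points. A minimal sketch of that error metric, using hypothetical coordinates rather than the paper's data:

```python
import math

def mean_gaze_error_mm(estimated, actual):
    """Mean Euclidean distance (in mm) between paired estimated
    and ground-truth gaze points on the tablet screen."""
    assert len(estimated) == len(actual) and estimated
    total = 0.0
    for (ex, ey), (ax, ay) in zip(estimated, actual):
        total += math.hypot(ex - ax, ey - ay)
    return total / len(estimated)

# Hypothetical gaze samples in screen millimetres (illustrative only).
est = [(10.0, 20.0), (35.0, 40.0), (60.0, 15.0)]
act = [(12.0, 18.0), (30.0, 44.0), (58.0, 20.0)]
print(round(mean_gaze_error_mm(est, act), 2))  # → 4.87
```

In the paper's setup, `actual` would come from the known position of the on-screen animation the subject was asked to follow.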

Original language: English
Pages: 738-739
Number of pages: 2
DOI: 10.1109/ACPR.2013.190
Publication status: Published - 2013
Externally published: Yes
Event: 2013 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013 - Naha, Okinawa, Japan
Duration: 5 Nov 2013 – 8 Nov 2013

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition


  • Cite this

    Ishimaru, S., Kunze, K., Utsumi, Y., Iwamura, M., & Kise, K. (2013). Where are you looking at? - Feature-based eye tracking on unmodified tablets. 738-739. Paper presented at 2013 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013, Naha, Okinawa, Japan. https://doi.org/10.1109/ACPR.2013.190