Unconstrained and Calibration-Free Gaze Estimation in a Room-Scale Area Using a Monocular Camera

Kimimasa Tamura, Ran Choi, Yoshimitsu Aoki

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)


Gaze estimation using monocular cameras has significant commercial applicability, and many studies have been undertaken on head-pose-invariant and calibration-free gaze estimation. However, the head positions in the existing data sets used in these studies are limited to the vicinity of the camera, so methods trained on such data sets are not applicable when subjects are farther from the camera. In this paper, we create a room-scale gaze data set with large variations in head pose to achieve robust gaze estimation across a broader range of widths and depths. The head positions are much farther from the camera, and the resolution of the eye images is lower than in conventional data sets. To address this issue, we propose a likelihood evaluation method based on edge gradients with dense particles for iris tracking, which achieves robust tracking on low-resolution eye images. Cross-validation experiments show that our proposed method is more accurate than conventional methods for every individual in our data set.
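The abstract gives no implementation details, so the following is a hypothetical minimal sketch of the general idea it names: a particle filter whose likelihood is an edge-gradient score, computed by projecting the image gradient onto the outward normal of each candidate iris circle (the limbus is a dark-to-bright edge, which remains detectable at low resolution). All function names, parameters, and the synthetic test image are illustrative assumptions, not the authors' code.

```python
import numpy as np

def iris_pf_step(gray, particles, weights, motion_std=2.0, n_angles=32, temp=5.0):
    """One predict/update step of a particle filter for iris tracking.

    Each particle is (x, y, r): a candidate iris centre and radius.
    Its likelihood is an edge-gradient score: the mean image-gradient
    component along the circle's outward normal, sampled at n_angles
    points on the candidate circle.
    """
    # Predict: diffuse all particles with Gaussian motion noise.
    particles = particles + np.random.normal(0.0, motion_std, particles.shape)
    particles[:, 2] = np.clip(particles[:, 2], 2.0, None)  # keep radius valid

    gy, gx = np.gradient(gray.astype(float))  # row/col central differences
    h, w = gray.shape
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    ca, sa = np.cos(angles), np.sin(angles)

    scores = np.empty(len(particles))
    for i, (x, y, r) in enumerate(particles):
        xs = np.clip((x + r * ca).astype(int), 0, w - 1)
        ys = np.clip((y + r * sa).astype(int), 0, h - 1)
        # Gradient projected onto the outward normal of the circle.
        scores[i] = (gx[ys, xs] * ca + gy[ys, xs] * sa).mean()

    # Update: exponential likelihood with a temperature, then normalise.
    weights = weights * np.exp((scores - scores.max()) / temp)
    return particles, weights / weights.sum()

def resample(particles, weights):
    """Systematic resampling to avoid weight degeneracy."""
    n = len(particles)
    positions = (np.arange(n) + np.random.uniform()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx], np.full(n, 1.0 / n)

def track_iris(gray, particles, weights, steps=15):
    """Run several filter iterations and return the (x, y, r) estimate."""
    for _ in range(steps):
        particles, weights = iris_pf_step(gray, particles, weights)
        particles, weights = resample(particles, weights)
    return (particles * weights[:, None]).sum(axis=0)

# Demo on a synthetic low-resolution eye image: a dark disc ("iris")
# of radius 8 centred at (40, 30) on a bright background ("sclera").
np.random.seed(1)
yy, xx = np.mgrid[0:64, 0:80]
img = np.where((xx - 40) ** 2 + (yy - 30) ** 2 <= 8 ** 2, 50.0, 200.0)
parts = np.column_stack([np.random.uniform(20, 60, 400),
                         np.random.uniform(10, 50, 400),
                         np.random.uniform(5, 12, 400)])
est = track_iris(img, parts, np.full(400, 1.0 / 400))
print(est)  # estimated (x, y, r)
```

The dense-particle aspect is approximated here by a large, widely spread particle set; the gradient-based likelihood avoids hard edge thresholds, which is what makes this family of methods tolerant of low-resolution eye regions.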

Original language: English
Pages (from-to): 10896-10908
Number of pages: 13
Journal: IEEE Access
Publication status: Published - 2017 Aug 5


Keywords

  • Gaze estimation
  • iris tracking
  • particle filter
  • regression

ASJC Scopus subject areas

  • Computer Science (all)
  • Materials Science (all)
  • Engineering (all)

