Unconstrained and Calibration-free Gaze Estimation in a Room-scale Area using a Monocular Camera

Kimimasa Tamura, Ran Choi, Yoshimitsu Aoki

    Research output: Contribution to journal › Article

    4 Citations (Scopus)


    Gaze estimation using monocular cameras has high industrial application value, and many studies have addressed head-pose-invariant and calibration-free gaze estimation. However, head positions in the datasets used in these studies are limited to the vicinity of the camera, so methods trained on such datasets are not applicable when subjects are distant from the camera. In this study, we create a room-scale gaze dataset with widely varied head poses to achieve robust gaze estimation over a broader range of widths and depths. Head positions are much farther from the camera, and the resolution of the eye images is lower than in conventional datasets. To address this issue, we propose a likelihood evaluation method based on edge gradients with dense particles for iris tracking, achieving robust tracking on low-resolution eye images. Through several cross-validation experiments, our proposed method has proven more accurate than conventional methods for every individual in our dataset.
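    The abstract's core idea — scattering dense particles over candidate iris states and weighting each by an edge-gradient likelihood — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the circle parameterization `(cx, cy, r)`, the Gaussian particle spread, and the softmax-style weighting are all assumptions; the paper's actual likelihood and state model may differ.

    ```python
    import numpy as np

    def edge_gradient_likelihood(gray, cx, cy, r, n_points=32):
        """Score a candidate iris circle (cx, cy, r) by the image gradient
        along its boundary: at a true iris edge the intensity rises radially
        outward (dark iris inside, brighter sclera outside), so we project
        the gradient onto the outward radial direction and average."""
        h, w = gray.shape
        gy, gx = np.gradient(gray.astype(float))  # central-difference gradients
        thetas = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
        score = 0.0
        for t in thetas:
            x = int(round(cx + r * np.cos(t)))
            y = int(round(cy + r * np.sin(t)))
            if 0 <= x < w and 0 <= y < h:
                score += gx[y, x] * np.cos(t) + gy[y, x] * np.sin(t)
        return score / n_points

    def track_iris(gray, prior, n_particles=500, sigma=(2.0, 2.0, 0.5), seed=0):
        """One particle-filter update: scatter dense particles around the
        prior state (cx, cy, r), weight each particle by the edge-gradient
        likelihood, and return the weighted-mean state estimate."""
        rng = np.random.default_rng(seed)
        particles = prior + rng.normal(0.0, sigma, size=(n_particles, 3))
        scores = np.array([edge_gradient_likelihood(gray, *p) for p in particles])
        weights = np.exp(scores - scores.max())  # numerically stable exponentiation
        weights /= weights.sum()
        return weights @ particles

    # Synthetic low-resolution eye patch: dark disc (iris) on a bright background.
    img = np.full((24, 32), 200.0)
    yy, xx = np.mgrid[0:24, 0:32]
    img[(xx - 16) ** 2 + (yy - 12) ** 2 <= 5 ** 2] = 60.0

    # Weighted-mean estimate of (cx, cy, r), starting from an offset prior.
    est = track_iris(img, prior=np.array([14.0, 10.0, 4.0]))
    print(est)
    ```

    Because the likelihood averages gradients over the whole boundary rather than matching individual pixels, it degrades gracefully as eye-image resolution drops, which is the motivation the abstract gives for this design.
    
    
    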

    Original language: English
    Journal: IEEE Access
    Publication status: Accepted/In press - 2017 Aug 5


    Keywords

    • Gaze Estimation
    • Iris Tracking
    • Particle Filter
    • Regression

    ASJC Scopus subject areas

    • Computer Science(all)
    • Materials Science(all)
    • Engineering(all)

