PredGaze: A Incongruity Prediction Model for User's Gaze Movement

Yohei Otsuka, Shohei Akita, Kohei Okuoka, Mitsuhiko Kimoto, Michita Imai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Digital agents, in forms such as digital signage and communication robots, have gradually become popular and will become even more so. It is important for humans to notice an agent's intentions throughout their interaction with it. This paper focuses on the gaze behavior of an agent and on the phenomenon whereby, if an agent's gaze behavior differs from human expectations, humans feel incongruity and instinctively sense an intention behind the behavioral change. We propose PredGaze, a model that estimates the incongruity humans feel according to the shift in the agent's gaze behavior away from their expectations. In particular, PredGaze uses the variance in the agent behavior model to express how well humans have sensed the agent's behavioral tendency; we expect this variance to improve the estimation of incongruity. PredGaze uses three variables to estimate the internal state of how strongly a human senses the agent's intention: error, confidence, and incongruity. To evaluate the effectiveness of PredGaze with these three variables, we conducted an experiment investigating the effect of the timing of gaze behavior change on incongruity. The results indicated significant differences in subjective scores of the naturalness of agents, and of incongruity with agents, depending on the timing of the agent's change in its gaze behavior.
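To make the error/confidence/incongruity idea concrete, here is a minimal sketch in Python. This is an illustrative assumption, not the paper's actual equations: the class name, the online-variance (Welford) update, and the specific forms of the confidence and incongruity terms are all hypothetical; the paper only states that variance in the learned behavior model modulates how strongly a prediction error is felt as incongruity.

```python
class PredGazeSketch:
    """Illustrative sketch (NOT the published PredGaze equations):
    track an agent's gaze tendency online and score how incongruous
    a new gaze observation feels, weighting prediction error by
    confidence in the learned tendency."""

    def __init__(self):
        self.n = 0        # number of gaze samples seen so far
        self.mean = 0.0   # running mean of the gaze feature
        self.m2 = 0.0     # running sum of squared deviations (Welford)

    def observe(self, gaze: float) -> float:
        # error: deviation of the new sample from the learned expectation
        error = abs(gaze - self.mean) if self.n > 0 else 0.0
        # confidence: grows as the observed variance shrinks
        # (hypothetical functional form)
        var = self.m2 / self.n if self.n > 1 else 1.0
        confidence = 1.0 / (1.0 + var)
        # incongruity: a large error is felt strongly only once the
        # behavioral tendency is well established (high confidence)
        incongruity = confidence * error
        # update the behavior model with Welford's online update
        self.n += 1
        delta = gaze - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (gaze - self.mean)
        return incongruity
```

Under this toy model, a sudden shift in gaze direction after a long stretch of consistent behavior yields a much higher incongruity score than the same shift early on, which mirrors the timing effect the experiment investigates.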

Original language: English
Title of host publication: 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 48-53
Number of pages: 6
ISBN (Electronic): 9781728160757
DOIs
Publication status: Published - 2020 Aug
Event: 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020 - Virtual, Naples, Italy
Duration: 2020 Aug 31 - 2020 Sep 4

Publication series

Name: 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020

Conference

Conference: 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020
Country: Italy
City: Virtual, Naples
Period: 20/8/31 - 20/9/4

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Social Psychology
  • Communication

