Is a Robot a Better Walking Partner if It Associates Utterances with Visual Scenes?

Ryusuke Totsuka, Satoru Satake, Takayuki Kanda, Michita Imai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Citations (Scopus)

Abstract

We aim to develop a walking partner robot with the capability to select small-talk topics that are associated with visual scenes. We first collected video sequences from five different locations and prepared a dataset of small-talk topics associated with visual scenes. We then developed a technique to associate the visual scenes with the small-talk topics: visual scenes are converted into lists of words using an off-the-shelf vision library, and a topic space is formed with Latent Dirichlet Allocation (LDA), in which a list of words is transformed into a topic vector. Finally, the system selects the utterance whose topic vector is most similar to that of the scene. We tested the developed technique on the dataset, where it selected appropriate utterances 72% of the time, and conducted an outdoor user study in which participants took a walk with a small robot on their shoulder and engaged in small talk. We confirmed that participants rated the robot using our technique more highly than a robot that randomly selected utterances, because it selected more appropriate utterances. They also felt that the former robot was a better walking partner.
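The pipeline described in the abstract (scene words → LDA topic vector → nearest-utterance selection) can be sketched roughly as follows. This is a minimal illustration using scikit-learn, not the authors' implementation; the corpus, the number of topics, and the word lists assumed to come from the vision library are placeholders.

```python
# Minimal sketch of the abstract's pipeline: scene words -> LDA topic vector
# -> select the candidate utterance with the most similar topic vector.
# Not the authors' code; corpus, topic count, and word lists are placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical training corpus: word lists produced by a vision library plus
# candidate small-talk utterances, one document per string.
corpus = [
    "cherry blossom tree park bench spring",
    "river bridge water boat reflection",
    "shop sign street crowd bicycle",
    "The cherry blossoms are beautiful this time of year.",
    "Look at that boat going under the bridge.",
    "This street is always so lively, isn't it?",
]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)

# Form the topic space (number of topics chosen arbitrarily here).
lda = LatentDirichletAllocation(n_components=5, random_state=0)
lda.fit(X)

def topic_vector(text):
    """Transform a word list or utterance into its LDA topic vector."""
    return lda.transform(vectorizer.transform([text]))

def select_utterance(scene_words, candidate_utterances):
    """Pick the utterance whose topic vector is most similar to the scene's."""
    scene_vec = topic_vector(scene_words)
    sims = [cosine_similarity(scene_vec, topic_vector(u))[0, 0]
            for u in candidate_utterances]
    return candidate_utterances[max(range(len(sims)), key=sims.__getitem__)]

# Example: words emitted by the vision library for the current camera frame.
scene = "cherry blossom tree park path spring"
print(select_utterance(scene, corpus[3:]))
```

In this sketch, cosine similarity over topic vectors stands in for the paper's similarity criterion; the actual feature extraction, topic count, and utterance pool used in the study are described in the paper itself.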

Original language: English
Title of host publication: HRI 2017 - Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction
Publisher: IEEE Computer Society
Pages: 313-322
Number of pages: 10
ISBN (Electronic): 9781450343367
DOIs
Publication status: Published - 2017 Mar 6
Event: 12th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2017 - Vienna, Austria
Duration: 2017 Mar 6 - 2017 Mar 9

Publication series

Name: ACM/IEEE International Conference on Human-Robot Interaction
Volume: Part F127194
ISSN (Electronic): 2167-2148

Other

Other: 12th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2017
Country/Territory: Austria
City: Vienna
Period: 17/3/6 - 17/3/9

Keywords

  • association of utterance and visual scene
  • walking partner robot

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Electrical and Electronic Engineering
