Providing route directions: Design of robot's utterance, gesture, and timing

Yusuke Okuno, Takayuki Kanda, Michita Imai, Hiroshi Ishiguro, Norihiro Hagita

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

43 Citations (Scopus)

Abstract

Providing route directions is a complicated interaction. Utterances are combined with gestures and delivered with appropriate timing. This study proposes a model for a robot that generates route directions by integrating three crucial elements: utterances, gestures, and timing. Two research questions must be answered in this modeling process. First, is it useful for the robot to perform a gesture even though the information conveyed by the gesture is also given in the utterance? Second, is it useful to implement the timing with which humans speak? Many previous studies on the natural behavior of computers and robots have learned from human speakers, for example in gestures and speech timing. Our approach differs from such previous studies: we emphasized the listener's perspective. Gestures were designed based on their usefulness, although we were influenced by the basic structure of human gestures. Timing was modeled not on how humans speak but on how they listen. The experimental results demonstrated the effectiveness of our approach, not only for task efficiency but also for perceived naturalness.
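
The abstract describes the model only at a high level and gives no implementation details. As a purely illustrative aid, the sketch below shows one way utterance, redundant deictic gesture, and listener-oriented timing could be combined in a single direction-giving behavior. Everything here (the RouteSegment structure, the Robot.speak and Robot.point placeholders, and the pause length LISTENER_PAUSE_S) is an assumption introduced for this example and is not the authors' implementation.

```python
# Hypothetical sketch only: names and values below are assumptions for illustration,
# not the implementation from the paper.

from dataclasses import dataclass
from typing import Optional
import time


@dataclass
class RouteSegment:
    utterance: str                     # what the robot says, e.g. "Turn left at the corner"
    gesture_direction: Optional[str]   # redundant deictic gesture ("left", "right", ...) or None


class Robot:
    """Stand-in for a real robot API; speak() and point() are placeholders."""

    def speak(self, text: str) -> None:
        print(f"[speech] {text}")

    def point(self, direction: str) -> None:
        print(f"[gesture] pointing {direction}")


# Pause after each segment so the listener can process it, rather than pacing
# delivery on how a human speaker would naturally talk (an assumed value).
LISTENER_PAUSE_S = 1.5


def give_route_directions(robot: Robot, segments: list[RouteSegment]) -> None:
    for segment in segments:
        if segment.gesture_direction is not None:
            # start the redundant deictic gesture before the utterance
            robot.point(segment.gesture_direction)
        robot.speak(segment.utterance)
        time.sleep(LISTENER_PAUSE_S)   # listener-oriented timing between segments


if __name__ == "__main__":
    route = [
        RouteSegment("Go straight down this hallway.", "forward"),
        RouteSegment("Turn left at the second corner.", "left"),
        RouteSegment("The meeting room is the third door on your right.", "right"),
    ]
    give_route_directions(Robot(), route)
```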

Original language: English
Title of host publication: Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09
Pages: 53-60
Number of pages: 8
DOIs: https://doi.org/10.1145/1514095.1514108
Publication status: Published - 2009
Externally published: Yes
Event: 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09 - San Diego, CA, United States
Duration: 2009 Mar 11 - 2009 Mar 13

Other

Other: 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09
Country: United States
City: San Diego, CA
Period: 09/3/11 - 09/3/13

Keywords

  • Gesture
  • Route directions
  • Timing

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Electrical and Electronic Engineering

Cite this

Okuno, Y., Kanda, T., Imai, M., Ishiguro, H., & Hagita, N. (2009). Providing route directions: Design of robot's utterance, gesture, and timing. In Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09 (pp. 53-60). https://doi.org/10.1145/1514095.1514108

Providing route directions: Design of robot's utterance, gesture, and timing. / Okuno, Yusuke; Kanda, Takayuki; Imai, Michita; Ishiguro, Hiroshi; Hagita, Norihiro.

Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09. 2009. p. 53-60.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Okuno, Y, Kanda, T, Imai, M, Ishiguro, H & Hagita, N 2009, Providing route directions: Design of robot's utterance, gesture, and timing. in Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09. pp. 53-60, 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09, San Diego, CA, United States, 09/3/11. https://doi.org/10.1145/1514095.1514108
Okuno Y, Kanda T, Imai M, Ishiguro H, Hagita N. Providing route directions: Design of robot's utterance, gesture, and timing. In Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09. 2009. p. 53-60. https://doi.org/10.1145/1514095.1514108
Okuno, Yusuke; Kanda, Takayuki; Imai, Michita; Ishiguro, Hiroshi; Hagita, Norihiro. / Providing route directions: Design of robot's utterance, gesture, and timing. Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09. 2009. pp. 53-60
@inproceedings{6c47cc613f034ab8acd0b18978a1a11c,
title = "Providing route directions: Design of robot's utterance, gesture, and timing",
abstract = "Providing route directions is a complicated interaction. Utterances are combined with gestures and delivered with appropriate timing. This study proposes a model for a robot that generates route directions by integrating three crucial elements: utterances, gestures, and timing. Two research questions must be answered in this modeling process. First, is it useful for the robot to perform a gesture even though the information conveyed by the gesture is also given in the utterance? Second, is it useful to implement the timing with which humans speak? Many previous studies on the natural behavior of computers and robots have learned from human speakers, for example in gestures and speech timing. Our approach differs from such previous studies: we emphasized the listener's perspective. Gestures were designed based on their usefulness, although we were influenced by the basic structure of human gestures. Timing was modeled not on how humans speak but on how they listen. The experimental results demonstrated the effectiveness of our approach, not only for task efficiency but also for perceived naturalness.",
keywords = "Gesture, Route directions, Timing",
author = "Yusuke Okuno and Takayuki Kanda and Michita Imai and Hiroshi Ishiguro and Norihiro Hagita",
year = "2009",
doi = "10.1145/1514095.1514108",
language = "English",
isbn = "9781605584041",
pages = "53--60",
booktitle = "Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09",

}

TY - GEN

T1 - Providing route directions

T2 - Design of robot's utterance, gesture, and timing

AU - Okuno, Yusuke

AU - Kanda, Takayuki

AU - Imai, Michita

AU - Ishiguro, Hiroshi

AU - Hagita, Norihiro

PY - 2009

Y1 - 2009

N2 - Providing route directions is a complicated interaction. Utterances are combined with gestures and delivered with appropriate timing. This study proposes a model for a robot that generates route directions by integrating three crucial elements: utterances, gestures, and timing. Two research questions must be answered in this modeling process. First, is it useful for the robot to perform a gesture even though the information conveyed by the gesture is also given in the utterance? Second, is it useful to implement the timing with which humans speak? Many previous studies on the natural behavior of computers and robots have learned from human speakers, for example in gestures and speech timing. Our approach differs from such previous studies: we emphasized the listener's perspective. Gestures were designed based on their usefulness, although we were influenced by the basic structure of human gestures. Timing was modeled not on how humans speak but on how they listen. The experimental results demonstrated the effectiveness of our approach, not only for task efficiency but also for perceived naturalness.

AB - Providing route directions is a complicated interaction. Utterances are combined with gestures and delivered with appropriate timing. This study proposes a model for a robot that generates route directions by integrating three crucial elements: utterances, gestures, and timing. Two research questions must be answered in this modeling process. First, is it useful for the robot to perform a gesture even though the information conveyed by the gesture is also given in the utterance? Second, is it useful to implement the timing with which humans speak? Many previous studies on the natural behavior of computers and robots have learned from human speakers, for example in gestures and speech timing. Our approach differs from such previous studies: we emphasized the listener's perspective. Gestures were designed based on their usefulness, although we were influenced by the basic structure of human gestures. Timing was modeled not on how humans speak but on how they listen. The experimental results demonstrated the effectiveness of our approach, not only for task efficiency but also for perceived naturalness.

KW - Gesture

KW - Route directions

KW - Timing

UR - http://www.scopus.com/inward/record.url?scp=67650688965&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=67650688965&partnerID=8YFLogxK

U2 - 10.1145/1514095.1514108

DO - 10.1145/1514095.1514108

M3 - Conference contribution

AN - SCOPUS:67650688965

SN - 9781605584041

SP - 53

EP - 60

BT - Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09

ER -