COSPI: Contextual sensors for human-robot interaction

Kentaro Ishii, Michita Imai

Research output: Chapter in Book/Report/Conference proceedingConference contribution

Abstract

This paper proposes COSPI, a system that lets a communication robot process data from sensors placed in the environment. To realize smooth interaction between a person and a robot, the robot needs to recognize the objects around it, including people. However, the information acquired from sensors mounted on the robot is poor and limited to the robot's local surroundings, so it is difficult to realize smooth interaction relying on on-board sensors alone. In this research, we place sensors in the environment to obtain wide and rich environmental data around the robot. Because a communication robot must handle several tasks in parallel, it needs to process the sensor information simply in order to keep the interaction smooth. COSPI changes the data it sends to robots according to its internal state, so robots can react to data from COSPI in a simple manner. In this paper, we explain how COSPI's state is determined and how COSPI selects data from the sensors.
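The abstract's core idea — a mediator whose internal state determines which environmental sensor data is forwarded to the robot — can be sketched roughly as follows. This is a hypothetical illustration only: the class names, states (`"idle"`, `"interacting"`), and reading kinds are assumptions, not the paper's actual design.

```python
# Illustrative sketch (not the authors' implementation) of COSPI-style
# state-dependent selection of environmental sensor data.
from dataclasses import dataclass


@dataclass
class SensorReading:
    sensor_id: str
    kind: str     # hypothetical kinds: "person_position", "object_position"
    value: tuple  # e.g. (x, y) coordinates in the room


class Cospi:
    """Mediates between environmental sensors and the robot: depending on
    its internal state, it forwards only the data the robot needs, so the
    robot can react simply."""

    def __init__(self):
        self.state = "idle"  # hypothetical states: "idle", "interacting"

    def update_state(self, readings):
        # Switch to "interacting" whenever a person is detected nearby.
        if any(r.kind == "person_position" for r in readings):
            self.state = "interacting"
        else:
            self.state = "idle"

    def select_data(self, readings):
        # While interacting, forward only person-related data; otherwise
        # forward object data for ordinary space recognition.
        if self.state == "interacting":
            return [r for r in readings if r.kind == "person_position"]
        return [r for r in readings if r.kind == "object_position"]


readings = [
    SensorReading("cam1", "person_position", (1.0, 2.0)),
    SensorReading("cam2", "object_position", (3.0, 4.0)),
]
cospi = Cospi()
cospi.update_state(readings)
selected = cospi.select_data(readings)
```

The point of the mediation is that the robot never filters raw sensor streams itself; it only reacts to the pre-selected readings.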

Original language: English
Title of host publication: Proceedings of the IASTED International Conference on Robotics and Applications
Editors: M. Kamel
Pages: 136-141
Number of pages: 6
Publication status: Published - 2004
Event: Tenth IASTED International Conference on Robotics and Applications - Honolulu, HI, United States
Duration: 2004 Aug 23 - 2004 Aug 25



Keywords

  • Communicating Context
  • Human-Robot Interaction
  • Sensor Network
  • Space Recognition
  • Visual Sensor

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Ishii, K., & Imai, M. (2004). COSPI: Contextual sensors for human-robot interaction. In M. Kamel (Ed.), Proceedings of the IASTED International Conference on Robotics and Applications (pp. 136-141).