Real-time vision system for autonomous mobile robot

Masataka Doi, Manabu Nakakita, Yoshimitsu Aoki, Shuji Hashimoto

Research output: Contribution to conference › Paper

8 Citations (Scopus)

Abstract

In order to realize a vision system for an autonomous mobile robot operating in a human living environment, it is necessary to observe human behavior and to react to those actions. In this paper, we propose a real-time human tracking method based on a vision system for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and a face region or other human-specific region is extracted from the detected area using color information. Next, facial gestures and head gestures are recognized. We implement the vision system on a mobile robot and show experimentally that the system can detect and track a human and his face in real time.
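The two-stage detection described in the abstract (find moving areas first, then keep only skin-colored pixels inside them as face candidates) can be sketched as follows. This is an illustrative reconstruction, not the authors' actual method: the frame-differencing threshold and the RGB skin rule below are assumed placeholder values.

```python
import numpy as np

def moving_regions(prev_frame, curr_frame, thresh=30):
    """Step 1 (assumed): simple frame differencing marks pixels that
    changed noticeably between two consecutive RGB frames."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff.max(axis=-1) > thresh  # boolean mask of moving pixels

def skin_mask(rgb):
    """Step 2 (assumed): crude RGB skin-color rule; the thresholds are
    hypothetical, not taken from the paper."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return ((r > 95) & (g > 40) & (b > 20) &
            (r > g) & (r > b) & (r - np.minimum(g, b) > 15))

def face_candidates(prev_frame, curr_frame):
    """Combine both steps: skin-colored pixels inside moving areas."""
    return moving_regions(prev_frame, curr_frame) & skin_mask(curr_frame)
```

On a real robot the candidate mask would then be grouped into connected regions and passed on to the gesture-recognition stage; that stage is not sketched here.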

Original language: English
Pages: 442-449
Number of pages: 8
Publication status: Published - 2001 Dec 1
Externally published: Yes
Event: 10th IEEE International Workshop on Robot and Human Communication - Bordeaux-Paris, France
Duration: 2001 Sep 18 - 2001 Sep 21

ASJC Scopus subject areas:

  • Hardware and Architecture
  • Software

Cite this

Doi, M., Nakakita, M., Aoki, Y., & Hashimoto, S. (2001). Real-time vision system for autonomous mobile robot. 442-449. Paper presented at 10th IEEE International Workshop on Robot and Human Communication, Bordeaux-Paris, France.