In order to realize a vision system for an autonomous mobile robot operating in a human living environment, it is necessary to observe human behaviors and to react to those actions. In this paper, we propose a real-time human tracking method based on a vision system for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and a face region or other human-specific region is extracted within the detected area using color information. Next, facial gestures and head gestures are recognized. We implement the vision system on a mobile robot and experimentally show that the system can detect and track a human and his face in real time.
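The two-stage detection described above (motion detection, then color-based extraction of the face region within the moving area) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the frame-differencing threshold and the classic RGB skin-color rules used here are assumptions chosen for the example.

```python
import numpy as np

def moving_mask(prev, curr, thresh=25):
    """Stage 1: mark pixels whose intensity changed more than
    `thresh` between consecutive frames (simple frame differencing)."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)).max(axis=2)
    return diff > thresh

def skin_mask(rgb):
    """Stage 2: rough RGB skin-color heuristic (illustrative thresholds,
    not the thresholds used in the paper)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & ((r - g) > 15)

def face_candidates(prev, curr):
    """Intersect the motion and skin-color masks: pixels that are both
    moving and skin-colored are candidate face/body-part regions."""
    return moving_mask(prev, curr) & skin_mask(curr)

# Usage with two synthetic frames: a skin-colored pixel appears at (1, 1).
prev = np.zeros((4, 4, 3), dtype=np.uint8)
curr = prev.copy()
curr[1, 1] = [200, 120, 90]
mask = face_candidates(prev, curr)  # True only at the moving skin pixel
```

In practice the candidate mask would be cleaned up (e.g. by connected-component analysis) before the gesture-recognition stage, but the intersection of motion and color cues is the core idea the abstract describes.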
Publication status: Published - 1 December 2001
Event: 10th IEEE International Workshop on Robot and Human Communication - Bordeaux-Paris, France
Duration: 18 September 2001 → 21 September 2001