Abstract
To realize a vision system for an autonomous mobile robot operating in a human living environment, it is necessary to observe human behaviors and to react to those actions. In this paper, we propose a real-time human tracking method based on a vision system for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and a face region, or another region specific to humans, is extracted from the detected areas using color information. Next, facial gestures and head gestures are recognized. We implement the vision system on a mobile robot and experimentally show that the system can detect and track a human and his face in real time.
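The abstract outlines a motion-then-color pipeline: moving areas are detected first, and human-specific (face) regions are then extracted inside them by color. The sketch below is a minimal illustration of those first two stages using OpenCV frame differencing and HSV skin-color thresholding; the function name, thresholds, and skin-color range are assumptions for illustration and not the authors' implementation, and the gesture-recognition and tracking stages are not shown.

```python
import cv2

def detect_moving_face_regions(prev_gray, frame):
    """Illustrative sketch (not the paper's actual method):
    1) find moving areas by frame differencing,
    2) keep only moving pixels that also match a rough skin-color range,
    3) return bounding boxes of the resulting candidate face regions."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # 1. Moving areas: absolute difference with the previous frame, thresholded.
    diff = cv2.absdiff(prev_gray, gray)
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    motion_mask = cv2.dilate(motion_mask, None, iterations=2)

    # 2. Skin-color mask in HSV space (range chosen for illustration only).
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    skin_mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))

    # 3. Candidate face regions = moving AND skin-colored.
    face_mask = cv2.bitwise_and(motion_mask, skin_mask)
    contours, _ = cv2.findContours(face_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours
             if cv2.contourArea(c) > 500]  # drop small noise blobs
    return gray, boxes

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)              # any camera or video file
    ok, first = cap.read()
    prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        prev_gray, boxes = detect_moving_face_regions(prev_gray, frame)
        for x, y, w, h in boxes:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("candidate regions", frame)
        if cv2.waitKey(1) == 27:           # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()
```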
Original language | English |
---|---|
Pages | 442-449 |
Number of pages | 8 |
Publication status | Published - 2001 Dec 1 |
Externally published | Yes |
Event | 10th IEEE International Workshop on Robot and Human Communication - Bordeaux-Paris, France. Duration: 2001 Sep 18 → 2001 Sep 21 |
Other
Other | 10th IEEE International Workshop on Robot and Human Communication |
---|---|
Country | France |
City | Bordeaux-Paris |
Period | 2001/9/18 → 2001/9/21 |
ASJC Scopus subject areas
- Hardware and Architecture
- Software