Driver state monitoring is a key technology for implementing driver assistance systems. Drivers rely mainly on visual information when operating their vehicle, so detecting the driver's head pose and gaze direction is an effective way to estimate the presence or absence of driving errors. This study uses the 3D movement of the head as the driver's state and proposes a method in which the head pose is estimated from stereo images produced by multiple cameras installed around the dashboard in front of the driver's seat. We focus on the area around the nostrils and estimate its 3D position from face images: because the camera installation location is restricted to the area around the dashboard, the face is photographed from a lower left/right angle, making the nostrils the most stable feature in this situation. First, we locate the nostrils using template matching and reconstruct their 3D position. Then we precisely estimate the 3D head pose by particle filtering with a 3D model-based method. A constraint fixing the nostril 3D position is added to the motion model used to predict the particle hypotheses. By reducing the size of the state space in this way, and thus shortening the sampling interval, we can track the head pose accurately and at high speed even when the number of particles is small. In addition, we estimate the gaze direction as the vector from the estimated eyeball center to the measured pupil center. The 3D position of the pupil center is obtained by stereo measurement using the 2D coordinates extracted from each image, and the 3D position of the eyeball center is estimated by transforming the initial position of the eyeball model according to the head pose estimate.
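The stereo measurement step described above can be illustrated with a minimal sketch of linear (DLT) triangulation: given the 2D coordinates of a feature (e.g. a nostril or pupil center) extracted from two camera images, its 3D position is recovered from the two projection matrices. The camera intrinsics, baseline, and point coordinates below are hypothetical assumptions for illustration, not values from the paper.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) pixel coordinates of the same feature in each image.
    Returns the 3D point in world coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical stereo rig: shared intrinsics, 0.1 m baseline along x.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Hypothetical feature point roughly 0.6 m in front of the cameras.
X_true = np.array([0.05, -0.02, 0.6])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
X_est = triangulate(P1, P2, x1, x2)
```

In the noiseless case the estimate matches the true point exactly; with real detections, the least-squares form absorbs small pixel errors from template matching.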
|Publication status||Published - 2008 Dec 1|
|Event||32nd FISITA World Automotive Congress 2008 - Munich, Germany|
Duration: 2008 Sep 14 → 2008 Sep 19
|Other||32nd FISITA World Automotive Congress 2008|
|Period||08/9/14 → 08/9/19|