Analysis and Recognition of a Human Head's Movement from an Image Sequence

Yuichi Abe, Masafumi Hagiwara

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

In this paper, we propose a method for analyzing and recognizing human head movements from a monocular image sequence under few constraints. First, a background image photographed beforehand is used to extract the human region from each frame. Next, the skin-colored regions are extracted, and pixels of low brightness and low saturation are marked as candidate eye regions. The face region is then determined from among the multiple skin-colored regions, and the hair region is extracted based on the face region. Finally, the left and right eye regions are selected from the candidate eye regions. In this way, information effective for posture estimation, such as the eye positions, face region, and hair region, is extracted from the input image. A genetic algorithm (GA) is applied to fit this extracted information to a 3D head model and thereby estimate the head posture. Because only an approximate 3D head model is required, the system adapts to an arbitrarily large number of subjects. Since the input is an image sequence, information from the previous frame can also be used: a priori knowledge of the possible human movements and joint angles is reflected in the parameters. This alleviates the shortage of information inherent in the problem of 3D posture estimation from a monocular image sequence.
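The early stages of the pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the thresholds, the normalized-RGB skin rule, and the function name are assumptions introduced here. The eye-candidate test does follow the abstract's criterion of low brightness and low saturation (HSV V and S).

```python
import numpy as np

def extract_regions(frame, background, diff_thresh=30, v_eye=0.35, s_eye=0.25):
    """Sketch of the first stages: background subtraction, skin-color
    masking, and eye-candidate masking. All thresholds and the
    normalized-RGB skin rule are illustrative assumptions."""
    # 1. Human region: pixels that differ from the pre-captured background.
    human = (np.abs(frame.astype(int) - background.astype(int))
             .max(axis=2) > diff_thresh)

    rgb = frame.astype(float) / 255.0
    total = rgb.sum(axis=2) + 1e-9
    r_n, g_n = rgb[..., 0] / total, rgb[..., 1] / total

    # 2. Skin-colored pixels inside the human region
    #    (a common normalized-RGB rule of thumb, not the paper's rule).
    skin = human & (r_n > 0.35) & (r_n < 0.55) & (g_n > 0.28) & (g_n < 0.36)

    # 3. Eye candidates: low brightness AND low saturation,
    #    as stated in the abstract (HSV V and S components).
    v = rgb.max(axis=2)
    s = np.where(v > 0, (v - rgb.min(axis=2)) / np.maximum(v, 1e-9), 0.0)
    eyes = human & (v < v_eye) & (s < s_eye)

    return human, skin, eyes
```

The face region would then be chosen from the connected skin components, the hair region grown from it, and the left/right eyes picked from the candidate mask; those later stages, and the GA fit to the 3D head model, are omitted here.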

Original language: English
Pages (from-to): 36-45
Number of pages: 10
Journal: Systems and Computers in Japan
Volume: 32
Issue number: 5
DOIs
Publication status: Published - 2001 May

Keywords

  • Genetic algorithms
  • Head motion
  • Image recognition
  • Monocular view

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Hardware and Architecture
  • Computational Theory and Mathematics

