A robust gesture recognition based on depth data

Lee Jaemin, Hironori Takimoto, Hitoshi Yamauchi, Akihiro Kanazawa, Yasue Mitsukura

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

18 Citations (Scopus)

Abstract

In this paper, we propose a novel method for gesture recognition using depth data captured by a Microsoft Kinect sensor. Conventionally, the features used for gesture recognition are divided into two parts: hand shape and arm movement. Conventional methods use only two-dimensional hand features, even though the human hand is a multi-joint structure. Furthermore, conventional arm-movement features are affected by environmental changes, such as individual differences in body size and camera position. Therefore, to assist recognition, a feature extraction method is proposed that describes the hand shape with a 3D feature and the arm movement with angles between body joints. To show the effectiveness of the proposed method, gesture recognition performance is compared with that of conventional methods using the Japanese language.
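The arm-movement feature mentioned in the abstract is based on angles between body joints, which, unlike raw 2D positions, do not depend on camera placement or body size. As a rough illustration only (not the authors' implementation), the following Python sketch computes such a joint angle from hypothetical Kinect skeleton coordinates; the joint names and values are assumptions for the example.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by the 3-D points a-b-c."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Hypothetical skeleton frame: joint name -> (x, y, z) in metres.
frame = {
    "shoulder_right": (0.30, 0.50, 2.00),
    "elbow_right":    (0.45, 0.30, 1.95),
    "wrist_right":    (0.55, 0.10, 1.90),
}

# Elbow angle as one arm-movement feature; being a relative quantity,
# it is unaffected by where the camera sits or how large the subject is.
elbow = joint_angle(frame["shoulder_right"],
                    frame["elbow_right"],
                    frame["wrist_right"])
print(f"elbow angle: {np.degrees(elbow):.1f} deg")
```

A sequence of such angles per frame could then serve as the observation sequence for an HMM-based classifier, in line with the "HMM" keyword listed below.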

Original language: English
Title of host publication: FCV 2013 - Proceedings of the 19th Korea-Japan Joint Workshop on Frontiers of Computer Vision
Pages: 127-131
Number of pages: 5
DOIs
Publication status: Published - 2013 Apr 15
Event: 19th Korea-Japan Joint Workshop on Frontiers of Computer Vision, FCV 2013 - Incheon, Korea, Republic of
Duration: 2013 Jan 30 - 2013 Feb 1

Publication series

Name: FCV 2013 - Proceedings of the 19th Korea-Japan Joint Workshop on Frontiers of Computer Vision

Other

Other: 19th Korea-Japan Joint Workshop on Frontiers of Computer Vision, FCV 2013
Country/Territory: Korea, Republic of
City: Incheon
Period: 13/1/30 - 13/2/1

Keywords

  • HMM
  • Image processing
  • depth sensor
  • hand gesture recognition

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
