Title
A robust gesture recognition using hand local data and skeleton trajectory
Date Issued
09 December 2015
Access level
Metadata-only access
Resource Type
conference paper
Author(s)
Universidade Federal de Ouro Preto
Universidade Federal de Ouro Preto
Publisher(s)
IEEE Computer Society
Abstract
In this paper, we propose a new approach for dynamic hand gesture recognition using intensity, depth, and skeleton joint data captured by the Kinect™ sensor. The proposed approach integrates global and local information of a dynamic gesture. First, we represent the 3D skeleton trajectory in spherical coordinates. Then, we extract the key frames corresponding to the points with the largest angular and distance differences. In each key frame, we compute the spherical distances from the hands, wrists, and elbows to the shoulder center, and we record the changes in hand position to obtain the global information. Finally, we segment the hands and apply the SIFT descriptor to the intensity and depth data; a Bag of Visual Words (BoW) approach is then used to extract the local information. The system was tested on the ChaLearn 2013 gesture dataset and on our own Brazilian Sign Language dataset, achieving accuracies of 88.39% and 98.28%, respectively.
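The abstract's first step, representing skeleton joints in spherical coordinates relative to the shoulder center, can be sketched as follows. This is an illustrative reading of that step, not the authors' code; the joint and origin tuples are hypothetical inputs.

```python
import math

def to_spherical(joint, origin):
    """Convert a 3D joint position (x, y, z) to spherical coordinates
    (r, theta, phi) relative to an origin joint, e.g. the shoulder center."""
    dx, dy, dz = (joint[i] - origin[i] for i in range(3))
    r = math.sqrt(dx * dx + dy * dy + dz * dz)     # radial distance to origin
    theta = math.acos(dz / r) if r > 0 else 0.0    # polar angle from the z-axis
    phi = math.atan2(dy, dx)                       # azimuthal angle in the x-y plane
    return r, theta, phi

# Example: a hand joint one unit to the right of the shoulder center
# lies at r = 1 with a polar angle of pi/2 and zero azimuth.
r, theta, phi = to_spherical((1.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

The radial distance `r` corresponds to the spherical distances the paper computes from the hands, wrists, and elbows to the shoulder center.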
Start page
1240
End page
1244
Volume
2015-December
Language
English
OCDE Knowledge area
Automation systems, Control systems
Scopus EID
2-s2.0-84956644385
ISBN
978-1-4799-8339-1
Source
Proceedings - International Conference on Image Processing, ICIP
Resource of which it is part
Proceedings - International Conference on Image Processing, ICIP
ISSN of the container
1522-4880
ISBN of the container
978-1-4799-8339-1
Conference
IEEE International Conference on Image Processing, ICIP 2015
Sources of information: Directorio de Producción Científica Scopus