Title
Data collection of 3D spatial features of gestures from the static Peruvian Sign Language alphabet for sign language recognition
Date Issued
21 October 2020
Access level
Metadata-only access
Resource Type
conference paper
Author(s)
Nurena-Jara R.
Ramos-Carrion C.
Shiguihara-Juarez P.
Publisher(s)
Institute of Electrical and Electronics Engineers Inc.
Abstract
Peruvian Sign Language (PSL) recognition is approached as a classification problem. Previous work has employed 2D features derived from hand positions to tackle this problem. In this paper, we propose a method to construct a dataset consisting of the 3D spatial positions of static gestures from the PSL alphabet, using the HTC Vive device and a well-known technique that extracts 21 keypoints from the hand to obtain a feature vector. A dataset of 35,400 gesture instances for PSL was constructed, and a novel way to extract the data was described. To validate the appropriateness of this dataset, four baseline classifiers were compared on the Peruvian Sign Language Recognition (PSLR) task, achieving an average F1 measure of 99.32% in the best case.
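The abstract describes a pipeline in which each static gesture is represented by the 3D positions of 21 hand keypoints (a 63-value feature vector) and then classified by several baseline models evaluated with the F1 measure. The sketch below illustrates that kind of pipeline; the scikit-learn classifiers, the class count, and the placeholder data are assumptions made for illustration only and do not reproduce the authors' dataset or code.

# Minimal sketch of the classification setup described in the abstract.
# Assumptions (not from the paper): scikit-learn baseline classifiers,
# a feature vector of 21 hand keypoints x 3 coordinates (x, y, z),
# and synthetic placeholder data standing in for the 35,400-instance dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

N_KEYPOINTS = 21               # keypoints extracted per hand
N_FEATURES = N_KEYPOINTS * 3   # x, y, z position for each keypoint
N_CLASSES = 24                 # static letters of the PSL alphabet (assumed)

# Placeholder data: replace with the real 3D keypoint dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, N_FEATURES))
y = rng.integers(0, N_CLASSES, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# Four baseline classifiers, standing in for the comparison reported in the paper.
baselines = {
    "svm": SVC(),
    "knn": KNeighborsClassifier(),
    "random_forest": RandomForestClassifier(),
    "logistic_regression": LogisticRegression(max_iter=1000),
}

for name, clf in baselines.items():
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    # Macro-averaged F1 over all gesture classes.
    print(f"{name}: F1 = {f1_score(y_test, pred, average='macro'):.4f}")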
Language
English
OCDE Knowledge area
Other engineering and technologies
Scopus EID
2-s2.0-85097807930
Resource of which it is part
Proceedings of the 2020 IEEE Engineering International Research Conference, EIRCON 2020
ISBN of the container
978-1-7281-8367-1
Conference
2020 IEEE Engineering International Research Conference (EIRCON 2020), Lima, 21 October 2020 through 23 October 2020
Sources of information: Directorio de Producción Científica Scopus