Title
Developing an eye-tracking algorithm as a potential tool for early diagnosis of autism spectrum disorder in children
Date Issued
2017
Access level
open access
Resource Type
journal article
Author(s)
Vargas-Cuentas, NI
Roman-Gonzalez, A
Barrientos, F
Ting, J
Hidalgo, D
Jensen, K
Publisher(s)
Public Library of Science
Abstract
Background: Autism spectrum disorder (ASD) currently affects nearly 1 in 160 children worldwide. In over two-thirds of evaluations, no validated diagnostics are used, and gold-standard diagnostic tools are used in less than 5% of evaluations. Currently, the diagnosis of ASD requires lengthy and expensive tests, in addition to clinical confirmation; therefore, fast, cheap, portable, and easy-to-administer screening instruments for ASD are needed. Several studies have shown that children with ASD have a lower preference for social scenes than children without ASD. Based on this, eye tracking and measurement of gaze preference for social scenes have been used as a screening tool for ASD. Currently available eye-tracking software requires intensive calibration, training, or holding of the head to prevent interference with gaze recognition, limiting its use in children with ASD.
Methods: In this study, we designed a simple eye-tracking algorithm that does not require calibration or head holding, as a platform for future validation of a cost-effective potential ASD screening instrument. The system operates on a portable and inexpensive tablet and measures children's gaze preference for social over abstract scenes. A child watches a one-minute stimulus video composed of a social scene projected on the left side and an abstract scene projected on the right side of the tablet's screen. We designed five stimulus videos by varying the social/abstract scenes, and each child watched all five videos in random order. The eye-tracking algorithm calculates the child's gaze preference for the social and abstract scenes, estimated as the percentage of the accumulated time the child observes the left or right side of the screen, respectively. Twenty-three children without a prior history of ASD and eight children with a clinical diagnosis of ASD were evaluated. The recorded video of each child's eye movements was analyzed both manually by an observer and automatically by our algorithm.
Results: This study demonstrates that the algorithm correctly differentiates visual preference for either the left or right side of the screen (social or abstract scene), identifies distractions, and maintains high accuracy compared with manual classification. The error of the algorithm was 1.52% relative to the gold standard of manual observation.
Discussion: This tablet-based gaze-preference/eye-tracking algorithm can estimate gaze preference in children with and without ASD to a high degree of accuracy, without the need for calibration, training, or restraint of the children. The system can be used in low-resource settings as a portable and cost-effective potential screening tool for ASD.
© 2017 Vargas-Cuentas et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
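The gaze-preference measure described in the abstract (percentage of accumulated time spent looking at the left versus right half of the screen, with distractions identified separately) can be illustrated with a minimal sketch. This is not the authors' implementation: the per-frame gaze labels, frame rate, and function name are assumptions for illustration only.

```python
# Hypothetical sketch of the gaze-preference computation described in the
# abstract. Assumes some upstream eye-tracker has already classified each
# video frame as "left" (social scene), "right" (abstract scene), or
# "away" (distraction). Labels, fps, and names are illustrative.

def gaze_preference(frame_labels, fps=30):
    """Return percentage of attended time per side, plus distracted seconds."""
    counts = {"left": 0, "right": 0, "away": 0}
    for label in frame_labels:
        counts[label] += 1
    attended = counts["left"] + counts["right"]
    if attended == 0:
        return {"social_pct": 0.0, "abstract_pct": 0.0,
                "distracted_s": counts["away"] / fps}
    return {
        "social_pct": 100.0 * counts["left"] / attended,
        "abstract_pct": 100.0 * counts["right"] / attended,
        "distracted_s": counts["away"] / fps,
    }

# One-minute video at 30 fps: 30 s social, 20 s abstract, 10 s distracted.
labels = ["left"] * 900 + ["right"] * 600 + ["away"] * 300
result = gaze_preference(labels)
print(result)  # social_pct 60.0, abstract_pct 40.0, distracted_s 10.0
```

Excluding "away" frames from the denominator keeps the social/abstract split from being diluted by distraction time, which is reported separately, mirroring the abstract's description of the algorithm identifying distractions alongside side preference.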
Volume
12
Issue
11
Number
28
Language
English
Scopus EID
2-s2.0-85036621016
PubMed ID
Source
PLoS ONE
ISSN of the container
1932-6203
Sponsor(s)
INNOVATE Peru, Contrato N 153 PNICP-PIAP-2015. CIENCIACTIVA, 01-2013-Fondecyt ‘Financiamiento de Subvención para Investigación Postdoctoral’. Additional funding was provided by the Wendy Klag grant from Johns Hopkins University and by the National Institutes of Health Office of the Director, Fogarty International Center, Office of AIDS Research, National Cancer Institute, National Heart, Lung, and Blood Institute, and the NIH Office of Research on Women’s Health through the Fogarty Global Health Fellows Program Consortium comprised of the University of North Carolina, Johns Hopkins, Morehouse, and Tulane (R25TW009340). MZ was a grantee of the Bill and Melinda Gates Foundation (OPP1140557). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. We want to thank the members of the lab who supported us throughout the process and allowed their children to be recorded for the initial step of proof of concept.
Sources of information:
Directorio de Producción Científica