Eye gaze as a biomarker in the recognition of autism spectrum disorder using virtual reality and machine learning: A proof of concept for diagnosis.
A short VR eye-tracking game spotted autism with up to 86% accuracy, giving BCBAs a possible tech-aided screener.
01 Research in Context
What this study did
The team put a VR headset on 55 kids. Roughly half had autism; the rest were typically developing.
While the kids watched 3-D social scenes, the headset tracked where their eyes moved.
A computer program then looked for gaze patterns that distinguished the autism group from the typical group.
What they found
The program correctly picked out 91% of the autistic kids (sensitivity), with overall accuracy up to 86%. It missed fewer than one in ten.
Eye-gaze features beat chance by a wide margin. The authors call it a proof-of-concept biomarker.
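Under the hood, the study fed eye-gaze features into supervised classifiers with recursive feature selection. Here is a minimal sketch of that idea — synthetic data, invented feature names, and a simple nearest-centroid classifier standing in for the paper's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the study's data: rows = children, columns = gaze
# features (names are hypothetical, not the paper's actual feature set).
feature_names = ["pct_time_on_adult_faces", "pct_time_on_child_faces",
                 "pct_time_on_bodies", "n_gaze_frames", "fixation_count"]
n = 56
y = np.array([0] * 28 + [1] * 28)   # 0 = typical, 1 = autistic
X = rng.normal(size=(n, 5))
X[y == 1, 2] += 1.5                 # simulated group difference: more looking at bodies
X[y == 1, 3] += 1.0                 # and more gaze frames overall

def centroid_accuracy(X, y):
    """Leave-one-out accuracy of a nearest-class-centroid classifier."""
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c0 = X[mask & (y == 0)].mean(axis=0)
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
        hits += pred == y[i]
    return hits / len(y)

# Recursive feature elimination: repeatedly drop the feature whose removal
# hurts accuracy least, keeping the best-scoring subset seen so far.
features = list(range(X.shape[1]))
best_subset, best_acc = features[:], centroid_accuracy(X, y)
while len(features) > 1:
    scores = [(centroid_accuracy(X[:, [f for f in features if f != drop]], y), drop)
              for drop in features]
    acc, drop = max(scores)
    features.remove(drop)
    if acc >= best_acc:
        best_acc, best_subset = acc, features[:]

print("selected:", [feature_names[f] for f in best_subset])
print("LOO accuracy: %.2f" % best_acc)
```

On this toy data the elimination loop tends to discard the pure-noise columns and keep the two features that carry the simulated group difference; the real study's feature set and model family differ.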
How this fits with other research
Cheng et al. (2012) did an earlier VR lab study. They also watched autistic kids’ eyes, but to teach joint attention, not to diagnose.
Simões et al. (2020) used the same VR-plus-eye-tracking setup. They found autistic adults stand farther from avatars. Together the papers show VR eye data can flag both social distance and diagnosis risk.
Sirao et al. (2026) used machine learning on brain data, not eyes, and still separated autistic from typical preschoolers. Different signal, same end goal—objective classification.
Why it matters
You can’t strap an fNIRS cap on every toddler, but a 10-minute VR game is doable. If the results hold up in larger trials, clinics could screen with a headset instead of a long interview. Until then, try quick VR social scenes during intake and watch where the child looks—it may give you an extra data point for your assessment battery.
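If your headset logs which area of interest (AOI) each gaze sample lands on, the "where did they look" summary behind this kind of screening reduces to dwell-time proportions. A minimal sketch — the AOI labels and sample format here are invented, not the study's:

```python
from collections import Counter

def dwell_proportions(gaze_samples):
    """Fraction of gaze samples landing on each area of interest (AOI).

    gaze_samples: list of AOI labels, one per eye-tracker frame
    (labels are hypothetical, not the study's actual AOIs).
    """
    counts = Counter(gaze_samples)
    total = sum(counts.values())
    return {aoi: n / total for aoi, n in counts.items()}

# Example: a child who spends most frames on bodies rather than faces.
samples = (["adult_face"] * 12 + ["child_face"] * 8 +
           ["adult_body"] * 25 + ["object"] * 15)
props = dwell_proportions(samples)
print({k: round(v, 2) for k, v in sorted(props.items())})
# → {'adult_body': 0.42, 'adult_face': 0.2, 'child_face': 0.13, 'object': 0.25}
```

These proportions are the kind of simple, objective numbers you could file alongside the rest of an assessment battery.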
Practical takeaway: run a 5-minute VR bakery scene, note gaze shifts between faces and objects, and file the pattern with your other assessment data.
03 Original abstract
The core symptoms of autism spectrum disorder (ASD) mainly relate to social communication and interactions. ASD assessment involves expert observations in neutral settings, which introduces limitations and biases related to lack of objectivity and does not capture performance in real-world settings. To overcome these limitations, advances in technologies (e.g., virtual reality) and sensors (e.g., eye-tracking tools) have been used to create realistic simulated environments and track eye movements, enriching assessments with more objective data than can be obtained via traditional measures. This study aimed to distinguish between autistic and typically developing children using visual attention behaviors through an eye-tracking paradigm in a virtual environment as a measure of attunement to and extraction of socially relevant information. Fifty-five children participated. Autistic children presented a higher number of frames, both overall and per scenario, and showed higher visual preferences for adults over children, as well as specific preferences for adults' rather than children's faces, and looked more at bodies. A set of multivariate supervised machine learning models was developed using recursive feature selection to recognize ASD based on extracted eye gaze features. The models achieved up to 86% accuracy (sensitivity = 91%) in recognizing autistic children. Our results should be taken as preliminary due to the relatively small sample size and the lack of an external replication dataset. However, to our knowledge, this constitutes a first proof of concept in the combined use of virtual reality, eye-tracking tools, and machine learning for ASD recognition. LAY SUMMARY: Core symptoms in children with ASD involve social communication and interaction. ASD assessment includes expert observations in neutral settings, which show limitations and biases related to lack of objectivity and do not capture performance in real settings.
To overcome these limitations, this work aimed to distinguish between autistic and typically developing children in visual attention behaviors through an eye-tracking paradigm in a virtual environment as a measure of attunement to, and extraction of, socially relevant information.
Autism research : official journal of the International Society for Autism Research, 2022 · doi:10.1002/aur.2636