Assessment & Research

Seeing through a robot's eyes: A cross-sectional exploratory study in developing a robotic screening technology for autism.

So et al. (2024) · Autism Research: Official Journal of the International Society for Autism Research
★ The Verdict

A small robot that tracks eye contact can screen for autism with accuracy on par with established screening tools.

✓ Read this if you are a BCBA who runs intake clinics or preschool screening days.
✗ Skip if you only serve adolescents and adults.

01 Research in Context

01

What this study did

So et al. (2024) built a small robot called HUMANE. The robot watches a child's eyes while it narrates a story. It counts how often the child needs a prompt to look back and how long the child looks away.

The team tested the robot with autistic and neurotypical children. They wanted to know whether the robot could screen for autism as well as human raters could.

02

What they found

The robot reached high reliability and correctly flagged most children with autism. Sensitivity and specificity were both above 0.88. The diagnostic odds ratio topped 190. In plain words, the robot rarely missed autism and rarely called a typical child autistic.
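These metrics connect in a fixed way. A minimal sketch (not from the paper) shows how sensitivity and specificity combine into a diagnostic odds ratio; the function name and any values beyond the reported 0.88 floor are illustrative assumptions:

```python
# Illustrative sketch: how sensitivity and specificity combine
# into a Diagnostic Odds Ratio (DOR).

def diagnostic_odds_ratio(sensitivity: float, specificity: float) -> float:
    """Odds of a positive test among cases vs. among non-cases."""
    positive_lr = sensitivity / (1 - specificity)   # likelihood ratio, positive result
    negative_lr = (1 - sensitivity) / specificity   # likelihood ratio, negative result
    return positive_lr / negative_lr

# At the study's reported floor of 0.88 for both metrics:
print(round(diagnostic_odds_ratio(0.88, 0.88), 1))    # ≈ 53.8

# A DOR above 190, as reported, implies sensitivity and specificity
# well above that 0.88 floor (roughly 0.93-0.94 if the two are equal):
print(round(diagnostic_odds_ratio(0.935, 0.935), 1))  # ≈ 206.9
```

The takeaway: a DOR beyond 190 is not just "above 0.88 twice over"; it signals that both index tests performed substantially better than their reported floor.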

03

How this fits with other research

Kumazaki et al. (2019) tried a two-robot setup five years earlier. Their robots gave social communication scores that matched ADOS. So et al. (2024) move the field forward by using eye gaze instead of social talk, and they report stronger numbers.

Rojahn et al. (2012) warned that robot tools for autism were still “exploratory” and weak. The new study answers that call by showing solid, publishable psychometrics. It does not contradict the old warning; it simply shows the field has grown up.

Zhou et al.'s (2025) meta-analysis found medium effects for XR autism tools but weaker evidence for robots used in therapy. So et al. (2024) shift the robot's role from therapy to screening and deliver the clear psychometric data reviewers asked for.

04

Why it matters

You now have a robot that can screen reliably in minutes. Use it while families wait for a full ADOS slot. The robot never gets tired, so you can run it twice if needed. Start thinking of robots as front-line triage, not toys.

→ Action — try this Monday

Ask your clinic director if you can pilot the free HUMANE software tablet version during next month’s screenings.

02 At a glance

Intervention: not applicable
Design: other
Sample size: 199
Population: autism spectrum disorder, neurotypical
Finding: strongly positive
Magnitude: large

03 Original abstract

The present exploratory cross-sectional case-control study sought to develop a reliable and scalable screening tool for autism using a social robot. The robot HUMANE, installed with computer vision and linked with recognition technology, detected the direction of eye gaze of children. Children aged 3-8 (M = 5.52; N = 199) participated, 87 of whom had been confirmed with autism, 55 of whom were suspected to have autism, and 57 of whom were not considered to cause any concern for having autism. Before a session, a human experimenter instructed HUMANE to narrate a story to a child. HUMANE prompted the child to return his/her eye gaze to the robot if the child looked away, and praised the child when it re-established its eye gaze quickly after a prompt. The reliability of eye gaze detection was checked across all pairs of human raters and HUMANE and reached 0.90, indicating excellent interrater agreement. Using the pre-specified reference standard (Autism Spectrum Quotient), the sensitivity and specificity of the index tests (i.e., the number of robot prompts and duration of inattentiveness) reached 0.88 or above and the Diagnostic Odds Ratios were beyond 190. These results show that social robots may detect atypical eye patterns, suggesting a potential future for screening autism using social robots.

Autism Research: Official Journal of the International Society for Autism Research, 2024 · doi:10.1002/aur.3087