Smartphone-based gaze estimation for in-home autism research.
A smartphone camera can spot the classic autism gaze signature as well as a lab eye-tracker, letting you collect data on the family sofa.
01 Research in Context
What this study did
Kim et al. (2024) asked whether an ordinary smartphone can track where kids look.
They filmed kids with autism and 12 typically developing kids while each child watched a short video.
Half the clips ran in a lab, half ran later in the child’s own living room.
The phone’s front camera estimated the gaze point; a $30,000 lab eye-tracker ran at the same time for comparison.
What they found
Phone gaze was off by less than one degree—about the width of a pencil tip held at arm’s length.
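That sub-1° figure translates into on-screen distance with basic trigonometry: error on screen equals viewing distance times the tangent of the angular error. A minimal sketch, assuming a typical ~30 cm phone viewing distance (the function name and distance are illustrative, not from the paper):

```python
import math

def gaze_error_cm(viewing_distance_cm: float, error_deg: float) -> float:
    """On-screen gaze error (cm) for a given angular error at a given viewing distance."""
    return viewing_distance_cm * math.tan(math.radians(error_deg))

# At ~30 cm from the screen, a 1-degree error spans about half a centimeter.
print(round(gaze_error_cm(30.0, 1.0), 2))  # ~0.52
```

At phone-screen face sizes of several centimeters, half a centimeter of error is easily small enough to tell a look at a face from a look at the background.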
Kids with autism spent less time on faces and more time on background objects, matching the expensive lab machine beat for beat.
Parents set the phone on the couch arm; no wires, no chin rest, no travel.
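The face-time metric behind that finding is conceptually simple: classify each gaze sample by whether it lands inside a face region, then take the share that do. A minimal sketch under that assumption (the function name and coordinates are illustrative, not the authors' code):

```python
def fraction_on_face(gaze_points, face_box):
    """Share of gaze samples landing inside a face bounding box.

    gaze_points: list of (x, y) screen coordinates
    face_box: (x_min, y_min, x_max, y_max) region covering the face
    """
    x0, y0, x1, y1 = face_box
    hits = sum(1 for x, y in gaze_points if x0 <= x <= x1 and y0 <= y <= y1)
    return hits / len(gaze_points) if gaze_points else 0.0

# Hypothetical samples: 3 of 4 fall inside an illustrative face box.
samples = [(120, 200), (130, 210), (400, 50), (125, 205)]
print(fraction_on_face(samples, (100, 180, 160, 240)))  # 0.75
```

In practice the face box would move frame to frame (e.g. from an automatic face detector), but the per-sample in/out test stays the same.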
How this fits with other research
Wan et al. (2019) already showed that a 10-second lab clip, recorded with a high-end eye-tracker, can sort ASD from typically developing kids with high accuracy. Kim's team shows the same signature comes through a phone, so you can now chase that accuracy outside the clinic.
Falck-Ytter et al. (2012) and Bigham et al. (2013) mapped atypical gaze following in toddlers in the lab; the new data suggest those patterns persist in older kids and can be captured without leaving home.
Lesser et al. (2019) validated a cheap bedroom camera for sleep scoring. Kim's team repeats the playbook (swap the sensor, swap the target) while keeping the same goal: research-grade numbers from everyday hardware.
Why it matters
You no longer need grant money for eye-tracking rigs. Hand the family a study link, they open it on their phone, and you collect gaze data between dinner and bedtime. Use it to screen social attention, track intervention progress, or simply include families who live far from your lab.
Practice tip: text caregivers a 30-second video link; ask them to prop the phone, hit record, and send back the clip so you can check face-looking time before the next session.
02 At a glance
03 Original abstract
Atypical gaze patterns are a promising biomarker of autism spectrum disorder. To measure gaze accurately, however, it typically requires highly controlled studies in the laboratory using specialized equipment that is often expensive, thereby limiting the scalability of these approaches. Here we test whether a recently developed smartphone-based gaze estimation method could overcome such limitations and take advantage of the ubiquity of smartphones. As a proof-of-principle, we measured gaze while a small sample of well-assessed autistic participants and controls watched videos on a smartphone, both in the laboratory (with lab personnel) and in remote home settings (alone). We demonstrate that gaze data can be efficiently collected, in-home and longitudinally by participants themselves, with sufficiently high accuracy (gaze estimation error below 1° visual angle on average) for quantitative, feature-based analysis. Using this approach, we show that autistic individuals have reduced gaze time on human faces and longer gaze time on non-social features in the background, thereby reproducing established findings in autism using just smartphones and no additional hardware. Our approach provides a foundation for scaling future research with larger and more representative participant groups at vastly reduced cost, also enabling better inclusion of underserved communities.
Autism Research: Official Journal of the International Society for Autism Research, 2024 · doi:10.1002/aur.3140