Assessment & Research

Identifying Autism with Head Movement Features by Implementing Machine Learning Algorithms

Zhao et al. (2022) · Journal of Autism and Developmental Disorders, 2022
★ The Verdict

A two-minute webcam clip can flag ASD risk by measuring repetitive head loops and wide turns during chat.

✓ Read this if: you're a BCBA who screens or reassesses school-age clients in clinic or school settings.
✗ Skip if: you work solely with infants or non-speaking adults.

01 Research in Context

01

What this study did

The team filmed children while they chatted one-on-one with an adult for two minutes. Half had ASD, half were typically developing.

Free software called OpenFace 2.0 tracked every head tilt, nod, and turn in three dimensions. The computer then scored how repetitive each movement path was (stereotypy, measured with multiscale entropy) and how far and how often the head rotated.
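To make the pipeline concrete, here is a minimal sketch of the "how far and how often" half of that analysis. It assumes the standard OpenFace 2.0 per-frame CSV output with head-rotation columns pose_Rx, pose_Ry, pose_Rz (in radians); the ARPM formula below (summed frame-to-frame rotation divided by minutes) is an illustrative reading of the paper's "amount of rotation per minute," not its published definition.

```python
import csv
import math

def head_motion_summary(csv_path, fps=30.0):
    """Per-axis rotation range (RR) and amount of rotation per minute
    (ARPM) from an OpenFace 2.0 output CSV.

    Assumes the standard head-pose columns pose_Rx, pose_Ry, pose_Rz
    (radians). ARPM here is an illustrative guess at the measure:
    total frame-to-frame rotation divided by clip length in minutes.
    """
    axes = ("pose_Rx", "pose_Ry", "pose_Rz")
    series = {a: [] for a in axes}
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        # OpenFace pads header names with spaces in some versions
        header = [h.strip() for h in next(reader)]
        cols = {a: header.index(a) for a in axes}
        for row in reader:
            for a in axes:
                series[a].append(float(row[cols[a]]))
    minutes = len(series["pose_Rx"]) / fps / 60.0
    summary = {}
    for a, vals in series.items():
        rr = max(vals) - min(vals)  # rotation range over the clip
        travel = sum(abs(y - x) for x, y in zip(vals, vals[1:]))
        summary[a] = {"RR_deg": math.degrees(rr),
                      "ARPM_deg": math.degrees(travel) / minutes}
    return summary
```

Feed it the CSV that OpenFace writes for a two-minute clip and you get one RR and one ARPM number per rotation axis, the same two "extent of movement" quantities the abstract describes.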

02

What they found

Kids with ASD moved their heads in tighter, more repetitive loops. Their range of motion was also wider.

These movement features alone let the computer separate the ASD group from the TD group with high accuracy. Head stereotypy carried the signal better than eye-gaze data did: stereotypy, not how much the child looked at the partner, explained the extent of head movement.
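The stereotypy score behind that finding comes from multiscale entropy: sample entropy is computed on progressively coarse-grained copies of each rotation series, and lower entropy means more repetitive motion. A slow-but-readable sketch of the standard procedure follows; the parameter choices (m = 2, tolerance = 0.2 × SD) are common defaults, not necessarily the paper's.

```python
import math

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy of series x: -ln(A/B), where B counts matching
    template pairs of length m and A of length m+1 (Chebyshev distance
    within r_frac * SD). Lower values = more regular (stereotyped)."""
    n = len(x)
    mean = sum(x) / n
    tol = r_frac * (sum((v - mean) ** 2 for v in x) / n) ** 0.5

    def matches(length):
        tmpl = [x[i:i + length] for i in range(n - length + 1)]
        c = 0
        for i in range(len(tmpl)):
            for j in range(i + 1, len(tmpl)):
                if max(abs(p - q) for p, q in zip(tmpl[i], tmpl[j])) <= tol:
                    c += 1
        return c

    b, a = matches(m), matches(m + 1)
    return math.inf if a == 0 or b == 0 else -math.log(a / b)

def coarse_grain(x, scale):
    # non-overlapping window averages: the "multiscale" step
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def multiscale_entropy(x, scales=(1, 2, 3)):
    return [sample_entropy(coarse_grain(x, s)) for s in scales]
```

A perfectly repetitive signal scores near zero, while an irregular one scores higher, which is why the measure can separate tight, looping head motion from ordinary conversational movement.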

03

How this fits with other research

Zhao et al. (2023) used the same chat task but tracked eyes, not heads. They saw less face looking in ASD, yet the head paper shows more head motion. Together they paint a full picture: the child looks away more AND moves more.

Avni et al. (2020) also mined video for an ASD marker, but measured gaze idiosyncrasy during movies. Both studies prove cheap webcams can yield solid biomarkers without extra gear.

Kovarski et al. (2019) found faster eye movements (saccades) in ASD. Zhao's head-stereotypy result adds a second, slower motor signature that is easier to spot in real time.

04

Why it matters

You can run OpenFace 2.0 on any tablet recording. During intake or social-skills sessions, let the camera roll for two minutes. If the software shows high stereotypy scores or a wide rotation range, consider a fuller ASD evaluation. No markers, no stickers, just a normal conversation.

Free CEUs

Want CEUs on This Topic?

The ABA Clubhouse has 60+ free CEUs — live every Wednesday. Ethics, supervision & clinical topics.

Join Free →
→ Action — try this Monday

Record your next conversation session, run OpenFace 2.0, and note head-stereotypy scores above the 75th percentile for referral review.
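The percentile check above can be as simple as comparing a new session's score against your own caseload's baseline distribution. A sketch using Python's statistics module; the 75th-percentile cutoff echoes the suggestion above and is not a validated clinical threshold.

```python
import statistics

def flag_for_review(baseline_scores, new_score, pct=75):
    """True if new_score exceeds the pct-th percentile of a baseline
    list of head-stereotypy scores. Illustrative screening aid only;
    the cutoff is not a validated clinical threshold."""
    cutoff = statistics.quantiles(baseline_scores, n=100)[pct - 1]
    return new_score > cutoff
```

A flagged session is a prompt to look closer, not a diagnosis; pair it with your usual referral criteria.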

02 At a glance

Intervention: not applicable
Design: other
Population: autism spectrum disorder, neurotypical
Finding: positive

03 Original abstract

The present study implemented an objective head pose tracking technique, OpenFace 2.0, to quantify the three-dimensional head movement. Children with autism spectrum disorder (ASD) and typical development (TD) were engaged in a structured conversation with an interlocutress while wearing an eye tracker. We computed the head movement stereotypy with multiscale entropy analysis. In addition, the head rotation range (RR) and the amount of rotation per minute (ARPM) were calculated to quantify the extent of head movement. Results demonstrated that the ASD group had significantly higher level of movement stereotypy, RR and ARPM in all the three directions of head movement. Further analyses revealed that the extent of head movement could be significantly explained by movement stereotypy, but not by the amount of visual fixation to the interlocutress. These results demonstrated the atypical head movement dynamics in children with ASD during live interaction. It is proposed that head movement might potentially provide novel objective biomarkers of ASD. LAY SUMMARY: Our study used an objective tool to quantify head movement in children with autism. Results showed that children with autism had more stereotyped and greater head movement. We suggest that head movement tracking technique be widely used in autism research.

Journal of Autism and Developmental Disorders, 2022 · doi:10.1002/aur.2478