
Repetitive Thoughts and Repetitive Behaviors in Williams Syndrome.

Huston et al. (2022) · Journal of Autism and Developmental Disorders, 2022
★ The Verdict

Kids with high-functioning autism look away from the speaker's mouth during speech, so prompting mouth gaze may help them link sounds with lip movements.

✓ Read this if you're a BCBA running social skills or language groups for verbal kids with autism.
✗ Skip if you serve non-verbal clients or adults with ID.

01 Research in Context

01

What this study did

The researchers used eye tracking to record where kids looked while they listened to someone talk.

They compared kids with high-functioning autism (ages 8–19) to 25 typically developing peers.

Each child watched a split-screen video of two identical speakers; only one was in sync with the audio, and the in-sync side switched every few seconds.

02

What they found

Kids with autism spent much less time looking at the speaker's mouth.

Instead, they looked more at non-face regions of the screen.

Typical kids locked onto the mouth to match sounds with lip movements, and both groups looked at the eyes about equally.

03

How this fits with other research

Muth et al. (2014) found that kids with autism also miss eye-contact cues.

Together, these studies show a wider pattern: kids with autism skip key parts of the face.

Falck-Ytter et al. (2012) add that poorer gaze following is linked to lower communication scores.

Hartston et al. (2024) offer one explanation: autistic learners may build weaker mental representations of faces.

So mouth avoidance appears to be part of a broader face-processing difference.

04

Why it matters

When you teach language or social skills, prompt the child to look at your mouth.

Say "Watch my lips" before giving an instruction.

Pair the prompt with a small edible or token.

Over time, the child may start linking mouth movements with sounds on their own.

→ Action — try this Monday

Before giving a verbal instruction, point to your mouth and say "Look here"—then deliver the direction.

02 At a glance

Intervention: not applicable
Design: case-control
Population: autism spectrum disorder, neurotypical
Finding: negative
Magnitude: medium

03 Original abstract

Conversation requires integration of information from faces and voices to fully understand the speaker's message. To detect auditory-visual asynchrony of speech, listeners must integrate visual movements of the face, particularly the mouth, with auditory speech information. Individuals with autism spectrum disorder may be less successful at such multisensory integration, despite their demonstrated preference for looking at the mouth region of a speaker. We showed participants (individuals with and without high-functioning autism (HFA) aged 8-19) a split-screen video of two identical individuals speaking side by side. Only one of the speakers was in synchrony with the corresponding audio track and synchrony switched between the two speakers every few seconds. Participants were asked to watch the video without further instructions (implicit condition) or to specifically watch the in-synch speaker (explicit condition). We recorded which part of the screen and face their eyes targeted. Both groups looked at the in-synch video significantly more with explicit instructions. However, participants with HFA looked at the in-synch video less than typically developing (TD) peers and did not increase their gaze time as much as TD participants in the explicit task. Importantly, the HFA group looked significantly less at the mouth than their TD peers, and significantly more at non-face regions of the image. There were no between-group differences for eye-directed gaze. Overall, individuals with HFA spend less time looking at the crucially important mouth region of the face during auditory-visual speech integration, which is maladaptive gaze behavior for this type of task.

Journal of Autism and Developmental Disorders, 2022 · doi:10.1007/s10803-014-2075-0