Electrophysiological alterations in motor-auditory predictive coding in autism spectrum disorder.
Autistic adults do not turn down the brain response to their own button-press tones, showing a clear break in motor-auditory predictive coding.
Research in Context
What this study did
van Laarhoven et al. (2019) wired up two groups: autistic adults and neurotypical adults. Each person pressed a button that triggered a short tone. The team measured the brain’s N1 response, a quick electrical wave that shows the brain noticed the sound.
The idea was simple. When you cause a sound yourself, your brain usually turns the volume down on that sound. This dampening is called sensory attenuation, and it is a signature of predictive coding: the brain uses the motor plan to forecast the sound and mutes the expected input. The study asked: do autistic brains do the same?
What they found
Neurotypical brains showed the normal dip in N1 size for self-made tones. Autistic brains kept the same large N1, as if every tone came from outside. In plain words, they did not predict the sensory result of their own action.
This missing dampening points to a broken internal model: the brain is not using the motor plan to mute incoming sound.
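The study's core comparison can be sketched in a few lines: measure the N1 (roughly the most negative voltage 80-120 ms after the tone) in each condition and subtract. This is a minimal illustration only; the waveforms and amplitudes below are made up, not study data.

```python
import numpy as np

def n1_amplitude(erp, times, window=(0.08, 0.12)):
    """Most negative voltage in the N1 window (~80-120 ms after tone onset)."""
    mask = (times >= window[0]) & (times <= window[1])
    return erp[mask].min()

# Synthetic ERPs in microvolts: a Gaussian dip peaking ~100 ms post-tone.
times = np.linspace(-0.1, 0.4, 501)  # seconds relative to tone onset
n1_shape = -np.exp(-((times - 0.1) ** 2) / (2 * 0.015 ** 2))

erp_external = 6.0 * n1_shape   # full N1 to an externally replayed tone
erp_self     = 3.5 * n1_shape   # dampened N1 to a self-triggered tone

# Positive value = self-initiated N1 is smaller (less negative): the
# attenuation pattern the study found in neurotypical adults. In the
# ASD group this difference was near zero.
attenuation = n1_amplitude(erp_self, times) - n1_amplitude(erp_external, times)
```

In the real study the two conditions used acoustically identical tones at the same pace, so any N1 difference can be credited to self-initiation rather than the sound itself.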
How this fits with other research
Adams et al. (2024) ran a similar EEG setup and also found slower, weaker N1 waves in autistic adults. They linked the size of the N1 to real-life sensory and social scores, showing the lab marker matters outside the lab.
Zhao et al. (2024) looked like a contradiction at first. Their autistic group predicted musical notes just fine, suggesting predictive coding was intact. The twist: they tested music, not self-made sounds. Prediction can work in one domain and fail in another, so check the task before assuming a global deficit.
Qi et al. (2025) pulled ten language studies into one review. Kids with autism showed some word prediction but struggled when tone of voice mattered. Together with van Laarhoven et al.'s findings, the pattern is clear: autistic brains struggle when the cue comes from their own action or from subtle social prosody.
Why it matters
If a client does not dampen self-made sounds, they may hear every tap, keystroke, or burst of their own voice at full volume. This overload can feed avoidance and self-stimulatory behavior. You can probe this quickly in session: have the client press a button that makes a tone while you record a simple EEG or even watch for flinches. Then build interventions that pair predictable sounds with reward, teaching the brain to expect and mute the input. Keep your instructions clear and semantic, not prosodic, and remember that musical games might spare prediction skills even when motor-auditory tasks do not.
Add a simple cause-and-sound game: client taps a switch, hears a brief beep, and gets a token if they stay seated and quiet for three rounds.
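The contingency in that game is simple enough to state exactly: one token for each run of three consecutive calm rounds. The sketch below is a hypothetical illustration of that schedule, not a clinical protocol; the function name and round encoding are made up.

```python
def tokens_for_calm_rounds(round_calm, needed=3):
    """Count tokens earned across a session.

    round_calm: list of booleans, one per tap-and-beep round,
                True if the client stayed seated and quiet.
    A token is delivered each time `needed` consecutive calm
    rounds are completed, then the streak resets.
    """
    tokens = 0
    streak = 0
    for calm in round_calm:
        streak = streak + 1 if calm else 0
        if streak == needed:
            tokens += 1
            streak = 0
    return tokens
```

For example, a session of calm, noisy, calm, calm, calm earns one token, because the streak restarts after the noisy round.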
Original abstract
The amplitude of the auditory N1 component of the event-related potential (ERP) is typically attenuated for self-initiated sounds, compared to sounds with identical acoustic and temporal features that are triggered externally. This effect has been ascribed to internal forward models predicting the sensory consequences of one's own motor actions. The predictive coding account of autistic symptomatology states that individuals with autism spectrum disorder (ASD) have difficulties anticipating upcoming sensory stimulation due to a decreased ability to infer the probabilistic structure of their environment. Without precise internal forward prediction models to rely on, perception in ASD could be less affected by prior expectations and more driven by sensory input. Following this reasoning, one would expect diminished attenuation of the auditory N1 due to self-initiation in individuals with ASD. Here, we tested this hypothesis by comparing the neural response to self- versus externally-initiated tones between a group of individuals with ASD and a group of age matched neurotypical controls. ERPs evoked by tones initiated via button-presses were compared with ERPs evoked by the same tones replayed at identical pace. Significant N1 attenuation effects were only found in the TD group. Self-initiation of the tones did not attenuate the auditory N1 in the ASD group, indicating that they may be unable to anticipate the auditory sensory consequences of their own motor actions. These results show that individuals with ASD have alterations in sensory attenuation of self-initiated sounds, and support the notion of impaired predictive coding as a core deficit underlying autistic symptomatology. Autism Res 2019, 12: 589-599. © 2019 The Authors. Autism Research published by International Society for Autism Research published by Wiley Periodicals, Inc. 
LAY SUMMARY: Many individuals with ASD experience difficulties in processing sensory information (for example, increased sensitivity to sound). Here we show that these difficulties may be related to an inability to anticipate upcoming sensory stimulation. Our findings contribute to a better understanding of the neural mechanisms underlying the different sensory perception experienced by individuals with ASD.
Autism Research: Official Journal of the International Society for Autism Research, 2019