Contrasting Accounts of Early Speech Perception and Production
Echo a baby’s own sounds and add tangible reinforcers—automatic reinforcement drives early speech without brain modules.
Research in Context
What this study did
Schlinger (2023) wrote a theory paper. He asked, "How do babies learn to talk?"
He compared two answers. Cognitive science says babies have built-in modules. Skinner says babies hear their own sounds and repeat the ones that feel good.
The paper argues the second answer is simpler and testable.
What they found
The paper presents no new data; it is a conceptual analysis. It argues that automatic reinforcement is the missing piece in a parsimonious account of early speech.
When a baby's babbling sounds like the speech it hears around it, the sound itself is reinforcing, and adult echoes add social reinforcement on top. No hidden brain software is needed.
How this fits with other research
Neuringer et al. (1968) offered early support for the idea. They delivered toys contingent on talking and saw speech increase sharply in a quiet preschooler.
Hursh et al. (1974) went further. Imitation plus reinforcement created brand-new sentences in toddlers.
Layng et al. (2023) draw on the same Skinner framework but focus on older kids. They say abstract tacts and autoclitic frames build grammar. Together the four papers form one ladder: babies get sounds, preschoolers get words, big kids get grammar, all through reinforcement.
Why it matters
You can stop hunting for hidden modules. Notice the sound a baby just made and echo it; your echo is a reinforcer. Do the same with toys or play for late talkers. The mechanism is automatic, cheap, and already in your toolbox.
Pick one babble the baby just made, repeat it back immediately, then hand over a small preferred toy, and track whether that sound increases over the next ten minutes.
Original abstract
Language researchers have historically either dismissed or ignored completely behavioral accounts of language acquisition while at the same time acknowledging the important role of experience in language learning. Many language researchers have also moved away from theories based on an innate generative universal grammar and promoted experience-dependent and usage-based theories of language. These theories suggest that hearing and using language in its context is critical for learning language. However, rather than appealing to empirically derived principles to explain the learning, these theories appeal to inferred cognitive mechanisms. In this article, I describe a usage-based theory of language acquisition as a recent example of a more general cognitive linguistic theory and note both logical and methodological problems. I then present a behavior-analytic theory of speech perception and production and contrast it with cognitive theories. Even though some researchers acknowledge the role of social feedback (they rarely call it reinforcement) in vocal learning, they omit the important role played by automatic reinforcement. I conclude by describing automatic reinforcement as the missing link in a parsimonious account of vocal development in human infants and making comparisons to vocal development in songbirds.
Perspectives on Behavior Science, 2023 · doi:10.1007/s40614-023-00371-4