Assessment & Research

Categories in the pigeon brain: A reverse engineering approach.

Koenen et al. (2016) · Journal of the Experimental Analysis of Behavior
★ The Verdict

Pigeon brains sort photos from stripes on their own, so some stimulus classes may not need heavy teaching.

✓ Read this if: you're a BCBA who studies stimulus control or builds visual-discrimination programs.
✗ Skip if: you're a clinician looking for direct child-intervention data.

01Research in Context

01

What this study did

Scientists recorded activity from a higher visual area of the pigeon brain (the nidopallium frontolaterale, NFL) while the birds looked at pictures.

Some were pictorial stimuli that varied in color and shape. Others showed only black-and-white stripe patterns (gratings) that varied in how fine and how strong the stripes were.

The team asked: do brain cells sort these two types on their own, with no training?

02

What they found

The brain signals fell into two clear groups: one for photos, one for stripes.

The birds’ pecking followed the same split, though less sharply than the neural data, even though they had never been rewarded for choosing either type.

Neural data alone predicted how the birds would act.
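The analysis behind this finding is a representational dissimilarity matrix (RDM): a table of how differently each pair of stimuli is represented, in which category members should cluster together. A minimal sketch with simulated data (the category templates, neuron count, and noise levels below are illustrative assumptions, not the study's recordings):

```python
# Sketch of an RDM analysis on simulated neural responses.
# Assumption: each category has a shared response "template" across
# 20 hypothetical neurons, and individual stimuli add noise to it.
import numpy as np

rng = np.random.default_rng(0)

template_pictures = rng.normal(size=20)  # hypothetical category patterns
template_gratings = rng.normal(size=20)

# 4 picture stimuli and 4 grating stimuli, each a noisy copy of its template.
pictures = template_pictures + rng.normal(scale=0.5, size=(4, 20))
gratings = template_gratings + rng.normal(scale=0.5, size=(4, 20))
responses = np.vstack([pictures, gratings])  # 8 stimuli x 20 neurons

# RDM: pairwise dissimilarity = 1 - Pearson correlation between
# the response patterns for each pair of stimuli.
rdm = 1 - np.corrcoef(responses)  # 8 x 8 matrix

# Categorical clustering: same-category pairs should be more similar
# (lower dissimilarity) than cross-category pairs.
within = np.mean([rdm[:4, :4].mean(), rdm[4:, 4:].mean()])
between = rdm[:4, 4:].mean()
print(f"within-category dissimilarity:  {within:.2f}")
print(f"between-category dissimilarity: {between:.2f}")
```

With this setup, within-category dissimilarity comes out well below between-category dissimilarity, which is the signature of categorical clustering the authors looked for in the NFL recordings.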

03

How this fits with other research

Madsen et al. (1968) first showed that pigeons can learn to tell shapes apart when we reward them. Koenen et al. (2016) go a step further: the birds already group sights before any reward is given.

Corrigan et al. (1998) showed that pigeons can sort moving objects into concepts like “pecking” versus “walking.” The new work shows the same kind of sorting happens in the brain for still pictures versus patterns.

Williams (1974) used simple peck tests to reveal hidden vision effects. Koenen et al. (2016) swap pecks for electrodes and still uncover hidden brain rules, showing that both old and new tools can map pigeon perception.

04

Why it matters

You can now think of categories as pre-built hardware, not just taught skills. When a client struggles to tell faces from objects, check if the task matches natural brain groups before adding extra rewards. Try mixing photos and patterns in early discrimination probes; the neural split may give you a head start without heavy training.

→ Action — try this Monday

Test untrained photo-vs-pattern probes first; let natural brain bias do part of the work.

02At a glance

Intervention
not applicable
Design
other
Population
not specified
Finding
positive

03Original abstract

Pigeons are well known for their visual capabilities as well as their ability to categorize visual stimuli at both the basic and superordinate level. We adopt a reverse engineering approach to study categorization learning: Instead of training pigeons on predefined categories, we simply present stimuli and analyze neural output in search of categorical clustering on a solely neural level. We presented artificial stimuli, pictorial and grating stimuli, to pigeons without the need of any differential behavioral responding while recording from the nidopallium frontolaterale (NFL), a higher visual area in the avian brain. The pictorial stimuli differed in color and shape; the gratings differed in spatial frequency and amplitude. We computed representational dissimilarity matrices to reveal categorical clustering based on both neural data and pecking behavior. Based on neural output of the NFL, pictorial and grating stimuli were differentially represented in the brain. Pecking behavior showed a similar pattern, but to a lesser extent. A further subclustering within pictorial stimuli according to color and shape, and within gratings according to frequency and amplitude, was not present. Our study gives proof-of-concept that this reverse engineering approach, namely reading out categorical information from neural data, can be quite helpful in understanding the neural underpinnings of categorization learning.

Journal of the Experimental Analysis of Behavior, 2016 · doi:10.1002/jeab.179