Stimulus control of behavioral history.
Stimulus-linked reinforcement history can keep driving response rates long after the contingencies change.
Research in Context
What this study did
Researchers worked with pigeons to see whether colors from past training could steer later response rates. Each bird first pecked under a red light on one reinforcement schedule and under a green light on another.
Later the birds faced identical schedules under both colors. The team watched whether the old color pairings still guided pecking speed even though the payoffs no longer differed.
What they found
The pigeons kept the old pace tied to each color for dozens of sessions. Red still evoked fast pecking and green slow pecking, even though the payoff rates were now the same.
How this fits with other research
Matson et al. (1994) ran a similar test with progressive-ratio schedules. They also saw history effects, but the differences faded quickly. The contrast suggests that history lasts longer when distinct stimuli such as colors signal the old rules.
Adkins et al. (1997) pushed the idea further. They gave birds DRH or DRL histories, waited six months, then tested on VI schedules. The old high and low rates returned, showing that these stimulus-bound histories can survive even longer breaks.
Cowie et al. (2016) pull these findings together. Their review argues that what looks like reinforcer control is often stimulus control. The pigeon color cues fit that account: the colors, not the grain, directed the lasting changes.
Why it matters
When a client stalls or rushes under a new program, ask what stimuli were present during past reinforcement. Swap room layout, therapist, or token board to see if behavior tracks those old cues instead of the new contingencies.
Run a quick probe: switch the teaching stimulus (e.g., change card color or room) and measure whether response rate shifts with it.
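A probe like this reduces to a rate comparison across stimulus conditions. The sketch below is purely illustrative, with hypothetical session data and a made-up `rate_shift` helper; a rate that tracks the swapped stimulus, rather than staying put, points to stimulus-bound history control.

```python
# Hypothetical probe data: responses per minute under the original vs. swapped
# stimulus. All names and numbers here are invented for illustration.

def mean(xs):
    return sum(xs) / len(xs)

def rate_shift(original_stimulus_rates, swapped_stimulus_rates):
    """Return the absolute and relative drop in mean response rate."""
    old_mean = mean(original_stimulus_rates)
    new_mean = mean(swapped_stimulus_rates)
    return old_mean - new_mean, (old_mean - new_mean) / old_mean

# Example: rates recorded across five probe sessions per condition.
red_card_sessions = [42, 45, 40, 44, 43]   # stimulus present during past reinforcement
blue_card_sessions = [21, 24, 20, 22, 23]  # swapped stimulus

absolute, relative = rate_shift(red_card_sessions, blue_card_sessions)
print(f"Rate drop: {absolute:.1f} responses/min ({relative:.0%})")
# → Rate drop: 20.8 responses/min (49%)
```

A large relative drop that follows the stimulus, as in this fabricated example, would suggest probing further before attributing the rate to the current contingencies.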
Original abstract
Pigeons were exposed to two different reinforcement schedules under different stimulus conditions in each of two daily sessions separated by 6 hr (Experiments 1 and 2) or in a single session (Experiment 3). Following this, either a fixed-interval (Experiment 1) or a variable-interval schedule (Experiments 2 and 3) was effected in both stimulus conditions. In the first two experiments, exposure to fixed-ratio or differential-reinforcement-of-low-rate schedules led to response-rate, but not pattern, differences in subsequent performance on fixed- or variable-interval schedules that persisted for up to 60 sessions. The effects of reinforcement-schedule history on fixed-interval schedule performance generally were more persistent. In Experiment 3, a history of high and low response rates in different components of a multiple schedule resulted in subsequent response-rate differences under identical variable-interval schedules. Higher response rates initially occurred in the component previously correlated with high response rates. For 3 of 4 subjects, the differences persisted for 20 or more sessions. Previous demonstrations of behavioral history effects have been confined largely to between-subject comparisons. By contrast, the present results demonstrate strong behavioral effects of schedule histories under stimulus control within individual subjects.
Journal of the Experimental Analysis of Behavior, 1992 · doi:10.1901/jeab.1992.57-5