An analysis of reinforcement history effects.
A prior reinforcement rate alone can speed up or slow down performance on a new fixed-interval (FI) schedule, so always check the client’s schedule history.
01 Research in Context
What this study did
The researchers worked with four pigeons in an operant chamber.
First, the birds pecked under two stimulus conditions. One delivered grain about every 30 s; the other about every 120 s. Critically, the schedules were arranged so that peck rates in the two conditions stayed nearly identical.
Next, the same fixed-interval schedule was introduced in both conditions. The researchers counted pecks to see whether the earlier grain rate still mattered.
What they found
Birds pecked faster on the new fixed-interval schedule in the condition with the lean (120-s) history than in the condition with the rich (30-s) history, and the difference was most pronounced at shorter fixed-interval values.
Because peck rates had been matched during baseline, the difference is attributable to the old grain rate, not the old peck rate.
How this fits with other research
Dove et al. (1974) reported a similar effect earlier: pigeons with a demanding fixed-ratio history later responded at higher rates under response-independent fixed-time schedules.
Davison et al. (1991) found a matching result in rats. Animals with a history of reinforced long interresponse times kept responding at low rates on FI schedules, even when more water was available for responding faster.
Fahmie et al. (2013) extended the idea to choice. Birds preferred the key that had previously carried longer changeover delays, even when both keys now paid off equally. History effects, then, can guide preference, not just response rate.
Why it matters
Your client’s past schedule can quietly push their current response rate up or down. Before you judge a new program as “too slow” or “too fast,” probe for earlier reinforcement rates in home, school, or previous clinics. Start sessions with a brief assessment phase or alternate easy tasks to wash out old history effects.
Run a 2-minute probe at the start of the session to see the response rate before you count it as baseline.
02 At a glance
03 Original abstract
Four pigeons were exposed to two tandem variable-interval differential-reinforcement-of-low-rate schedules under different stimulus conditions. The values of the tandem schedules were adjusted so that reinforcement rates in one stimulus condition were higher than those in the other, even though response rates in the two conditions were nearly identical. Following this, a fixed-interval schedule of either shorter or longer values than, or equal to the baseline schedule, was introduced in the two stimulus conditions respectively. Response rates during those fixed-interval schedules typically were higher in the presence of the stimuli previously correlated with the lower reinforcement rates than were those in the presence of the stimuli previously correlated with the higher reinforcement rates. Such effects of the reinforcement history were most prominent when the value of the fixed-interval schedule was shorter. The results are consistent with both incentive contrast and response strength conceptualizations of related effects. They also suggest methods for disentangling the effects of reinforcement rate on subsequent responding, from the response rate with which it is confounded in many conventional schedules of reinforcement.
Journal of the Experimental Analysis of Behavior, 2006 · doi:10.1901/jeab.2006.75-05