Rate of conditioned reinforcement affects observing rate but not resistance to change.
Frequent conditioned reinforcers boost how often a response happens but not how well it survives disruption.
01 Research in Context
What this study did
The team worked with pigeons in a lab. The birds could peck a key to briefly turn on a light that signaled whether food was available. That light acted as a conditioned reinforcer. The team varied how often the light came on to see whether more lights meant stronger behavior.
What they found
More lights did make the birds observe more. But the extra lights did not make the behavior tougher. When observing was disrupted (by free food, presession feeding, or extinction), the birds slowed down by about the same amount in both conditions. Sometimes the birds on lean schedules actually held out longer.
How this fits with other research
Kohlenberg (1973) showed that birds needed the signal to stay on the whole time. Nevin et al. (2005) used brief flashes and still got more observing. The two studies differ on signal length, not on whether signals work. Buskist et al. (1988) found that brief signals keep pecking going through short delays. Nevin et al. (2005) adds that brief signals boost rate but not strength. Together they say: signals help, but more signals do not make behavior bullet-proof.
Why it matters
You can use praise or tokens more often to raise response rate. But do not count on frequency alone to protect the skill from change. Check whether the behavior holds up when you fade the extra praise.
Count the behavior under rich praise, then briefly withhold praise and count again to test true strength.
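That check amounts to a simple proportion-of-baseline calculation, which is also how resistance to change is typically indexed in this literature. A minimal sketch in Python (the function name and the session counts are illustrative, not from the paper):

```python
def proportion_of_baseline(baseline_counts, disruption_counts):
    """Resistance-to-change index: responding during disruption
    as a fraction of responding at baseline (1.0 = fully held up)."""
    baseline = sum(baseline_counts) / len(baseline_counts)
    disrupted = sum(disruption_counts) / len(disruption_counts)
    return disrupted / baseline

# Hypothetical rates of the target skill (responses per minute):
rich_praise = [12, 14, 13]      # baseline sessions with frequent praise
praise_withheld = [9, 8, 10]    # brief test sessions, praise withheld

print(round(proportion_of_baseline(rich_praise, praise_withheld), 2))  # 0.69
```

A value near 1.0 means the skill held up; a value well below 1.0 means the high rate under rich praise overstated the behavior's real strength.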
02 At a glance
03 Original abstract
The effects of rate of conditioned reinforcement on the resistance to change of operant behavior have not been examined. In addition, the effects of rate of conditioned reinforcement on the rate of observing have not been adequately examined. In two experiments, a multiple schedule of observing-response procedures was used to examine the effects of rate of conditioned reinforcement on observing rates and resistance to change. In a rich component, observing responses produced a higher frequency of stimuli correlated with alternating periods of random-interval schedule primary reinforcement or extinction. In a lean component, observing responses produced similar schedule-correlated stimuli but at a lower frequency. The rate of primary reinforcement in both components was the same. In Experiment 1, a 4:1 ratio of stimulus production was arranged by the rich and lean components. In Experiment 2, the ratio of stimulus production rates was increased to 6:1. In both experiments, observing rates were higher in the rich component than in the lean component. Disruptions in observing produced by presession feeding, extinction of observing responses, and response-independent food deliveries during intercomponent intervals usually were similar in the rich and lean components. When differences in resistance to change did occur, observing tended to be more resistant to change in the lean component. If resistance to change is accepted as a more appropriate measure of response strength than absolute response rates, then the present results provide no evidence that higher rates of stimuli generally considered to function as conditioned reinforcers engender greater response strength.
Journal of the Experimental Analysis of Behavior, 2005 · doi:10.1901/jeab.2005.83-04