Intercurrent and reinforced behavior under multiple spaced-responding schedules.
Long DRL schedules cut target responses but can spark new, untrained behaviors such as excessive movement.
01 Research in Context
What this study did
Four lab rats lived in a box with a lever, a water spout, and a running wheel.
The box switched every hour between four rules: wait at least 10 s, 20 s, or 60 s between presses, or a period in which no presses paid off at all (extinction).
The team counted lever presses, water licks, wheel turns, and earned pellets across many days.
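The "wait before pressing" rule is a DRL (differential reinforcement of low rate) schedule: a press earns food only if the time since the previous press, the interresponse time (IRT), meets the schedule value. A minimal sketch of that rule (our illustration, not the authors' apparatus code):

```python
# Minimal DRL check (illustrative sketch, not from the paper):
# a press is reinforced only when the interresponse time (IRT)
# since the previous press meets or exceeds the schedule value.

def reinforced_presses(press_times, min_irt):
    """Return the press times (seconds) whose IRT >= min_irt."""
    earned = []
    last = None
    for t in press_times:
        if last is not None and t - last >= min_irt:
            earned.append(t)
        last = t
    return earned

# The same press pattern earns less as the wait requirement grows:
presses = [0, 5, 30, 45, 120]
print(reinforced_presses(presses, 20))  # [30, 120]
print(reinforced_presses(presses, 60))  # [120]
```

Running the same press sequence through DRL 20-s and DRL 60-s shows why longer waits cut the reinforcement rate even when the animal keeps pressing.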
What they found
When the wait rule grew longer, rats pressed less and drank less.
Wheel running jumped up during the 60-s wait and extinction parts.
Extra exercise appeared only when reinforcement was scarce—classic schedule-induced adjunctive behavior.
How this fits with other research
Hoffman et al. (1963) showed that old fear cues fade with repeated extinction tests. Hart et al. (1974) showed that extinction also produces new, untrained responses such as wheel running.
Madden et al. (2003) used a similar multi-part schedule—VI, DRO, extinction—to bring cocaine taking under stimulus control. Their rats, like these 1974 rats, showed that the schedule components, not the reinforcer type, shape the response pattern.
Einfeld et al. (1995) found that pigeons picked an immediate payoff even when it hurt long-term gains. The 1974 rats did the same: they ran now rather than waiting for later food, a sign of impulsive choice under lean schedules.
Why it matters
If you run DRL to slow a behavior, watch for new problems popping up elsewhere. A child who stops hand mouthing may start pacing or tapping. Track all movements, not just the target one, and plan adjunctive outlets before they surprise you.
Add a second data sheet that logs any new movement, not just the target response, when you stretch the DRL interval.
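That second data sheet amounts to tallying every movement class per observation session, so adjunctive behaviors show up alongside the target. A hypothetical sketch (the behavior names are placeholders, not from the study):

```python
# Hypothetical second "data sheet": count every observed movement
# class per session, not just the target response, so new adjunctive
# behaviors (pacing, tapping, ...) are visible as the DRL stretches.

from collections import Counter

def tally_session(observations):
    """observations: list of (seconds_into_session, behavior) tuples.
    Returns a Counter of how often each behavior class occurred."""
    return Counter(behavior for _, behavior in observations)

obs = [(3, 'target'), (8, 'pacing'), (12, 'target'),
       (15, 'tapping'), (21, 'pacing')]
print(tally_session(obs))
```

Comparing these tallies across sessions at different DRL values makes a rise in non-target movement visible before it becomes a clinical surprise.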
02 At a glance
03 Original abstract
Lever pressing in rats was reinforced with food under a multiple spaced-responding schedule. A lever, food cup, and drinking tube were mounted in a running wheel so that lever pressing, running, and licking could be recorded. Running and licking had no scheduled consequences. Lever pressing was reinforced under a multiple schedule with three spaced-responding components and an extinction component. Each component was associated with a different auditory stimulus. Spaced-responding components reinforced only lever presses terminating interresponse times equal to or greater than 10, 20, or 60 sec, respectively. Rates of lever pressing, reinforcement, and licking all decreased as schedule parameter increased. Efficiency of spaced responding, as measured by reinforcements per response, also decreased. Rate of wheel running either increased or increased and then decreased with increasing schedule parameter. Individual running rates differed substantially. Neither licking nor running rate correlated with individual differences in efficiency. Analysis of conditional probabilities among the several response classes showed that, as the schedule requirement increased, the probability of running after a lever press increased and the probability of licking after a lever press decreased. After reinforcement, one subject always pressed the lever next. In the other subjects, the conditional probability of lever pressing, given reinforcement, increased while the probability of licking, given reinforcement, decreased with increasing schedule requirement. Results are discussed in relation to the concepts of schedule-induced and mediating behavior.
Journal of the Experimental Analysis of Behavior, 1974 · doi:10.1901/jeab.1974.21-445
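The abstract's conditional-probability analysis — e.g., the probability of running, given that the last response was a lever press — can be estimated from the recorded event stream by counting transitions between response classes. A minimal reconstruction (our sketch, not the authors' analysis code):

```python
# Sketch of the conditional-probability analysis described in the
# abstract (our reconstruction, not the authors' code): estimate
# P(next response class | current response class) from an event
# sequence by counting adjacent transitions.

from collections import Counter, defaultdict

def conditional_probs(events):
    """events: ordered list of response labels, e.g. ['press', 'run'].
    Returns {a: {b: P(next == b | current == a)}}."""
    pair_counts = defaultdict(Counter)
    for a, b in zip(events, events[1:]):
        pair_counts[a][b] += 1
    return {a: {b: n / sum(counts.values())
                for b, n in counts.items()}
            for a, counts in pair_counts.items()}

# Toy sequence: after a press, running follows twice, licking once.
seq = ['press', 'run', 'press', 'lick', 'press', 'run']
probs = conditional_probs(seq)
print(probs['press'])  # run ≈ 0.67, lick ≈ 0.33
```

Applied to the real press/lick/run/reinforcement record, this is the kind of table that showed running becoming the most likely event after a lever press as the schedule requirement grew.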