Variable-ratio versus variable-interval schedules: response rate, resistance to change, and preference.
VR schedules give you speed, but VI schedules give you staying power when reinforcement gets spotty, and they are what learners prefer when food rates are matched.
Research in Context
What this study did
The team compared two classic schedules: variable-ratio (VR) and variable-interval (VI).
Pigeons pecked for food under each schedule while the researchers kept the overall rate of food the same.
Later they tested how hard the birds would keep working when free food appeared between components, when food stopped altogether (extinction), and when the birds were fed before the session (prefeeding).
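The mechanical difference between the two schedules is easy to see in a short simulation. Below is a minimal Python sketch using random-ratio and random-interval approximations (as in Experiment 1); all parameter values here are made up for illustration, not taken from the study. On VR, responding twice as fast roughly doubles the food earned; on VI, the clock caps the food rate no matter how fast the bird pecks.

```python
import random

def vr_reinforcers(n_responses, mean_ratio, rng):
    """Random-ratio approximation of VR: each response is reinforced
    with probability 1/mean_ratio, so food rate tracks response rate."""
    return sum(rng.random() < 1.0 / mean_ratio for _ in range(n_responses))

def vi_reinforcers(session_s, resp_per_s, mean_interval_s, rng):
    """Random-interval approximation of VI: reinforcement 'sets up' at
    random times and the next response collects it, so food rate is
    capped near 1/mean_interval_s regardless of response rate."""
    reinforcers = 0
    t = 0.0
    armed = rng.expovariate(1.0 / mean_interval_s)  # first set-up time
    while t < session_s:
        t += 1.0 / resp_per_s          # time of the next response
        if t >= armed:                 # a set-up reinforcer is waiting
            reinforcers += 1
            armed = t + rng.expovariate(1.0 / mean_interval_s)
    return reinforcers

rng = random.Random(1)
session = 600                    # a 10-minute session, in seconds
for rate in (1.0, 2.0):          # responses per second: slow vs twice as fast
    vr = vr_reinforcers(int(session * rate), mean_ratio=60, rng=rng)
    vi = vi_reinforcers(session, rate, mean_interval_s=60, rng=rng)
    print(f"rate={rate}/s  VR food={vr}  VI food={vi}")
```

Doubling the response rate roughly doubles the VR food count but leaves the VI count near the programmed one-per-minute ceiling, which is why VR favors speed and why equating obtained reinforcer rates, as the study did, takes careful arrangement.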
What they found
VR birds pecked faster, but they slowed down more when free food showed up.
VI birds pecked slower, yet they kept going longer during the tough tests.
When the birds could choose and food rates were similar, they preferred the VI schedule, the slower but more durable one.
How this fits with other research
Clark et al. (1977) saw the same speed boost from VR twenty-four years earlier, so the basic rate difference is solid.
Staats et al. (1964) found the same pattern with children reading words aloud—VR made them read faster than VI, showing the rule crosses species and tasks.
Arantes et al. (2012) looked at resistance plus choice too, but they changed the response pattern instead of the schedule. Both studies point the same way: resistance to change and preference depend on more than reinforcer rate alone, and high-rate responding tends to break down faster when conditions worsen.
Why it matters
If you want quick, energetic responding—like rapid table-touch drills—use VR, but know the behavior may fade fast when reinforcement dips. If you need steady, durable responding—like waiting calmly during group time—lean toward VI or mix in VI components. Check both rate and staying power; fast today does not always mean strong tomorrow.
Count responses per minute for one target behavior, then briefly probe it after giving free tokens; note whether high-rate VR responding drops fastest.
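That probe reduces to simple arithmetic: response rate is count divided by minutes, and resistance can be summarized as probe rate divided by baseline rate. A minimal sketch with hypothetical numbers (the function names and session values are mine, not the study's):

```python
def per_minute(count, minutes):
    """Responses per minute for one observation period."""
    return count / minutes

def proportion_of_baseline(baseline_rpm, probe_rpm):
    """Probe rate as a fraction of baseline; values nearer 1.0
    mean more resistance to change (behavioral-momentum style)."""
    return probe_rpm / baseline_rpm

# hypothetical session numbers, for illustration only
baseline = per_minute(120, 10)   # 120 responses in 10 min -> 12 per min
probe = per_minute(45, 5)        # 45 responses in a 5-min free-token probe -> 9 per min
print(round(proportion_of_baseline(baseline, probe), 2))  # 0.75
```

Comparing this proportion across a VR-trained and a VI-trained behavior, rather than comparing raw rates, is what shows which one held up better.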
Original abstract
Two experiments asked whether resistance to change depended on variable-ratio as opposed to variable-interval contingencies of reinforcement and the different response rates they establish. In Experiment 1, pigeons were trained on multiple random-ratio random-interval schedules with equated reinforcer rates. Baseline response rates were disrupted by intercomponent food, extinction, and prefeeding. Resistance to change relative to baseline was greater in the interval component, and the difference was correlated with the extent to which baseline response rates were higher in the ratio component. In Experiment 2, pigeons were trained on multiple variable-ratio variable-interval schedules in one half of each session and on concurrent chains in the other half in which the terminal links corresponded to the multiple-schedule components. The schedules were varied over six conditions, including two with equated reinforcer rates. In concurrent chains, preference strongly overmatched the ratio of obtained reinforcer rates. In multiple schedules, relative resistance to response-independent food during intercomponent intervals, extinction, and intercomponent food plus extinction depended on the ratio of obtained reinforcer rates but was less sensitive than was preference. When reinforcer rates were similar, both preference and relative resistance were greater for the variable-interval schedule, and the differences were correlated with the extent to which baseline response rates were higher on the variable-ratio schedule, confirming the results of Experiment 1. These results demonstrate that resistance to change and preference depend in part on response rate as well as obtained reinforcer rate, and challenge the independence of resistance to change and preference with respect to response rate proposed by behavioral momentum theory.
Journal of the Experimental Analysis of Behavior, 2001 · doi:10.1901/jeab.2001.76-43