Changeover ratio effects on concurrent variable-interval performance.
Making kids "pay" to switch tasks pushes them to stay put, even when the other option pays better.
Research in Context
What this study did
The team ran two VI schedules side by side. Rats pressed a bar for food and could switch between the schedules by pulling a chain.
Each switch cost a fixed number of chain pulls (the changeover ratio). The cost rose across conditions.
The goal: see whether a higher switching cost changes how the rats split their responding.
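The procedure can be sketched as a toy simulation. This is an illustrative model with assumed parameters (time-step arming probabilities, a fixed switch probability), not the paper's actual apparatus or analysis; it only shows why changeover responses make frequent switching expensive.

```python
import random

def simulate(vi_mean1, vi_mean2, co_ratio, switch_prob, steps=20000, seed=1):
    """Toy model of two concurrent VI schedules (hypothetical parameters).
    Each schedule 'arms' a reinforcer with probability 1/vi_mean per time
    step and holds it until collected. Switching sides costs `co_ratio`
    extra responses that are never reinforced (the changeover ratio)."""
    rng = random.Random(seed)
    armed = [False, False]
    side = 0
    rewards = [0, 0]
    responses = [0, 0]
    for _ in range(steps):
        # schedules run in real time, independent of where the subject is
        for k, vi in enumerate((vi_mean1, vi_mean2)):
            if not armed[k] and rng.random() < 1.0 / vi:
                armed[k] = True
        if rng.random() < switch_prob:
            side = 1 - side
            responses[side] += co_ratio  # changeover responses, earn nothing
        responses[side] += 1             # one response on the current side
        if armed[side]:                  # collect any armed reinforcer
            armed[side] = False
            rewards[side] += 1
    return rewards, responses

if __name__ == "__main__":
    # same switching rate, rising changeover cost: efficiency falls
    for co in (0, 2, 10):
        rewards, responses = simulate(60, 60, co, switch_prob=0.05)
        eff = sum(rewards) / sum(responses)
        print(f"changeover ratio {co:2d}: reinforcers per response = {eff:.4f}")
```

With the cost held at the changeover, a subject that keeps switching at the same rate collects the same reinforcers but emits many more responses, which is the pressure toward staying put that the study measured.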
What they found
When the switching cost grew, rats switched less.
They stayed longer on one schedule and responded on it faster.
Oddly, uneven switching costs bent preference away from the payoffs: rats favored the schedule that cost more to switch into, and in one condition they even preferred the leaner VI 2-min schedule over a richer VI 1-min one. Reinforcer rate alone did not rule choice.
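That last point is usually stated against the matching law. In its generalized form (a standard result from the choice literature, not derived in this paper), response ratios track reinforcer ratios as:

B1 / B2 = b (R1 / R2)^a

where B1 and B2 are response rates on the two schedules, R1 and R2 are obtained reinforcer rates, a is sensitivity, and b is bias. Strict matching is a = b = 1; changeover-cost effects show up as bias (b ≠ 1) and undermatching (a < 1) that reinforcer rates alone cannot explain.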
How this fits with other research
Jones et al. (1998) later showed that even after a switch, animals need time to "see" which schedule is richer. That memory fades in seconds.
Hunter et al. (1985) built a mathematical model: after reinforcer ratios shift, response ratios need about five sessions to settle. The study summarized here (White, 1979) gave the first hint that changeover requirements, not just rates, steer that drift.
Wearden et al. (1983) moved the two response keys farther apart. Distance acted like extra changeover responses: less switching, stronger bias. Together the papers say the same thing: any cost to move, paid in responses or in travel, bends choice away from matching.
Why it matters
Your clients face changeover costs too: walking to a different table, putting down a toy, pressing "next screen." If you want balanced responding across two tasks, keep the switch easy. Lower the response cost, or learners will camp on the richer side just like the rats here.
One practical move: cut extra steps between two stations. Place materials shoulder width apart and skip token requirements for the move.
Original abstract
Rats' bar-pressing was maintained by concurrent variable-interval schedules of reinforcement. A fixed-ratio of pulls on a chain (the changeover ratio) was required for switching between schedules. The first experiment employed equal variable-interval schedules and symmetrical changeover ratios. Increasing these ratios resulted in a decrease in the rate of switching between schedules and an increase in local response rate. In the second experiment, a range of asymmetrical changeover ratios was used with equal variable-interval schedules, and a preference was found for the schedule associated with the larger switching-into ratio. Both the distributions of responses and time between the two schedules deviated from those expected on the basis of obtained reinforcers. In the third experiment, the switching-out-of ratio was dependent on the amount of time spent in a variable-interval 2-minute schedule; a constant ratio permitted switching out of the alternative variable-interval 1-minute schedule. A strong preference was shown for the variable-interval 2-minute schedule. The fourth experiment used equal variable-interval schedules; one changeover ratio was varied while the second remained constant. The results failed to show systematic differences in local response rates immediately after a changeover.
Journal of the Experimental Analysis of Behavior, 1979 · doi:10.1901/jeab.1979.31-239