ABA Fundamentals

Choice in a variable environment: effects of blackout duration and extinction between components.

Davison et al. (2002) · Journal of the Experimental Analysis of Behavior, 2002
★ The Verdict

A 60-second blackout between schedule components largely erases leftover preference, giving you a much cleaner slate.

✓ Read this if you're a BCBA who uses rapidly changing reinforcement schedules or concurrent teaching arrangements.
✗ Skip if you're a clinician working on single-operant skill acquisition with no schedule switching.

01 Research in Context

01 What this study did

Davison et al. (2002) asked a simple question: how long should the pause between two reinforcement schedules be? They trained pigeons on two keys whose payoff ratios (ranging from 27:1 to 1:27) changed every few minutes, with no signal marking the switch.

In Experiment 1 they separated the schedule components with dark pauses (blackouts) ranging from 1 to 120 seconds. In Experiment 2 they compared a 60-second blackout with 60 seconds of unsignaled extinction: lights on, but no reinforcement.

02 What they found

Longer blackouts scrubbed away the birds' earlier preference. After 60 seconds in the dark, choice had almost reset. Sixty seconds of extinction did much the same job, although in both cases a faint residual bias still lingered.

With only a brief pause, the birds kept favoring the key that had just paid better. The carry-over shrank steadily as the blackout lengthened, and by one minute it had nearly vanished.
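Preference in this literature is typically quantified as a log response ratio under the generalized matching law (Baum, 1974) — a standard formulation in this research tradition, sketched here rather than taken from the summary itself:

```latex
% Generalized matching law: preference expressed as a log response ratio
% B_1, B_2 = responses on the two keys; R_1, R_2 = reinforcers obtained
% a = sensitivity to the reinforcer ratio; \log b = bias
\log \frac{B_1}{B_2} = a \, \log \frac{R_1}{R_2} + \log b
```

In these terms, carry-over shows up as a residual bias term inherited from the previous component's reinforcer ratio, and a "reset" means the log response ratio returns toward zero (indifference) before the new contingency takes hold.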

03 How this fits with other research

de Rose (1986) and Ginsburg et al. (1971) showed the opposite trick: longer extinction boosts later responding. Their birds worked harder after a long non-pay stretch. Davison et al.'s birds, given a long blackout, forgot the old payoff. The twist is what is being measured: de Rose and Ginsburg looked at response rate, while Davison et al. looked at choice bias.

Wildemann et al. (1973) already matched blackout to extinction for peak shift. Davison et al. confirm the two tactics can feel much the same to the subject: preference after 60 seconds of extinction ended up at about the same level as after 60 seconds of blackout.

Smith et al. (2022) stretch the idea to resurgence: the length of past phases steers future relapse. Davison et al. show the same principle in preference carry-over.

04 Why it matters

If you run mixed schedules or token boards that flip values, insert at least a 60-second blackout or non-reinforcement gap. This lets the learner start fresh instead of dragging old biases into the new contingency. One easy move: add a one-minute 'reset' break before you change the reinforcement rate.

→ Action — try this Monday

Before you switch reinforcement ratios, give a 60-second lights-off or no-pay pause.

02 At a glance

Intervention: not applicable
Design: single case other
Population: not specified
Finding: not reported

03 Original abstract

Pigeons were trained in a procedure in which sessions included seven four- or 10-reinforcer components, each providing a different reinforcer ratio that ranged from 27:1 to 1:27. The components were arranged in random order, and no signals differentiated the component reinforcer ratios. Each condition lasted 50 sessions, and the data from the last 35 sessions were analyzed. Previous results using 10-s blackouts between components showed some carryover of preference from one component to the next, and this effect was investigated in Experiment 1 by varying blackout duration from 1 s to 120 s. The amount of carryover decreased monotonically as the blackout duration was lengthened. Preference also decreased between reinforcers within components, suggesting that preference change during blackout might follow the same function as preference change between reinforcers. Experiment 2 was designed to measure preference change between components more directly and to relate this to preference change during blackout. In two conditions a 60-s blackout occurred between components, and in two other conditions a 60-s period of unsignaled extinction occurred between components. Preference during the extinction period progressively fell toward indifference, and the level of preference following extinction was much the same as that following blackout. Although these results are consistent with Davison and Baum's (2000) theory of the effects of reinforcers on local preference, other findings suggest that theory is incomplete: After a sequence of reinforcers from one alternative, some residual preference remained after 60 s of extinction or blackout, indicating the possibility of an additional longer term accumulation of reinforcer effects than originally suggested.

Journal of the Experimental Analysis of Behavior, 2002 · doi:10.1901/jeab.2002.77-65