ABA Fundamentals

Concurrent schedules of primary and conditioned reinforcement in rats.

Zimmerman (1969) · Journal of the Experimental Analysis of Behavior
★ The Verdict

Conditioned reinforcers can run the show even when primary reinforcers stay the same.

✓ Read this if you're a BCBA shaping skill speed or teaching social chains.

✗ Skip if you're a clinician who only uses edibles and never pairs secondary cues.

01 Research in Context

01

What this study did

Rats pressed a lever that produced water on a fixed-interval schedule. Responses on the same lever also produced a brief light or tone that had been paired with water, arranged concurrently on its own schedule.

The schedule for the brief stimulus changed across conditions, and the team watched which schedule pattern the rats' pressing followed.
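The arrangement described in the abstract can be sketched as a small simulation. This is a minimal sketch, not the study's procedure, and every parameter (press rate, session length, schedule values) is a hypothetical value chosen for illustration: responses earn the primary reinforcer on a fixed-interval (FI) schedule while a brief conditioned stimulus is concurrently scheduled for the same responses on a variable-ratio (VR) schedule.

```python
import random

# Minimal sketch (all parameters hypothetical): one lever, primary
# reinforcement on a fixed-interval (FI) schedule, and a brief conditioned
# stimulus concurrently scheduled on a variable-ratio (VR) schedule for
# responses on the same lever.

def simulate(press_rate=1.0, session_secs=600, fi_secs=60, vr_mean=10, seed=0):
    rng = random.Random(seed)
    reinforcers = stimuli = 0
    last_reinforcer = 0.0
    presses_since_stim = 0
    vr_target = rng.randint(1, 2 * vr_mean - 1)   # sampled VR requirement
    t = 0.0
    while True:
        t += rng.expovariate(press_rate)          # time of the next press
        if t >= session_secs:
            break
        # FI: the first press after fi_secs since the last delivery
        # earns the primary reinforcer.
        if t - last_reinforcer >= fi_secs:
            reinforcers += 1
            last_reinforcer = t
        # VR: on average every vr_mean-th press produces the brief stimulus.
        presses_since_stim += 1
        if presses_since_stim >= vr_target:
            stimuli += 1
            presses_since_stim = 0
            vr_target = rng.randint(1, 2 * vr_mean - 1)
    return reinforcers, stimuli

print(simulate())
```

Note how the two schedules run on the same response stream: the FI caps primary deliveries by time, while the VR delivers the stimulus far more often, which is the structural reason the stimulus can come to control the response pattern.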

02

What they found

The light or tone could speed up or slow down pressing even when water stayed on the same fixed-interval schedule.

In other words, the conditioned reinforcer, not the water, set the pace.

03

How this fits with other research

Neuringer (1973) later showed that making the reinforcer depend on the response itself boosts control even more.

LeBlanc et al. (2003) repeated the basic setup and found the same pattern: schedule context, not just food, steers behavior.

Benvenuti et al. (2024) now use this idea to explain social turn-taking: the brief eye-contact or nod acts like the light, guiding who speaks next.

04

Why it matters

Your praise, tokens, or brief visuals can override the edible you think is "in charge." If you want faster work, put your social cue on a rich schedule and keep the snack lean. If you want slower, do the opposite. Check which cue is really driving the rate.

→ Action — try this Monday

Pair your praise phrase with a bite, then thin the bites while keeping praise every 5 s; watch whether the pace stays high.
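The Monday probe above can be written down as a hypothetical session plan. All names and numbers here are illustrative assumptions, not from the study: bites thin across blocks while praise stays on a dense 5-second schedule.

```python
# Hypothetical thinning plan: the bite ratio gets leaner block by block
# while praise stays tied to a 5-s schedule. All values are illustrative.
thinning_steps = [
    {"block": 1, "bite_ratio": 1, "praise_every_s": 5},  # pair: bite + praise each response
    {"block": 2, "bite_ratio": 3, "praise_every_s": 5},  # thin bites to about 1 in 3
    {"block": 3, "bite_ratio": 6, "praise_every_s": 5},  # leaner still; praise unchanged
]

def deliveries(responses, step, secs_per_response=2):
    """Bites and praise events a block earns, assuming one response
    every `secs_per_response` seconds (a hypothetical pace)."""
    bites = responses // step["bite_ratio"]
    praises = (responses * secs_per_response) // step["praise_every_s"]
    return bites, praises

print(deliveries(30, thinning_steps[0]))  # (30, 12)
print(deliveries(30, thinning_steps[2]))  # (5, 12)
```

The point of the sketch: praise deliveries stay constant across blocks while bites drop sixfold, so if the response rate holds, the conditioned reinforcer is doing the work.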

02 At a glance

Intervention: not applicable
Design: single-case (other)
Population: not specified
Finding: mixed

03 Original abstract

Rats responded on a fixed-interval schedule during which a 3-sec stimulus preceded each water reinforcement. The stimulus was then scheduled concurrently for responses on the same lever according to either a variable-interval or a variable-ratio schedule. Although water reinforcement continued on a fixed-interval schedule, the pattern of responding became typical of a variable-interval or variable-ratio schedule. When the 3-sec stimulus was presented on a variable-interval or variable-ratio schedule, but was omitted on the fixed-interval schedule, the response rate decreased. When the stimulus occurred after the same time periods as those of the variable-interval schedule, but at least 7-sec after the last response, the rate decreased. The rate became higher when the fixed-interval schedule was discontinued and each presentation of the 3-sec stimulus was followed by water on a variable-interval schedule. When both water and the 3-sec stimulus were discontinued for a period of time, resulting in extinction of the lever response, and the 3-sec stimulus alone was then presented on a variable-interval or variable-ratio schedule after lever responses, rate increased and then gradually decreased.

Journal of the Experimental Analysis of Behavior, 1969 · doi:10.1901/jeab.1969.12-261