ABA Fundamentals

Preference for intermittent reinforcement.

Kendall (1974) · Journal of the Experimental Analysis of Behavior
★ The Verdict

Clear signals make intermittent reinforcement preferable—even over 100 % payoff—so use salient cues when thinning schedules.

✓ Read this if you're a BCBA shaping new skills or fading reinforcement in clinic or classroom settings.
✗ Skip if you work only with continuous reinforcement or non-contingent reward systems.

01 Research in Context

01

What this study did

Kendall (1974) let pigeons pick between two keys. One key always paid off. The other paid off only sometimes.

Colored lights flashed when food was coming. The birds could see the difference between payoff and no-payoff moments.

The setup is called a concurrent-chain schedule. Birds first chose which chain to enter, then worked inside that chain.
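The two-link structure can be sketched as a toy simulation. This is illustrative only: the 50% payoff on the lean key is an assumption (the abstract doesn't state the exact percentage for Experiment I), and the real 15-second delays are omitted.

```python
import random

CERTAIN, INTERMITTENT = "certain", "intermittent"

def run_trial(choice, correlated_stimuli=True, p_food=0.5):
    """One cycle of the choice procedure: pick a key, see a light, get an outcome.

    Parameter names are hypothetical; p_food=0.5 is an assumed payoff rate.
    """
    if choice == CERTAIN:
        # White light, then a delay that always ends with food.
        return "white light", "food"
    # Lean key: food or a brief timeout after the same delay.
    outcome = "food" if random.random() < p_food else "timeout"
    if correlated_stimuli:
        # The colored light signals the upcoming outcome -- the key manipulation.
        light = "green light" if outcome == "food" else "red light"
    else:
        # Control condition: light color carries no information.
        light = random.choice(["green light", "red light"])
    return light, outcome
```

In the correlated condition the bird can tell, the moment the light comes on, whether food is coming; in the control condition it cannot. That informational difference is the variable Kendall manipulated.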

02

What they found

The birds usually picked the key that only sometimes paid off. They liked intermittent reinforcement better than 100 % reinforcement.

The colored lights made the contingency clear. When the birds could see "food coming" versus "no food," they stayed with the leaner schedule.

03

How this fits with other research

Herrnstein (1964) already showed pigeons prefer variable intervals over fixed ones. Kendall (1974) adds the twist: signals make the preference stronger.

Fantino (1967) found birds like mixed ratios when the average payoff beats a fixed ratio. The 1974 study shows the signal, not just the math, drives the choice.
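A quick expected-value check makes the point that raw math can't explain the choice. The 50% figure for the lean key is a hypothetical stand-in; only the 100% figure comes from the study.

```python
# Hypothetical payoff probabilities (only the 100% key is stated in the study).
p_certain = 1.0   # the "always pays" key
p_lean = 0.5      # assumed rate for the intermittent key

# Expected food deliveries per choice cycle:
assert p_lean < p_certain  # the lean key delivers strictly less food...
# ...yet with correlated stimuli the pigeons chose it anyway -- the signal,
# not the expected amount of food, drove the preference.
```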

Clark et al. (1970) showed brief stimuli paired with food work as conditioned reinforcers. Kendall (1974) uses those same brief stimuli to explain why birds stick with intermittent payoff.

04

Why it matters

Your client may pick a lean schedule if the contingency is obvious. Use clear signals—tokens, lights, or praise—right before reinforcement. This can make thinner schedules more acceptable, helping you stretch reinforcement without losing engagement.

→ Action — try this Monday

Add a brief, unique cue (like a colored card or specific word) right before each reinforced response while thinning the schedule.

02 At a glance

Intervention: other
Design: single case, other
Finding: positive

03 Original abstract

Two experiments were conducted demonstrating that under certain conditions pigeons may peck at a higher rate on a key that produces intermittent reinforcement following a delay than on one that always produces reinforcement following the same delay duration. In both experiments, concurrent chain schedules were employed. In Experiment I, a single peck on one key led to a white light and a delay of 15 sec, which always terminated with food. A peck on the other key led to its illumination by one of two colored lights and a delay period of 15 sec. The delay was followed by either food presentation or timeout, either one lasting 3 sec. In a control group, the lights on this key were not correlated with food or timeout. Under the correlated stimuli, birds more often pecked the key leading to intermittent reinforcement, whereas with uncorrelated stimuli they pecked the key leading to the white light and 100% reinforcement. In Experiment II, concurrent variable-interval schedules were employed in the first link. The results showed generally that the relative rate was higher on the key leading to intermittent reinforcement when the stimuli were correlated with reinforcement and timeout than on the key leading to 100% reinforcement. There was some indication that this performance was affected by (1) the duration of the delay, (2) the percentage of reinforcement on the key yielding the higher percentage of reinforcement (the key with the white light), and (3) prior experimental conditions.

Journal of the Experimental Analysis of Behavior, 1974 · doi:10.1901/jeab.1974.21-463