ABA Fundamentals

Uncertainty reduction, conditioned reinforcement, and observing.

Fantino et al. (1980) · Journal of the Experimental Analysis of Behavior
★ The Verdict

Signals are reinforcing when they cut wait time, not when they merely remove doubt.

✓ Read this if you are a BCBA who uses visual or auditory cues to set up reinforcement.
✗ Skip if you work only with immediate reinforcement and no cues.

01 Research in Context

01

What this study did

Researchers let pigeons choose between two keys. One key lit up with signals about upcoming food. The other key gave the same food with no signals.

They wanted to know why birds value signals. Is it because signals remove doubt (uncertainty reduction)? Or because they mark a shorter wait until food (delay reduction)?

02

What they found

The birds preferred the key with signals, even though both keys delivered food on the same overall schedule. Preference was strongest when the signaled short wait was the rarer outcome, an asymmetry the uncertainty-reduction account cannot explain.

The signals were valued because they marked a reduction in expected time to food, not because they removed uncertainty.
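The logic behind that finding can be sketched numerically with the delay-reduction hypothesis. This is a rough illustration, not the paper's fitted model; the schedule values below are assumed for the example:

```python
# Sketch of the delay-reduction hypothesis (DRH): a signal is valued in
# proportion to the reduction in expected time to food it announces,
# T - t, where T is the average wait from trial onset and t is the
# remaining wait once the signal appears.

def delay_reduction(T: float, t: float) -> float:
    """Conditioned-reinforcing strength of a signal: T - t, clamped at zero."""
    return max(T - t, 0.0)

# Assumed example: a mixed FI 10-s / FI 60-s schedule, each equally likely.
T = 0.5 * 10 + 0.5 * 60                  # average wait = 35 s
short_signal = delay_reduction(T, 10.0)  # short-wait signal: 25 s of delay reduction
long_signal = delay_reduction(T, 60.0)   # long-wait signal: 0, despite being informative

# Both signals remove the same uncertainty, but only the short-wait signal
# announces a delay reduction, so only it acts as a conditioned reinforcer.
```

This is why preference tracks delay reduction rather than information per se: the "bad news" signal is just as informative, yet it adds no value.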

03

How this fits with other research

Weil (1984) later showed the same rule in everyday pigeon pecking: when food was delayed, birds pecked less, evidence that the delay-reduction idea works outside special choice setups.

Gentry et al. (1980) worked in the same lab the same year. They also used two-key choice but varied the exact seconds of delay. Together the papers show both relative and absolute delay matter.

Parmenter (1999) kept the delay theme but asked whether larger reinforcers make birds more tolerant of delay. Reinforcer amount did not change patience, so delay still ruled choice.

04

Why it matters

When you signal upcoming reinforcement to clients, place the cue close to delivery, not far ahead. A two-second warning beats a ten-second warning even when both carry the same information. Shortening the gap raises the value of your cue.
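As a rough sketch of that arithmetic, using the delay-reduction logic (the 60-second average wait is an assumption for illustration, not a value from the study):

```python
# Illustrative only: the 60-s average wait is an assumption. The value of a
# cue grows with the fraction of the average wait it "cancels", (T - t) / T,
# so a cue 2 s before delivery signals more delay reduction than a cue
# 10 s before delivery, even though both carry the same information.

T = 60.0  # assumed average wait to reinforcement, in seconds

def cue_value(t_remaining: float) -> float:
    """Proportion of the average wait the cue cancels: (T - t) / T."""
    return (T - t_remaining) / T

ten_second_cue = cue_value(10.0)  # cancels ~0.83 of the average wait
two_second_cue = cue_value(2.0)   # cancels ~0.97 -> the stronger cue
```

The absolute numbers are arbitrary; the point is the ordering, which holds for any assumed average wait longer than the cue lead times.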

→ Action — try this Monday

Move your readiness cue closer to delivery time; try two seconds before reinforcement instead of ten.

02 At a glance

Intervention: not applicable
Design: single-case (other)
Population: not specified
Finding: positive

03 Original abstract

In a concurrent-chains procedure, pigeons chose between equivalent mixed and multiple fixed-interval schedules of reinforcement. In the first experiment, preference for the multiple schedule was higher when the probability of the shorter fixed interval was less than .50 than for complementary points, an outcome consistent with the delay-reduction hypothesis of conditioned reinforcement and observing, but inconsistent with the uncertainty-reduction hypothesis which requires symmetrical preferences with a maximum when the two intervals are equiprobable. A second experiment assessed preference for equivalent mixed and multiple schedules when each choice outcome resulted in two reinforcements, one on the longer and one on the shorter fixed interval. The order of the two fixed intervals was determined probabilistically. Pigeons again preferred multiple to mixed schedules, although multiple-schedule preference did not vary systematically with the likelihood of the shorter fixed interval occurring first. The results from these choice procedures are consistent with those from the observing-response literature in suggesting that the strength of a stimulus cannot be well described as a function of the degree of uncertainty reduction the stimulus provides about reinforcement.

Journal of the Experimental Analysis of Behavior, 1980 · doi:10.1901/jeab.1980.33-3