ABA Fundamentals

Decision rules and signal detectability in a reinforcement-density discrimination.

Commons (1979) · Journal of the Experimental Analysis of Behavior, 1979
★ The Verdict

Pigeons act like tiny statisticians, picking the better payoff as long as the schedule feels random and the cues are clear.

✓ Read this if you're a BCBA who mixes reinforcement rates within a session.
✗ Skip if you're a clinician using pure FI or FR chains with no overlap.

01 · Research in Context

01

What this study did

Commons (1979) worked with four pigeons in a lab.

On each trial, one of two probabilistic schedules ran for 12 seconds; the bird then had 6 seconds to report which one, pecking left for the richer schedule and right for the leaner.

The schedules paid off on 75 vs 25 percent of brief cycles, but because both were probabilistic, a given 12-second sample could look similar under either one, so the choice was often hard.
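The sampling structure can be sketched in a few lines. A minimal illustration, assuming the abstract's four 3-second cycles per sample and per-cycle reinforcement probabilities of .75 (richer) vs .25 (leaner):

```python
from math import comb

def pmf(k, p, n=4):
    """Probability of exactly k reinforced cycles in an n-cycle sample."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Distribution of reinforcer counts under each schedule (per the abstract:
# four 3-second cycles, reinforcement probability .75 richer / .25 leaner).
rich = [pmf(k, 0.75) for k in range(5)]
lean = [pmf(k, 0.25) for k in range(5)]

# Counts of 1-3 reinforcers can occur under BOTH schedules, so a single
# 12-second sample is often ambiguous; only the tails are diagnostic.
for k in range(5):
    print(f"{k} reinforcers: rich {rich[k]:.3f}, lean {lean[k]:.3f}")
```

The overlap of the two distributions is exactly what makes this a signal-detection task rather than a simple discrimination.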

02

What they found

The pigeons told the schedules apart almost perfectly; detectability approached the maximum possible d' for all four birds.

They also shifted their choices when pellets were larger for one correct response or when reinforcers fell closer to the choice period.

03

How this fits with other research

Clark et al. (1970) saw the opposite: pigeons moved toward the leaner schedule when the task used fixed, predictable timing.

The studies differ because Commons used random, probabilistic schedules while Clark used steady, clock-like ones.

Davison et al. (1991) later showed birds only notice local rate changes if they happen within 30 seconds, backing Commons's finding that timing matters.

Attwood et al. (1988) added that birds still track overall payoff when effort keeps rising, confirming the broad rule: choice follows the math, but only when the signals are clear.
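"Choice follows the math" can be made concrete. A hypothetical sketch, assuming equal base rates for the two schedules and the .75/.25 cycle probabilities from the abstract: matching predicts a graded choice probability equal to the relative expected payoff, while strict maximizing would predict all-or-none responding.

```python
from math import comb

def pmf(k, p, n=4):
    """Probability of k reinforced cycles given per-cycle probability p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

for k in range(5):
    # Relative expected payoff for reporting "rich" after seeing k
    # reinforcers (equal priors and equal pellet sizes assumed).
    rel = pmf(k, 0.75) / (pmf(k, 0.75) + pmf(k, 0.25))
    matching = rel                           # graded, what the pigeons did
    maximizing = 1.0 if rel > 0.5 else 0.0   # all-or-none alternative
    print(f"k={k}: match {matching:.2f}, maximize {maximizing:.0f}")
```

The abstract's finding — that substimuli with the same reinforcer count produced choice probabilities matching relative expected payoff — corresponds to the graded "matching" column, not the step-function "maximizing" one.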

04

Why it matters

Your client’s world is noisy, just like the pigeons’. If schedules look alike, highlight the difference with salient cues and keep the contingency window short. Check that richer schedules really feel richer, or the learner may drift to the easier-looking one.

Free CEUs

Want CEUs on This Topic?

The ABA Clubhouse has 60+ free CEUs — live every Wednesday. Ethics, supervision & clinical topics.

Join Free →
→ Action — try this Monday

Add a brief, bright cue to the richer schedule and keep each bout under 30 s so the learner notices the difference.

02 · At a glance

Intervention: other
Design: single-case (other)
Sample size: 4
Population: not specified
Finding: strongly positive
Magnitude: very large

03 · Original abstract

Two probabilistic schedules of reinforcement, one richer in reinforcement, the other leaner, were overlapping stimuli to be discriminated in a choice situation. One of two schedules was in effect for 12 seconds. Then, during a 6-second choice period, the first left-key peck was reinforced if the richer schedule had been in effect, and the first right-key peck was reinforced if the leaner schedule had been in effect. The two schedule stimuli may be viewed as two binomial distributions of the number of reinforcement opportunities. Each schedule yielded different frequencies of 16 substimuli. Each substimulus had a particular type of outcome pattern for the 12 seconds during which a schedule was in effect, and consisted of four consecutive light-cued 3-second T-cycles, each having 0 or 1 reinforced center-key pecks. Substimuli therefore contained 0 to 4 reinforcers. On any 3-second cycle, the first center-key peck darkened that key and was reinforced with probability .75 or .25 in the richer or leaner schedules, respectively. In terms of the theory of signal detection, detectability neared the maximum possible d' for all four pigeons. Left-key peck probability increased when number of reinforcers in a substimulus increased, when these occurred closer to choice, or when pellets were larger for correct left-key pecks than for correct right-key pecks. Averaged over different temporal patterns of reinforcement in a substimulus, substimuli with the same number of reinforcers produced choice probabilities that matched relative expected payoff rather than maximized one alternative.
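For readers who want the d' intuition, here is a back-of-envelope calculation. This is a rough Gaussian approximation over the two binomial count distributions, not the exact ROC-based analysis the paper reports:

```python
from math import sqrt

# Mean and SD of reinforcer counts under each schedule (n = 4 cycles,
# p = .75 richer vs .25 leaner, per the abstract). The variances are
# equal because p * (1 - p) is the same for .75 and .25.
n, p_rich, p_lean = 4, 0.75, 0.25
mu_rich, mu_lean = n * p_rich, n * p_lean
sd = sqrt(n * p_rich * (1 - p_rich))

# Gaussian-approximation separability of the two count distributions.
d_prime = (mu_rich - mu_lean) / sd
print(round(d_prime, 2))
```

Even this crude estimate suggests the two schedules are well separated over a full 12-second sample, consistent with the near-ceiling detectability the pigeons showed.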

Journal of the Experimental Analysis of Behavior, 1979 · doi:10.1901/jeab.1979.32-101