Behavior-dependent reinforcer-rate changes in concurrent schedules: A further analysis.
Pigeons only shift choice when payoff changes are both behavior-dependent and fast—keep contingency windows under 30 seconds.
01 Research in Context
What this study did
The team worked with pigeons on two side-by-side keys. Each key paid off on its own variable-interval schedule. Sometimes the payoff rate on one key suddenly changed.
The catch was timing. In one condition the rate shift happened right after the bird switched keys. In another, the same shift came no matter what the bird did. The researchers watched how the birds split their pecks.
What they found
Birds only adjusted their choice when the payoff change was tied to their own behavior and happened within 30 seconds. If the shift came later, or was not tied to their switch, they kept pecking as before.
In short, pigeons notice local payoff swings only if the contingency is obvious and quick.
How this fits with other research
Wetherington (1979) showed pigeons can almost perfectly tell two payoff densities apart. The new study adds a time rule: the density shift must fall inside a 30-second window or the bird acts as if nothing happened.
Herrnstein (1964) found that birds prefer variable over fixed intervals. The present study suggests that preference can flip if the local rate change is too slow to detect.
Kendall (1974) showed that signaled intermittent payoff can be preferred over 100% payoff. The present authors agree: timing and signaling together drive choice.
Why it matters
When you set up concurrent teaching stations, keep the payoff change tight and response-based. If a child earns more tokens for staying at the math table, deliver the extra token right after the stay, not minutes later. A short, clear window helps the learner feel the contingency and keeps the behavior in place.
Time your differential reinforcement: deliver the richer payoff within 30 seconds of the target response.
02 At a glance
03 Original abstract
Six pigeons were trained on concurrent variable-interval schedules in three different procedures. The first procedure was a standard concurrent schedule, and the relative reinforcer frequency for responding was varied. The second was a schedule in which a relative left-key response rate (over a fixed period of time) exceeding .75 produced, in the next identical time period, a higher reinforcer rate on the right key. If this criterion was not exceeded, equal reinforcer rates were arranged on the two keys in this period. This was the dependent procedure. In the third (independent) procedure, the periods of higher right-key reinforcer rates occurred with the same probability as in the second procedure, but occurred independently of behavior. In the second and third procedures, the fixed-time period (window) was varied from 5 s to 60 s, and to 240 s in the second procedure only. Performance on the two keys was similar in the concurrent and independent procedures. The procedure used in the dependent conditions generally affected performance when the windows were shorter than about 30 s. Models of performance that assume that subjects do not discriminate changes in local relative reinforcer rates cannot account for the data. Moreover, existing models are inherently unable to account for the effects of contingencies of reinforcement between responding on one alternative and gaining reinforcers on another that are arranged or that emerge as a result of time allocated to alternative schedules. Undermatching on concurrent variable-interval schedules may result from such emergent contingencies.
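The dependent procedure in the abstract is itself a small algorithm: measure the relative left-key response rate over a fixed window, and if it exceeds the .75 criterion, pay a richer rate on the right key during the next window. The sketch below illustrates that logic only. The function name, the response-generation model, and all rate values are illustrative assumptions, not the study's parameters or actual schedules.

```python
import random

def run_dependent_procedure(windows, window_s=30, criterion=0.75,
                            p_left_response=0.8, responses_per_s=1.0,
                            base_rate=0.05, rich_rate=0.15, seed=0):
    """Illustrative sketch of the 'dependent' procedure: if the relative
    left-key response rate in one fixed window exceeds the criterion (.75),
    the NEXT window arranges a higher reinforcer rate on the right key.
    Responses are simulated as simple Bernoulli choices; all rate values
    are made-up placeholders, not the study's variable-interval schedules."""
    rng = random.Random(seed)
    right_rate = base_rate  # reinforcer rate in force for the right key
    log = []
    for w in range(windows):
        left = right = 0
        for _ in range(int(window_s * responses_per_s)):
            if rng.random() < p_left_response:
                left += 1
            else:
                right += 1
        rel_left = left / (left + right) if (left + right) else 0.0
        log.append((w, rel_left, right_rate))
        # The contingency: behavior in THIS window sets the NEXT
        # window's right-key payoff. In the study's 'independent'
        # control, rich windows occurred with the same probability
        # but regardless of rel_left.
        right_rate = rich_rate if rel_left > criterion else base_rate
    return log
```

With an exclusive left-key responder (`p_left_response=1.0`), every window exceeds the criterion, so every window after the first carries the richer right-key rate; this is the emergent left-responding-to-right-payoff contingency the abstract argues existing models cannot represent.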
Journal of the Experimental Analysis of Behavior, 1991 · doi:10.1901/jeab.1991.56-1