The relation between response rates and reinforcement rates in a multiple schedule.
Response rates track reinforcement rates in a predictable ratio, but explicit rules can override this link in verbal humans.
Research in Context
What this study did
Researchers placed pigeons in a chamber with two keys. In alternating components of a multiple schedule, the main key paid off on one of two variable-interval (VI) food schedules, each signaled by its own stimulus. A second key paid off on a third, common VI schedule that ran concurrently in both components.
The team varied how often the schedules paid off and recorded peck rates to see whether the ratio of response rates across the two components tracked the reinforcement rates.
What they found
Peck rates lined up with two of the three predictions. When the two component schedules paid off equally, the birds pecked them at equal rates no matter how rich the common schedule was. When the payoffs were unequal, the ratio of peck rates grew as the common schedule got richer.
The third prediction, that the ratio would top out at the ratio of the two reinforcement rates, was left in doubt by the data.
How this fits with other research
Peters et al. (2013) later asked whether telling human subjects the rule would change this rate-to-rate link. Their college students shifted to a delayed DRO (differential reinforcement of other behavior) contingency only when the rule was stated aloud. The 1968 pigeons had no rules; pure rates did the job.
Harte et al. (2017) pushed further. They gave humans direct rules, then flipped the payoffs. Rule-following stuck even when rates changed. Again, the 1968 birds tracked rates minute-by-minute, not rules.
Together the three studies draw a line: non-verbal creatures follow the math of rates; verbal humans can override that math with a stated rule.
Why it matters
When you work with clients who can understand language, state the contingency clearly. A rule like 'wait three minutes, then ask' can outmuscle the raw payoff rate. With non-verbal learners, skip the rules and fix the rates. Make the wanted response pay off twice as often and you should see roughly twice as many responses. Check your data session-by-session; if the ratio drifts, adjust the payoff ratio, not your explanation.
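The "twice the payoff, twice the responding" arithmetic can be sketched in a few lines. This is an illustration, not code from the study; the strict matching prediction sets the response ratio equal to the reinforcement ratio, and the `sensitivity` parameter is an added assumption that models the undermatching often reported in applied data.

```python
def predicted_response_ratio(r1, r2, sensitivity=1.0):
    """Predicted ratio of response rates from reinforcement rates.

    Strict matching (sensitivity=1.0): the ratio of responses on
    option 1 vs option 2 equals the ratio of reinforcers earned.
    sensitivity < 1.0 models undermatching (an assumption here,
    not a claim from the 1968 paper).
    """
    return (r1 / r2) ** sensitivity

# Target response reinforced 40 times/hr, the alternative 20 times/hr
print(predicted_response_ratio(40, 20))       # strict matching -> 2.0
print(predicted_response_ratio(40, 20, 0.8))  # undermatching -> ~1.74
```

If observed ratios run consistently below the prediction, that is the drift the session-by-session check is meant to catch.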
Count the last 50 reinforcers you delivered; if the target response gets 70% of them, it should also get about 70% of the responses. If not, shift the payoff ratio today.
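A quick way to run that count, sketched in Python. The session log, the `"ask"` label, and the 10-point tolerance are all hypothetical choices for the example:

```python
def matching_check(reinforcers, responses, target="ask", tolerance=0.10):
    """Compare the target's share of reinforcers to its share of responses.

    reinforcers, responses: lists of event labels, one per event.
    Returns (reinforcer_share, response_share, drifted), where drifted
    is True when the two shares differ by more than `tolerance`.
    """
    r_share = reinforcers.count(target) / len(reinforcers)
    b_share = responses.count(target) / len(responses)
    return r_share, b_share, abs(r_share - b_share) > tolerance

# Hypothetical session log: 50 reinforcers, 100 responses
reinforcers = ["ask"] * 35 + ["other"] * 15   # target earns 70% of reinforcers
responses = ["ask"] * 52 + ["other"] * 48     # target gets only 52% of responses
r, b, drifted = matching_check(reinforcers, responses)
print(f"reinforcers {r:.0%}, responses {b:.0%}, drift: {drifted}")
```

Here the 18-point gap between shares exceeds the tolerance, so the check flags drift and the payoff ratio, not the explanation, is what gets adjusted.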
Original abstract
In a multiple schedule, exteroceptive stimuli change when the reinforcement schedule is changed. Each performance in a multiple schedule may be considered concurrent with other behavior. Accordingly, two variable-interval schedules of reinforcement were arranged in a multiple schedule, and a third, common variable-interval schedule was programmed concurrently with each of the first two. A quantitative statement was derived that relates as a ratio the response rates for the first two (multiple) variable-interval schedules. The value of the ratio depends on the rates of reinforcement provided by those schedules and the reinforcement rate provided by the common variable-interval schedule. The following implications of the expression were evaluated in an experiment with pigeons: (a) if the reinforcement rates for the multiple variable-interval schedules are equal, then the ratio of response rates is unity at all reinforcement rates of the common schedule; (b) if the reinforcement rates for the multiple schedules are unequal, then the ratio of response rates increases as the reinforcement rate provided by the common schedule increases; (c) the limit of the ratio is equal to the ratio of the reinforcement rates. Satisfactory confirmation was obtained for the first two implications, but the third was left in doubt.
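The abstract's three implications are consistent with a ratio expression in which each component's key-pecking matches relative reinforcement against the common key. A reconstruction in our own notation (the paper's actual derivation is not quoted here and may differ):

```latex
% B_1, B_2 : response rates in the two multiple-schedule VI components
% r_1, r_2 : reinforcement rates of those components
% r_c      : reinforcement rate of the common concurrent VI
\[
  \frac{B_1}{B_2}
  \;=\; \frac{r_1/(r_1 + r_c)}{r_2/(r_2 + r_c)}
  \;=\; \frac{r_1\,(r_2 + r_c)}{r_2\,(r_1 + r_c)}
\]
% (a) r_1 = r_2            =>  B_1/B_2 = 1 for every r_c
% (b) r_1 > r_2            =>  B_1/B_2 grows as r_c grows
% (c) r_c -> \infty        =>  B_1/B_2 -> r_1/r_2
```

Each implication falls out directly: with equal component rates the fraction is 1 regardless of \(r_c\); with unequal rates the term \((r_2+r_c)/(r_1+r_c)\) rises toward 1 as \(r_c\) increases; and in the limit the ratio approaches \(r_1/r_2\), the prediction the data left in doubt.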
Journal of the Experimental Analysis of Behavior, 1968 · doi:10.1901/jeab.1968.11-271