Choosing among multiple alternatives: Relative and overall reinforcer rates
With three or four options the richest one captures the first response, so two-choice matching rules no longer predict initial moves.
Research in Context
What this study did
Beeby et al. (2017) let pigeons peck three or four keys at once. Each key paid off on its own schedule.
The birds could switch any time. The team watched which key got the first peck after food.
What they found
The richest key almost always won the first peck, even if it had just paid off.
Two-key rules did not apply. More choices changed the game.
How this fits with other research
Pliskoff (1963) showed pigeons match response ratios to payoff ratios with two or three keys. Beeby et al. add: once you pass two keys, the first move is grabbed by the richest option, not by strict matching.
Landon et al. (2002) saw both quick and slow shifts in preference after each payoff. Beeby et al. zoom in on the very first shift and find it locked onto the richest key.
Beeby et al. (2017) also ran the same birds with changeover responses removed from the counts. Preference sometimes flipped, showing that the raw totals on a data sheet can hide the real pattern.
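Strict matching makes a concrete quantitative prediction for these schedules, and the first-response finding stands out against it. A minimal sketch, assuming the 9:3:1 reinforcer ratio from Experiment 1:

```python
# Strict matching: response shares proportional to reinforcer shares.
def matching_shares(reinforcer_ratio):
    """Response share each alternative gets under strict matching."""
    total = sum(reinforcer_ratio)
    return [r / total for r in reinforcer_ratio]

# Experiment 1 arranged reinforcers 9:3:1 across three keys.
shares = matching_shares([9, 3, 1])
print([round(s, 2) for s in shares])  # [0.69, 0.23, 0.08]
```

Beeby et al.'s first-visit data departed from this graded split: the first post-reinforcer response went to the richest key almost every time, closer to an all-or-none allocation.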
Why it matters
When you give a client three leisure items or three work bins, the richest one will pull the first response. Do not assume two-choice matching rules still hold. Watch where the learner goes right after reinforcement; that first move tells you which option is strongest. If you want to balance time across tasks, you may need to enrich the leaner choices or add changeover delays to level the field.
Track which option your client touches first after each reinforcer; if one item always wins, rebalance the payoffs or insert a brief delay before that item can be chosen again.
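That first-touch tally takes only a few lines to keep. A minimal sketch, assuming a hand-logged list of first touches (the item names here are invented for illustration):

```python
from collections import Counter

# Hypothetical session log: the option touched first after each
# reinforcer delivery (names are made up, not from the study).
first_touches = ["puzzle", "puzzle", "tablet", "puzzle",
                 "puzzle", "blocks", "puzzle"]

counts = Counter(first_touches)
total = len(first_touches)
for item, n in counts.most_common():
    print(f"{item}: {n}/{total} ({n / total:.0%})")
```

If one item dominates the tally the way the richest key did in the study, that is the option whose payoff to rebalance or delay.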
Original abstract
Choice behavior among two alternatives has been widely researched, but fewer studies have examined the effect of multiple (more than two) alternatives on choice. Two experiments investigated whether changing the overall reinforcer rate affected preference among three and four concurrently scheduled alternatives. Experiment 1 trained six pigeons on concurrent schedules with three alternatives available simultaneously. These alternatives arranged reinforcers in a ratio of 9:3:1 with the configuration counterbalanced across pigeons. The overall rate of reinforcement was varied across conditions. Preference between the pair of keys arranging the 9:3 reinforcer ratio was less extreme than the pair arranging the 3:1 reinforcer ratio regardless of overall reinforcer rate. This difference was attributable to the richer alternative receiving fewer responses per reinforcer than the other alternatives. Experiment 2 trained pigeons on concurrent schedules with four alternatives available simultaneously. These alternatives arranged reinforcers in a ratio of 8:4:2:1, and the overall reinforcer rate was varied. Next, two of the alternatives were put into extinction and the random interval duration was changed from 60 s to 5 s. The ratio of absolute response rates was independent of interval length across all conditions. In both experiments, an analysis of sequences of visits following each reinforcer showed that the pigeons typically made their first response to the richer alternative irrespective of which alternative was just reinforced. Performance on these three- and four-alternative concurrent schedules is not easily extrapolated from corresponding research using two-alternative concurrent schedules.
Journal of the Experimental Analysis of Behavior, 2017 · doi:10.1002/jeab.269