Choice and behavioral patterning.
Reinforcement can sculpt how long a response run lasts, giving you a lever to speed up or slow down whole behavioral chains.
Research in Context
What this study did
Julià (1982) worked with pigeons in a lab.
Birds could peck two keys. One key paid off only after short runs of pecks. The other key paid off only after longer runs.
The team changed how often each key paid and how long the bird had to wait between rounds. They watched which run length the bird chose.
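The contingency above can be sketched as a small simulation. This is a hypothetical illustration, not the authors' code: it simplifies the paper's concurrent variable-interval timing to a per-trial arming probability, and the names (`arm_run`, `trial_outcome`, `p_short`) are invented for the example.

```python
import random

# Sketch of the discrete-trials contingency: each trial, the schedule "arms"
# one of two reinforced run lengths; the changeover to the right key produces
# food only if the completed run of left-key pecks matches the armed length.

REINFORCED_RUNS = (1, 4)  # one short and one long run, as in the 1-vs-4 condition

def arm_run(p_short, rng):
    """Arm the short run with probability p_short, else the long run."""
    return REINFORCED_RUNS[0] if rng.random() < p_short else REINFORCED_RUNS[1]

def trial_outcome(run_length, armed_run):
    """Food follows the changeover only when the run matches the armed run."""
    return run_length == armed_run

rng = random.Random(0)
# A bird that always emits runs of 4 collects food only on trials that arm
# the long run, i.e. on roughly 1 - p_short of trials.
outcomes = [trial_outcome(4, arm_run(p_short=0.25, rng=rng)) for _ in range(1000)]
print(sum(outcomes) / len(outcomes))  # close to 0.75
```

Varying `p_short` plays the role of varying the relative frequency of reinforcement for the shorter run, one of the two variables the study manipulated.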
What they found
The birds shifted their pecking toward the run length that was paying.
When the wait between rounds grew, the birds' preference for the shorter runs usually grew too.
The study showed that reinforcement can mold whole strings of responses, not just single pecks.
How this fits with other research
Byrd (1980) had already shown that different schedules make pigeons peck at different speeds. Julià (1982) moves that idea forward by showing schedules can also sculpt the length of a whole run.
Hall (1992) later asked pigeons to vary their four-peck sequences. Birds produced near-random sequences when the payoff required variety. Together with Julià (1982), this tells us reinforcement can control both run length and run variety.
Charlop et al. (1985) took the same question to college students. Most learners matched a target timing pattern within 30 min. The pigeon data and the human data line up: temporal patterning is conditionable across species.
Why it matters
If you shape behavior, think in patterns, not just single responses. A child who rocks three times before asking, or a client who taps twice before pointing, is showing a run. You can reinforce shorter runs to speed transitions, or longer runs to build persistence. Try reinforcing the fifth correct response instead of every response and watch the whole burst change.
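The "reinforce the fifth correct response" idea above is a simple ratio-thinning rule. Here is a minimal sketch; `make_every_nth` is a hypothetical helper invented for this example, not a procedure from the study.

```python
def make_every_nth(n):
    """Return a rule that reinforces only every nth correct response."""
    count = 0
    def deliver(correct):
        nonlocal count
        if not correct:
            return False  # errors are never reinforced and don't advance the count
        count += 1
        if count == n:
            count = 0     # reset after each reinforcer
            return True
        return False
    return deliver

every_fifth = make_every_nth(5)
results = [every_fifth(True) for _ in range(10)]
print(results)  # reinforcer lands on the 5th and 10th correct responses
```

Swapping `n` lets you thin or densify reinforcement and watch how the length of the whole response burst changes.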
Pick a short chain the client already does and reinforce only the third instance to see the run length shift.
Original abstract
Ten pigeons pecked left and right keys in a discrete-trials experiment in which access to food was contingent upon changeovers to the right key after particular runs of left-key pecks. In each of three sets of conditions, two run lengths were reinforced according to a concurrent variable-interval schedule: reinforcement followed runs of either 1 or 2, 1 or 4, or 2 or 4 left-key pecks preceding changeovers. The intertrial interval separating successive pecks was varied from .5 to 10.0 sec, and the relative frequency of reinforcement for the shorter of the two reinforced runs was varied from 0 to .75. The contingencies established local behavioral patterning that roughly approximated that required for reinforcement. For a fixed pair of reinforced run lengths, preference for the shorter of the two frequently increased as the intertrial interval increased and therefore as the minimum temporal durations of both reinforced runs increased. Preference for the shorter of the two also increased as its corresponding relative frequency of reinforcement increased. Both of these effects on preference were qualitatively similar to corresponding effects in previous research with two different kinds of reinforced behavioral patterns, interresponse times and interchangeover times. In all these experiments, analytical units were found in the temporal patterns of behavior, not in the behavior immediately contiguous with a reinforcer. It is suggested that a particular local temporal pattern of behavior is established to the extent to which it is repeatedly remembered when reinforcers are delivered, regardless of whether the delivery of a reinforcer is explicitly contingent upon that pattern.
Journal of the Experimental Analysis of Behavior, 1982 · doi:10.1901/jeab.1982.37-157