Observational learning in monkeys.
Monkeys learned a three-component operant schedule simply by watching, showing that observation alone can build complex, rule-governed behavior.
01 · Research in Context
What this study did
Scientists let monkeys watch trained monkeys work a three-component schedule. The task was to press a lever under fixed-ratio, variable-interval, and extinction components.
Observer monkeys sat in clear cages next to trained workers. No food, no prompts, just watching for 113 to 210 hours.
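To make the schedule structure concrete, here is a minimal sketch in Python of the three components the demonstrators worked under, as described in the original abstract (FR 32, VI 3-min, EXT). The class and parameter names are hypothetical simplifications, not the authors' apparatus code, and the extinction component is reduced to its defining property: responding never pays.

```python
import random

class MultipleSchedule:
    """Sketch of a multiple schedule with FR 32, VI 3-min, and EXT components."""

    def __init__(self, fr=32, vi_mean=180.0, seed=0):
        self.fr = fr                    # fixed-ratio requirement (presses per food)
        self.vi_mean = vi_mean          # mean variable interval, in seconds
        self.rng = random.Random(seed)
        self.presses = 0                # presses accumulated in the current FR run
        # time at which the VI component next "arms" a reinforcer
        self.vi_armed_at = self.rng.expovariate(1.0 / vi_mean)

    def press(self, component, t):
        """Return True if a lever press at time t (seconds) earns food."""
        if component == "FR":
            self.presses += 1
            if self.presses >= self.fr:  # every 32nd press pays off
                self.presses = 0
                return True
            return False
        if component == "VI":
            if t >= self.vi_armed_at:    # first press after the interval elapses
                self.vi_armed_at = t + self.rng.expovariate(1.0 / self.vi_mean)
                return True
            return False
        return False                     # EXT: responding is never reinforced
```

An animal that has mastered this schedule responds in rapid bursts during FR, at a steady moderate rate during VI, and not at all during EXT, and that is exactly the temporal pattern the observers reproduced.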
What they found
After only watching, the observers' first responses were as fast as the demonstrators', and within about 8 hours of access the full schedule-appropriate pattern emerged, including withholding responses during extinction. Control monkeys that saw nothing never developed the pattern.
The result shows that complex operant performances can be acquired by observation alone.
How this fits with other research
Premack et al. (1963) first taught chimps the same kind of three-component schedule, but each animal had to work for its own food. Delini-Stula (1970) shows monkeys can skip that step and learn by watching.
Millard (1979) later showed that one pigeon's schedule performance can serve as a discriminative stimulus for another bird's pecking. Together the three papers trace a progression: first train one animal directly, then let others learn by watching, and finally use the performer itself as a living cue.
Leander et al. (1972) shaped monkey vocalizations with a simple fixed-ratio food schedule. Delini-Stula (1970) goes further, showing monkeys can pick up an entire chain of rules without any direct payoff.
Why it matters
Your learners may absorb more than you think just by watching peers. Seat a skilled student where others can see the timing of responses, pauses, and stops, and model the whole routine before asking a new client to touch the materials. You may be able to cut physical prompting trials and still achieve full schedule control.
Seat the new learner beside a proficient peer and let them watch five full cycles before any physical practice.
02 · At a glance
03 · Original abstract
Observer monkeys were housed next to demonstrator monkeys that were conditioned to respond on a multiple reinforcement schedule whose components were fixed-ratio 32, variable-interval 3-min, and extinction 5-min followed by an additional 30 sec of extinction during which every response started a new 30-sec interval. After observational periods from 113 to 210 hr long, during which observers could not perform the response and were given no extrinsic reinforcers, their first-response latencies to fixed ratio and variable interval were as short as the demonstrators; and their rates of responding were well above pre-observational baseline levels. About 8 hr later, a temporal pattern of responding appropriate to the multiple schedule emerged, including non-emission of responses during extinction. Controls lacking the chance to observe did not develop typically patterned responding after 60 hr in one case and, in two other cases, after 80 hr during which, on two occasions, every one of 50 responses was reinforced. In a second experiment, the stimulus lights associated with fixed ratio and variable interval were presented simultaneously. Subjects chose one of the schedules by responding to one of the levers beneath the lights. All subjects initially chose fixed ratio. Seeing the demonstrators switch to variable interval, due to increases in the fixed-ratio requirement, had no effect upon observers, which continued to choose fixed ratio.
Journal of the Experimental Analysis of Behavior, 1970 · doi:10.1901/jeab.1970.14-225