On conditioned reinforcing effects of negative discriminative stimuli.
Even a "no" signal can reinforce watching or asking if it helps the learner predict what happens next.
01 Research in Context
What this study did
Pigeons pecked a key for food. Sometimes a red light came on. Red meant no food would follow.
The birds could also peck a second key. This key only showed the red light. The team asked: will birds work just to see the bad news?
What they found
The birds kept pecking the news key even though it only ever showed the red no-food light.
But this held in only one condition: the one where pecking the food key through extinction made food scarcer in the next component. When the red light let the birds avoid that cost, it was enough on its own to keep the observing response alive.
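The two conditions can be made concrete with a toy simulation. Everything below — the component probabilities, the reinforcement rates, and the `run_session` helper — is invented for illustration; the paper's actual schedules and parameter values differ.

```python
import random

def run_session(observes, penalty_condition, n_components=400, seed=0):
    """Toy model of the contingency. A session alternates VI-like
    components (food sometimes available) with extinction components.
    In the penalty condition, pecking the food key through extinction
    makes food scarcer in the next VI component. All numbers are
    hypothetical, not taken from the paper."""
    rng = random.Random(seed)
    food = 0
    penalized_next = False
    for _ in range(n_components):
        in_extinction = rng.random() < 0.5
        if in_extinction:
            # An observing peck turns the key red (S-); a bird that
            # observes then withholds food-key pecks for the component.
            pecks_through_ext = not observes
            if penalty_condition and pecks_through_ext:
                penalized_next = True
        else:
            p_food = 0.3 if penalized_next else 0.8  # hypothetical rates
            if rng.random() < p_food:
                food += 1
            penalized_next = False
    return food

# No-penalty condition: observing changes nothing, so food counts match.
no_penalty_gap = run_session(True, False) - run_session(False, False)
# Penalty condition: the observer earns more food over the session.
penalty_gap = run_session(True, True) - run_session(False, True)
```

Because both birds share the same seed and draw sequence, the no-penalty gap is exactly zero, which mirrors why producing S- alone did not maintain observing there; only when ignoring the warning is costly does the observer come out ahead.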
How this fits with other research
Harrison et al. (1959) first showed that turning noise on or off can reinforce behavior. Allen et al. (1989) extend that idea: even a signal for loss can reinforce when it helps the animal avoid a cost.
Weisman et al. (1975) found that pigeons will peck to turn the lights off during extinction. Both studies use single-case lab designs and show that responding during extinction is maintained by its consequences. The 1989 paper adds that a bare stimulus change can do the maintaining, though the authors favor a molar account of how the stimulus acquires that power over a purely informational one.
Crane et al. (2008) later showed that when the top cue is extinguished, weaker cues regain control. Together these papers paint a clear picture: stimuli ignored during training can still control behavior if the payoff is right.
Why it matters
Your client may keep looking at or asking about the "no" card even after you stop showing it. That glance is not naughty; it is reinforced by the information it gives. Instead of blocking the look, give a quick clear answer and move on. You will reduce the strength of the observing response faster than if you try to suppress it.
When you remove a reinforcer, give one brief, clear cue, then redirect; do not let the client keep scanning for clues.
02 At a glance
03 Original abstract
Observing responses by pigeons were studied during sessions in which a food key and an observing key were available continuously. A variable-interval schedule and extinction alternated randomly on the food key. In one condition, food-key pecking during extinction decreased reinforcement frequency during the next variable-interval component, and in the other condition such pecking did not affect reinforcement frequency. Observing responses either changed both keylight colors from white to green (S+) or to red (S-) depending on the condition on the food key, or the observing responses never produced the S+ but produced the S- when extinction was in effect on the food key. Observing responses that produced only S- were maintained only when food-key pecking during extinction decreased reinforcement frequency in the subsequent variable-interval component. The red light conformed to conventional definitions of a negative discriminative stimulus, rendering results counter to previous findings that production of S- alone does not maintain observing. Rather than offering support for an informational account of conditioned reinforcement, the results are discussed in terms of a molar analysis to account for how stimuli acquire response-maintaining properties.
Journal of the Experimental Analysis of Behavior, 1989 · doi:10.1901/jeab.1989.52-335