The effect of multiple S-delta periods on responding on a fixed-interval schedule. V. Effect of periods of complete darkness and of occasional omissions of food presentations.
Even when lights go out or food is skipped, the fixed-interval clock keeps ticking and behavior stays on schedule.
Research in Context
What this study did
Dews (1966) worked with pigeons on a fixed-interval schedule.
The birds pecked for food every few minutes.
The room sometimes went dark and food sometimes failed to appear.
The team asked: do these disruptions destroy the birds’ sense of time?
What they found
The pigeons kept the same scallop-shaped response curve.
Darkness and omitted food deliveries did not break the clock.
Time itself, not lights or food, controlled when they pecked.
How this fits with other research
Catania et al. (1974) later showed that blackout can shorten the post-reinforcement pause at very long intervals.
This looks like a clash, but the pause still happens; it is just a bit shorter, so both papers agree that temporal control survives.
McSweeney et al. (1993) built a math rule—linear waiting—showing the pause tracks the last food-to-food gap, a direct child of B’s finding.
Bauman et al. (1996) stretched the rule further, showing rats can learn progressive intervals that keep getting longer, again with time in charge.
Why it matters
Your client’s world can flicker—lights off, reinforcers missed, staff late.
Dews (1966) says the schedule still teaches waiting if the interval stays steady.
Trust the clock; keep the interval, and the learner’s behavior will scallop even when the room feels chaotic.
Keep the FI length the same for one target behavior all week, even if you must skip a treat or dim the lights, and watch the scallop hold.
At a glance
Original abstract
In pigeons under fixed-interval schedules of reinforcement, responding during most of the interval can be suppressed by stimulus conditions never present when a response is promptly followed by reinforcing stimuli. When the external stimuli obtaining immediately before reinforcement are presented during brief probe periods in the course of the interval, the rate of responding in the probe depends on the temporal position of the probe during the interval; the rate of responding is lower during a probe early in the interval than during one late in the interval. The present experiments show that the temporal dependency still holds (1) in birds with no experience under unmodified fixed-interval schedules, (2) when the time between probes is spent in complete darkness, and (3) when food presentations are omitted at the end of 50% of intervals. The results strengthen and extend the conclusion from previous studies that the time relations themselves are the primary control of rate of responding under fixed-interval schedules of reinforcement.
Journal of the Experimental Analysis of Behavior, 1966 · doi:10.1901/jeab.1966.9-573