Assessment & Research

Cumulative Instructional Time and Relative Effectiveness Conclusions: Extending Research on Response Intervals, Learning, and Measurement Scale

Black et al. (2016) · Behavior Analysis in Practice
★ The Verdict

Track cumulative instructional seconds, not just sessions—shorter response intervals can accelerate learning when time is the metric.

✓ Read this if you're a BCBA who runs flashcard or DTT programs and wants a clearer picture of progress.
✗ Skip if you're a clinician who already logs every instructional second and graphs by cumulative time.

01 Research in Context

01

What this study did

The team ran computer-based flashcard reading with two students with disabilities. They tried three speeds: 1-second, 3-second, or 5-second response windows.

An adapted alternating-treatments design rotated the three speeds across sessions. Data were plotted two ways: by session count and by total instructional seconds.
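That re-plotting step is mechanically simple. A minimal sketch, using made-up session logs and approximating instructional time as trials × response window (the study timed actual instruction; this is a simplification):

```python
def cumulative_seconds(sessions):
    """Turn per-session (trials, response_interval_s) logs into a cumulative time axis."""
    axis, total = [], 0
    for trials, interval_s in sessions:
        total += trials * interval_s  # approximate seconds of instruction this session
        axis.append(total)
    return axis

# Hypothetical logs: three 20-trial sessions under two response intervals.
one_sec = cumulative_seconds([(20, 1)] * 3)   # [20, 40, 60]
five_sec = cumulative_seconds([(20, 5)] * 3)  # [100, 200, 300]
```

On a session-count graph both conditions sit at x = 1, 2, 3; on a time graph the 5-second condition has consumed five times as much instruction to reach the same point, which is exactly the difference the two plotting scales expose.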

02

What they found

When the graph used total seconds, the 1-second window produced the steepest learning curve for one student.

When the graph used sessions, all three speeds looked similarly effective. The scale you pick changes what looks "best."

Bottom line: short intervals can win, but only if you watch the clock, not the calendar.

03

How this fits with other research

Cariveau et al. (2016) also found that 2-second intervals worked fastest in DTT with autistic kids. Both studies say "go short," but Black et al. add "count seconds, not sessions."

Sanford et al. (1980) showed decades ago that 1-second inter-trial gaps beat 4-second ones for autistic children. Black et al. extend that idea to flashcards and show that the metric you use decides the winner.

Wilson et al. (1973) warned that picking the wrong measure hides effects. Black et al. bear that out: session counts masked the 1-second edge that cumulative seconds revealed.

04

Why it matters

Stop logging only "session done." Start a stopwatch and add up real instructional seconds. If you do, you may find that quicker response windows give you more learning per minute. Try 1-second or 2-second intervals next time you run flashcards or DTT, and graph the data both ways to see which picture tells the truth.

Free CEUs

Want CEUs on This Topic?

The ABA Clubhouse has 60+ free CEUs — live every Wednesday. Ethics, supervision & clinical topics.

Join Free →
→ Action — try this Monday

Time every teaching trial this week, then re-plot last month’s data by total seconds—see if the fastest condition changes.
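If you want a single number to compare, divide mastery by minutes. A minimal sketch with hypothetical totals (the function and figures are illustrative, not from the study):

```python
def rate_per_minute(items_mastered, instructional_seconds):
    """Items mastered per minute of actual instruction."""
    return items_mastered * 60 / instructional_seconds

# Suppose both conditions reach 9 mastered items after three 20-trial sessions.
r1 = rate_per_minute(9, 60)   # 1-s windows: 60 s of instruction -> 9.0 items/min
r5 = rate_per_minute(9, 300)  # 5-s windows: 300 s of instruction -> 1.8 items/min
```

Identical session-count graphs, a fivefold difference in learning per minute: the clock, not the calendar, separates the conditions.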

02 At a glance

Intervention
other
Design
alternating treatments
Sample size
2
Population
students with disabilities
Finding
mixed

03 Original abstract

Adapted alternating treatments designs were used to evaluate three computer-based flashcard reading interventions (1-s, 3-s, or 5-s response intervals) across two students with disabilities. When learning was plotted with cumulative instructional sessions on the horizontal axis, the session-series graphs suggest that the interventions were similarly effective. When the same data were plotted as a function of cumulative instructional seconds, time-series graphs suggest that the 1-s intervention caused the most rapid learning for one student. Discussion focuses on applied implications of comparative effectiveness studies and why measures of cumulative instructional time are needed to identify the most effective intervention(s).

Key points: Comparative effectiveness studies may not identify the intervention which causes the most rapid learning. Session-series repeated measures are not the same as time-series repeated measures. Measuring the time students spend in each intervention (i.e., cumulative instructional seconds) allows practitioners to identify interventions that enhance learning most rapidly. Student time spent working under interventions is critical for drawing applied conclusions.

Behavior Analysis in Practice, 2016 · doi:10.1007/s40617-016-0114-3