Enhancing Pre-service Teachers’ Visual Analysis Skills for Single-Case Graphs: the Role of Trend and Intervention Effect
Trend lines can trick even trained eyes—add statistical aids to your visual analysis routine.
01 Research in Context
What this study did
Bosch et al. (2026) gave future teachers a short course on reading single-case graphs. The goal was to help them tell real intervention effects from plain trend lines.
The training used brief lectures, practice graphs, and feedback. Afterward the teachers tried to judge whether changes in the data came from the intervention or just an ongoing slope.
What they found
The teachers left the class more aware that trend can fool the eye. Yet when tested, they still mixed up real effects with steep slopes.
In short, the class boosted knowledge but not accuracy. Trend lines kept winning.
How this fits with other research
Older work saw the same trap. Dykens et al. (1991) showed that teachers change programs based on trend even when the data are flat. Malagodi et al. (1989) added that sparse data make the problem worse.
Those studies warned us; Bosch et al. tried to fix it with training and failed. The gap between knowing and doing is still there.
Wolfe et al. (2018) offer a patch. Their conservative dual-criterion (CDC) method adds a quick statistical check that agreed with expert visual analysts. Pairing such aids with visual training may work better than visual training alone.
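To make the CDC idea concrete, here is a minimal Python sketch of its general logic — not Wolfe et al.'s exact implementation, and the function and variable names are my own: fit a mean line and a least-squares trend line to the baseline, shift both by 0.25 baseline standard deviations toward the expected effect, then count treatment points that land beyond both lines against a binomial criterion.

```python
import math
import statistics

def cdc_check(baseline, treatment, expect_increase=True):
    """Rough sketch of a conservative dual-criterion (CDC) check.

    Fits a mean line and a least-squares trend line to the baseline,
    shifts both by 0.25 baseline SDs in the expected direction, and
    counts treatment points falling beyond BOTH shifted lines. The
    count is compared with a binomial criterion (p < .05, p = .5).
    """
    n_b = len(baseline)
    mean_b = statistics.fmean(baseline)
    sd_b = statistics.stdev(baseline)

    # Least-squares trend over baseline sessions 0..n_b-1.
    x_mean = (n_b - 1) / 2
    slope = sum((x - x_mean) * (y - mean_b) for x, y in enumerate(baseline)) \
            / sum((x - x_mean) ** 2 for x in range(n_b))
    intercept = mean_b - slope * x_mean

    shift = 0.25 * sd_b if expect_increase else -0.25 * sd_b

    hits = 0
    for i, y in enumerate(treatment):
        t = n_b + i  # continue the session axis into treatment
        trend_line = intercept + slope * t + shift
        mean_line = mean_b + shift
        beyond = (y > trend_line and y > mean_line) if expect_increase \
                 else (y < trend_line and y < mean_line)
        hits += beyond

    # Smallest k with P(X >= k) < .05 for X ~ Binomial(n_t, 0.5).
    n_t = len(treatment)
    needed = next(k for k in range(n_t + 1)
                  if sum(math.comb(n_t, i) for i in range(k, n_t + 1))
                  / 2 ** n_t < 0.05)
    return hits >= needed, hits, needed

# A pure baseline trend that simply continues (1..6, then 7..12)
# fails the check — exactly the false positive the study describes:
# cdc_check([1, 2, 3, 4, 5, 6], [7, 8, 9, 10, 11, 12])[0] -> False
```

Note how this captures the article's point: data that merely continue a baseline slope fall on the projected trend line, not beyond it, so the CDC rule rejects them even when the eye sees "improvement."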
Why it matters
Your graph may look like a win, but a steep trend can fake it. Add a simple rule like CDC before you call an intervention effective. Teach your RBTs to do the same. One extra step beats a false positive.
Run the CDC rule on your last three graphs and compare the result with your visual call.
02 At a glance
03 Original abstract
This study investigates the effectiveness of a visual analysis training intervention for teachers. It provides new insight into the influence of data trends on visual analysis and judgments of single-case graphs. Participants were trained in visual analysis and then asked to judge the effectiveness of interventions based on single-case graphs with varying trend and intervention effects. Results partially replicated previous findings, showing that trend effects can lead to false positives in judging intervention effectiveness. However, the training did not significantly improve participants’ ability to differentiate between trend and intervention effects. While training can raise awareness of trends, it may not fully equip individuals to interpret complex data patterns accurately. These findings have implications for researchers and practitioners using single-case designs, emphasizing the need to consider data trends carefully and combine visual analysis with statistical or heuristic approaches.
Journal of Behavioral Education, 2026 · doi:10.1007/s10864-026-09615-0