Comparing instructor‐led, video‐model, and no‐instruction control tutorials for creating single‐subject graphs in Microsoft Excel: A systematic replication and extension
Skip the handout—use a short video or live demo to teach Excel graphing; both beat giving trainees a written list of steps.
Research in Context
What this study did
Zonneveld et al. (2024) tested three ways to teach Excel graphing to college students. One group got a live demo. One group watched a short video. A third group only got a written list of steps.
All students then tried to make single-subject graphs. The researchers scored how many steps they did right.
What they found
Both the live demo and the video beat the written list. Students who watched either tutorial made far fewer errors.
The best part: the skills carried over. When asked to build an untrained graph type (a multielement graph), training boosted performance across groups.
How this fits with other research
Geiger et al. (2018) saw the same pattern when they taught discrete-trial teaching (DTT). Live behavioral skills training (BST) scored a hair higher than computer lessons, but both crushed a text-only control.
Carr et al. (1985) seem to disagree. Their training manual worked as well as live coaching for teaching interview skills. The key difference: the 1985 manual had built-in checks and practice. The 2024 study only gave a plain list of steps.
Morante et al. (2024) adds that video feedback alone can bring adults to 100% correct form on a running task. Put together, the picture is clear: passive paper equals poor results; pictures or people equal better results.
Why it matters
If you train staff or students to graph, stop handing out step sheets. Record a three-minute screen-capture video or run a five-minute demo. Either choice saves time and lifts accuracy on the first try.
Film your screen while you build one graph, narrate the steps, and send the clip to your trainees.
Original abstract
Visual inspection of single-subject data is the primary method for behavior analysts to interpret the effect of an independent variable on a dependent variable; however, there is no consensus on the most suitable method for teaching graph construction for single-subject designs. We systematically replicated and extended Tyner and Fienup (2015) using a repeated-measures between-subjects design to compare the effects of instructor-led, video-model, and no-instruction control tutorials on the graphing performance of 81 master's students with some reported Microsoft Excel experience. Our mixed-design analysis revealed a statistically significant main effect of pretest, tutorial, and posttest submissions for each tutorial group and a nonsignificant main effect of tutorial group. Tutorial group significantly interacted with submissions, suggesting that both instructor-led and video-model tutorials may be superior to providing graduate students with a written list of graphing conventions (i.e., control condition). Finally, training influenced performance on an untrained graph type (multielement) for all tutorial groups.
Journal of Applied Behavior Analysis, 2024 · doi:10.1002/jaba.1053