Blending Active Student Responding with Online Instruction to Evaluate Response Accuracy and Student Engagement
Swap response cards for typed answers in synchronous online classes to boost next-day quiz scores.
Research in Context
What this study did
Hollins and team ran an online graduate class. They switched back and forth each lecture. Some days students typed short answers on their laptops. Other days they held up printed response cards to the camera.
The instructor asked quiz questions right after each lecture and again on later exams, including the cumulative final. The goal was to see which response format produced higher accuracy.
What they found
Typed answers won. Students scored higher on the next-day quiz when they had typed during class. The edge stuck around on the final exam too.
Response cards still helped, just not as much. Writing beat holding paper up to a webcam.
How this fits with other research
Lovitt et al. (1970) saw the same lift fifty years ago. Their paper-based programmed book with built-in checks also beat plain lectures. The tool changed, the benefit stayed.
Ruiz (1998) warned us that college behavior-analysis courses rarely check what really works. Hollins answers that call with hard data from an online room.
Zonneveld et al. (2024) used the same flip-back design to compare live versus video Excel demos. Both studies show the power of simply testing formats head-to-head.
Why it matters
If you teach online, ask for typed answers during the lecture. It requires no extra tools and lifts quiz scores. Drop the printed cards and use the chat box instead. Students keep the gain when the big exam comes.
Open the Zoom chat and ask for a one-sentence answer after each key point.
Original abstract
High rates of active student responding and opportunities to respond are considered best-practice instructional strategies for learning. Many educators in higher education have shifted from teaching primarily in-person to either a hybrid or an online format over the past decade. The global pandemic hastened further shifts from in-person to online learning for many institutions of higher education. Given this rapid shift to online instruction, it is critical to evaluate evidence-based teaching practices in online formats. Although there is a robust body of literature that supports the effectiveness of embedding opportunities to respond and active student responding during in-person instruction, to date, there is limited research that evaluates the effects of increased opportunities to respond during synchronous online courses in post-secondary settings. Using an alternating treatments design, this study evaluated the effects of two active student response modalities on response accuracy for seventeen students enrolled in a synchronous online graduate course. The results suggest that students performed more accurately on post-lecture queries following conditions that required written active student responses compared to response cards. Moreover, the accuracy of correct responding maintained across the exams and the cumulative final exam. Limitations and future implications are discussed.
Journal of Behavioral Education, 2024 · doi:10.1007/s10864-022-09499-w