Evaluating the Efficacy, Preference, and Cultural Responsiveness of Student-Generated Content in an Undergraduate Behavioral Course
Student-made videos make class feel relevant but won’t budge quiz scores.
Research in Context
What this study did
Nava and colleagues had graduate students create short multimedia examples (videos and pictures) showing behavioral principles in their everyday lives. The instructor then added these peer-generated clips to the usual slide lectures in an undergraduate behavior analysis course of 24 students.
The researchers compared quiz scores and student ratings between topics taught with the peer-generated supplements and topics taught with textbook examples alone.
What they found
Quiz scores stayed flat. Students learned the concepts under both conditions, but the peer videos did not raise test performance.
Yet students liked the peer clips far more. They called them "relatable" and "culturally on point."
How this fits with other research
Whiting et al. (2025) also gave college students a small choice: picking next week's topic with an anonymous poll. Like Nava, they saw no jump in grades but a large lift in optional class attendance. Both studies suggest that low-effort student input boosts engagement, not achievement.
Vinson et al. (2010) let kids name their own quality-of-life domains. Self-generated items felt more meaningful, just as Nava’s self-generated videos felt more culturally responsive. The pattern repeats: people value what they create.
Cacciani et al. (2013) warn that peer tactics can look good yet still fail evidence-based criteria. Nava’s null quiz data echo that caution—likability alone does not equal learning.
Why it matters
If you teach RBT coursework or run staff trainings, add a five-minute “film your example” assignment. You may not raise test scores, but you will lift buy-in and cultural fit—two variables that keep adult learners in their seats.
Ask one learner to record a 30-second clip of a target behavior this week and share it with the class.
Original abstract
Increasing diversity in the field of behavior analysis may begin with an evaluation of culturally responsive practices in the college classroom. This study leveraged the various backgrounds of students in a university nationally recognized for diversity to evaluate the effects of peer-generated course materials on student performance in an undergraduate behavior analysis course. First, graduate students created multimedia examples (videos, pictures) of the behavioral principles in their everyday lives. Next, we curated an online bank of these examples corresponding to 4 topics (respondent conditioning, reinforcement, antecedent control, extinction and punishment) taught in an undergraduate behavior analysis course. We used a multiple-probe and between-group design to evaluate the effects of these peer-generated materials as supplements to traditional instruction. Students showed evidence of concept acquisition on all topics. However, results showed that peer-generated examples, as supplements to textbook and lectures, did not enhance students’ performance on knowledge assessments but were rated by students as more preferred, culturally responsive, and diverse than textbook examples.
Behavior Analysis in Practice, 2019 · doi:10.1007/s40617-019-00344-7