The introductory class in higher education: Some old problems and new considerations.
We still lack proof that any college behavior-analysis course format works better than another. Track student mastery before you change formats.
Research in Context
What this study did
Ruiz (1998) reviewed how we teach behavior analysis in college. The paper argues that most courses are built on tradition, not data, and warns that we do not measure whether students actually learn the concepts.
The review found few assessments of course-end performance (final exams) and almost none of longer-term retention. Without those numbers, we cannot tell whether lecture, PSI, or online instruction works best.
What they found
The review found no solid evidence that any college format teaches behavior analysis better. Schools keep changing methods without tracking exam scores or long-term retention.
Ruiz (1998) argues we should test final knowledge and check again months later before we switch to a new style.
How this fits with other research
McIntyre et al. (2002) answered the call. They tracked a computer-managed PSI class and found that a large share of peer feedback was accurate. The study supplies the hard numbers Ruiz (1998) said were missing.
Moore (2022) extends the idea further, laying out a full 15-week graduate syllabus rooted in radical behaviorism. The course blueprint shows how to put Ruiz's advice into action.
Moss et al. (2009) and Yaw et al. (2014) looked at staff training after college. Both used feedback and measured real gains. Their data-heavy approach mirrors what Ruiz (1998) wants for university courses.
Why it matters
Before you flip your next class, give a pre-test, a final exam, and a three-month follow-up. One extra page of data tells you if the new format beats the old one. That small step turns tradition into evidence-based teaching.
Add a 20-item concept quiz at the start and end of your next unit; keep the sheet for comparison.
At a glance
Original abstract
The traditional text and lecture format has remained the norm in introductory college courses despite evidence, such as the personalized system of instruction, that other practices could improve what is learned and retained. The growth of distance learning provides new opportunities to implement effective teaching practices. Unfortunately, an adequate comparison of various teaching practices is not possible (even with regard to the teaching of behavior analysis). Few practices have been assessed with respect to course-end effects (final exam performance), and longer term effects remain almost wholly unexplored. Studies of the retention of academic materials, the practice required for mastery or fluency, and the relation between verbal repertoires and correspondent everyday behaviors suggest course outcomes more modest than those hoped for earlier. Suggestions for changing current practices have little credibility until the size of the gap between present and possible learning outcomes is much better known, and the personal, social, and economic consequences of bridging that gap (or not) are assessed.
The Behavior Analyst, 1998 · doi:10.1007/BF03391968