Practitioner Development

Evaluating the Efficacy of and Preference for Interactive Computer Training with Student-Generated Examples

Aquino et al. (2024) · Behavior Analysis in Practice
★ The Verdict

Students enjoy creating their own examples on screen, but enjoyment alone does not raise test scores.

✓ Read this if you're a BCBA teaching college or high-school behavior-analysis classes
✗ Skip if you're a clinician looking for computer programs that directly boost skill accuracy

01 Research in Context

01

What this study did

Aquino et al. (2024) tested a new computer lesson. Students made their own examples and shared them on screen.

The team asked: Do students like this more than plain videos or text? Does it help them score higher on quizzes?

They ran a single-case study with college students learning behavior-analysis terms.

02

What they found

Students picked the interactive module every time. They said it felt fun and personal.

Yet their quiz scores stayed flat. The fancy format did not beat video or text on learning gains.

03

How this fits with other research

Herzog et al. (2026) saw the same split. Kids liked computer math games, but higher-ability students learned more slowly because the tasks were too easy for them.

Radley et al. (2019) showed that quick group polls match longer one-to-one preference tests. Aquino's team used a short digital poll and got the same clear winner.

Wang et al. (2025) reviewed 15 studies of computerized cognitive training for youth with ASD and found real skill gains. Aquino's null result on scores looks like a contradiction, but the earlier work targeted different skills and included longer practice blocks.

04

Why it matters

Liking matters. If students hate the format, they drop out. Use interactive, student-made examples to keep them in their seats. Just do not expect higher test marks unless you add more practice and feedback loops. Pair the fun module with brief quizzes or fluency drills to turn preference into performance.

→ Action — try this Monday

Keep the interactive, student-made slides, then add three quick practice questions after each chunk.

02 At a glance

Intervention: other
Design: single-case (other)
Population: neurotypical
Finding: mixed

03 Original abstract

Designing effective and preferred teaching practices for undergraduate students are common goals in behavior analytic training programs. A preliminary study by Nava et al. (2019) showed that undergraduate students generally rated peer-generated examples of the principles of behavior analysis as more preferred, relatable, and culturally responsive than traditional textbook examples. However, peer-generated examples did not result in any improvement in performance on concept knowledge assessments. The current study extended the study by Nava et al. by embedding peer-generated examples within interactive computer training (ICT) to provide opportunities for active responding, prompt fading, automated feedback, and practice with examples and nonexamples. Results showed that ICT did not produce reliable improvements in knowledge assessments but were preferred to video examples and textual examples. In addition, students reported that certain interactive features contributed to their preference for ICT. We discuss ways to further improve the efficacy of the preferred ICT package.

Behavior Analysis in Practice, 2024 · doi:10.1007/s40617-024-01007-y