Strengthening scientific verbal behavior: an experimental comparison of progressively prompted and unprompted programmed instruction and prose tutorials.
Interactive computer lessons with fading hints teach ABA vocabulary better than reading alone.
01 · Research in Context
What this study did
College students learned ABA terms with two computer lessons. One lesson gave hints that slowly faded until the student answered correctly; the other gave no hints. A comparison group read a plain prose tutorial instead.
Students wrote essays before and after each lesson. Researchers counted how many correct behavior terms they used.
What they found
Both computer lessons worked well. Students used more correct ABA words in their essays after either lesson.
The gains stuck when students wrote new essays in a different room, showing real learning, not just memorizing.
How this fits with other research
Rojahn et al. (1994) showed that adding a 10-second delay after each question boosted scores. Watson et al. (2007) kept that delay and added fading hints, building on the earlier technique.
Mailey et al. (2021) and Rosales et al. (2018) later used the same fading-hint idea to train staff to run PECS. They moved the method from college students to direct-care workers.
Downing et al. (1976) also found that a prompting textbook beat plain reading for concept learning, echoing the same theme: active prompts beat passive text.
Why it matters
If you teach ABA vocabulary, consider swapping reading packets for interactive computer frames. You do not need live lectures: students in this study wrote clearer, more precise behavioral interpretations after either programmed lesson. The same prompting logic has since been used to train direct-care staff, so it can work for both university courses and in-house staff training.
Add a free online quiz with fading hints to your next training slide deck.
02 · At a glance
03 · Original abstract
Web-based software was used to deliver and record the effects of programmed instruction that progressively added formal prompts until attempts were successful, programmed instruction with one attempt, and prose tutorials. Error-contingent progressive prompting took significantly longer than programmed instruction and prose. Both forms of programmed instruction substantially increased the appropriate use of behavioral vocabulary during subsequent interpretive essays. These behavioral gains extended to a different setting, suggesting that more was being learned than simply how to answer programmed tutorial frames correctly.
Journal of Applied Behavior Analysis, 2007 · doi:10.1901/jaba.2007.93-05