Evaluation of a training manual for the acquisition of behavioral assessment interviewing skills.
A self-paced manual teaches behavioral interviewing just as well as live BST—use it to free up trainer time for harder skills.
01 Research in Context
What this study did
College students learned how to run a behavioral assessment interview; per the abstract, two graduate students and two advanced undergraduates were trained with each procedure. One group got the usual live BST: watch a model, practice, hear feedback. The other worked through a self-paced training manual that covered the same steps, checklists, and self-scoring.
The researchers used a multiple-baseline design. They tracked how many interview steps each student got right before, during, and after training.
What they found
Both groups reached 90%–100% correct steps after training. The manual alone worked just as well as the live coach, and students kept the skills when tested later. Experts in behavior analysis rated each interviewing skill as relevant, and a group of behavioral clinicians socially validated the outcomes of both procedures.
How this fits with other research
Geiger et al. (2018) repeated the idea 33 years later. They compared live BST to a computer package for teaching DTT. Again, the automated version almost matched live training, with only a small drop in fidelity. The 1985 manual and the 2018 e-learning are conceptual twins: self-paced instruction can replace live BST when a tiny fidelity trade-off is okay.
Moss et al. (2009) pooled 55 staff-training studies. The meta-analysis says the strongest mix is “in-service plus on-the-job coaching with feedback.” That sounds like a contradiction—why does the 1985 paper say a manual alone is enough? The difference is depth. The 1985 study taught one narrow skill: asking interview questions. The meta-analysis covers full job roles that require generalization across clients and settings. For a narrow skill, a manual suffices. For broad workplace performance, add feedback and coaching.
Romani et al. (2023) and Yaw et al. (2014) show the next step. Each study started with a short lecture or manual, then added brief feedback. Accuracy jumped for progress-note writing and data collection. They extend the 1985 finding: start with a manual to save time, then layer on quick feedback when the skill must hold across varied clients.
Why it matters
You can ship a slim training manual instead of blocking out hours for live BST when you only need one discrete skill—like teaching parents intake questions or showing staff how to score a preference assessment. Save the costly coaching for complex, generalized duties such as running an entire DTT program. Monday morning: write a five-page manual with checklists, have new hires work it alone, then spot-check with two minutes of feedback; you’ll likely get the same fidelity as sitting through full live BST.
Try it: replace your next live BST session on intake interviewing with a printed step-by-step manual plus a five-minute feedback check.
02 At a glance
03 Original abstract
Two procedures were used to teach behavioral assessment interviewing skills: a training manual and one-to-one instruction that included modeling, rehearsal, and feedback. Two graduate students and two advanced undergraduates were trained with each procedure. Interviewing skills were recorded in simulated assessment interviews conducted by each student across baseline and treatment conditions. Each training procedure was evaluated in a multiple baseline across students design. The results showed that both procedures were effective for training behavioral interviewing skills, with all students reaching a level of 90%-100% correct responding. Finally, a group of experts in behavior analysis rated each interviewing skill as relevant to the conduct of an assessment interview and a group of behavioral clinicians socially validated the outcomes of the two procedures.
Journal of Applied Behavior Analysis, 1985 · doi:10.1901/jaba.1985.18-323