Practitioner Development

Effects of computer‐aided instruction on the implementation of the MSWO stimulus preference assessment

Arnal Wishnowski et al. · Behavioral Interventions, 2018
★ The Verdict

A three-minute video and a self-instructional manual train staff to run MSWO assessments at roughly 90% accuracy with no live coach.

✓ Read this if you're a BCBA who trains staff or students to give MSWO preference assessments in clinics, schools, or home programs.
✗ Skip if you already have in-vivo BST systems that work and enjoy giving hands-on feedback.

01 Research in Context

01 What this study did

Arnal Wishnowski et al. (2018) built a short online training package: a self-instructional manual and a three-minute video that modeled every step of the MSWO preference assessment.

They asked college students and agency staff to watch the clip and read the packet. No coach stood beside them. The team then scored how well each learner set out items, gave the instruction, and recorded choices.
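As a rough illustration (not code from the study), MSWO results are conventionally scored by ranking items on how early they are chosen: an item's selection percentage is the number of times it was selected divided by the number of trials in which it was available. The function and item names below are hypothetical.

```python
def mswo_selection_percentages(sessions):
    """Score MSWO sessions into selection percentages per item.

    sessions: list of selection orders, e.g. ["cookie", "ball", "book"],
    listing the item picked first, second, third in one session.
    An item picked on trial t (0-indexed) was available on trials 0..t,
    so it was presented t + 1 times and selected once that session.
    Selection percentage = 100 * times selected / times available.
    """
    selected = {}   # times each item was chosen
    available = {}  # trials each item was on the table before being chosen
    for order in sessions:
        for trial, item in enumerate(order):
            available[item] = available.get(item, 0) + trial + 1
            selected[item] = selected.get(item, 0) + 1
    return {item: 100 * selected[item] / available[item] for item in selected}
```

For example, an item chosen first in every session scores 100%, while an item always chosen second scores 50%, yielding the preference hierarchy practitioners report.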

02 What they found

After the self-instruction package, students averaged 94% correct and staff averaged 87% correct, up from baselines of 35% and 23%. Scores stayed well above baseline at retention and generalization checks 7 to 17 days later.

Both students and experienced staff reached these levels with the same short package. No extra feedback or role-play was required.

03 How this fits with other research

Griffith et al. (2020) used the same recipe—manual plus video—to teach trial-based functional analyses. Their undergraduates also reached mastery, showing the format works across different ABA skills.

van Vonderen et al. (2010) added live video feedback after in-person instruction. They saw big gains too, but they needed a real trainer and extra time. The 2018 study suggests you can skip that step when the video model is clear enough.

Petscher et al. (2006) relied on in-person prompts and self-monitoring to keep staff on track with token boards. Their package worked, yet it demanded daily supervisor time. The online-only route in Arnal Wishnowski et al. (2018) cuts that ongoing cost.

04 Why it matters

You can email the package today and have trained implementers tomorrow. Use it during new-hire orientation, practicum courses, or when you open a new site. One three-minute video plus a manual can replace hours of live training while keeping fidelity near 90%.

Free CEUs

Want CEUs on This Topic?

The ABA Clubhouse has 60+ free CEUs — live every Wednesday. Ethics, supervision & clinical topics.

Join Free →
→ Action — try this Monday

Email the free video link and one-page MSWO checklist to every new RBT before their first session.

02 At a glance

Intervention: self-instructional manual plus video modeling
Design: multiple-probe across participants
Sample size: 10 (6 university students, 4 staff)
Population: not specified
Finding: positive
Magnitude: large

03 Original abstract

This study evaluated a self‐instructional online training package to teach students and staff to conduct a stimulus preference assessment using the multiple‐stimulus without replacement procedure. The training package included a self‐instructional manual and video modeling and was delivered online. Training was evaluated using a multiple‐probe design across a total of six university students and four staff members. Overall, students improved from a mean of 35% correct in baseline to a mean of 94% correct following training, and staff improved from a mean of 23% correct in baseline to a mean of 87% correct following training. During retention and generalization simulated assessments conducted from 7 to 17 days following training, all participants performed considerably above baseline. The online delivery of the self‐instructional manual plus video modeling has tremendous potential for providing an effective method for teaching individuals to conduct stimulus preference assessments without face‐to‐face instruction.

Behavioral Interventions, 2018 · doi:10.1002/bin.1508