Practitioner Development

Comparing choice and questionnaire measures of the acceptability of a staff training procedure.

Reid et al. (1995) · Journal of Applied Behavior Analysis
★ The Verdict

A quick choice trial reveals staff preferences that surveys miss.

✓ Read this if you are a BCBA who trains staff or introduces new monitoring tools
✗ Skip if you work only with clients directly and never supervise staff

01 Research in Context

01

What this study did

The authors asked 24 staff to rate two ways of being watched at work. One way was the familiar system they already used. The other was a new system they had never tried.

Each staff member filled out a short survey. Then they picked which system they would rather keep. The study compared the survey answers with the real choice.

02

What they found

The surveys said both systems were equally fine. The choice test told a different story. Staff picked the familiar system 92 percent of the time and never chose the unfamiliar one.

In short, the survey hid a strong real-world preference.

03

How this fits with other research

Morris et al. (2024) later showed the same gap with clients. People often like a treatment that is not the most effective. The 1995 paper first spotted this problem with staff.

Aquino et al. (2024) repeated the pattern in college students. They liked interactive computer training but learned no more than with other formats. Again, liking did not equal learning.

Radley et al. (2019) found a faster way to spot true preference. Group Plickers gave the same clear rankings as one-on-one trials. All four studies agree: choice measures beat surveys at showing what people really want.

04

Why it matters

Before you roll out a new data sheet, camera, or feedback form, give staff a 30-second choice between the old and the new. A tiny side-by-side trial will tell you more than a five-question survey. Save the paperwork for later and let their pick guide the rollout.

→ Action — try this Monday

Put the old and new data sheets on the desk and ask, 'Which one do you want to use today?' Tally the picks before the shift starts.
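To see why the tally can disagree with a survey, here is a minimal sketch in Python. The picks and 1-to-5 ratings below are invented for illustration, not data from the study; they mirror the pattern the paper describes, where average ratings look identical while the actual choices are lopsided.

```python
from collections import Counter

# Hypothetical morning tally: each staff member's pick of data sheet
picks = ["old", "old", "new", "old", "old", "old", "old", "old"]

# Hypothetical 1-5 questionnaire ratings from the same staff, per sheet
survey = {"old": [4, 4, 5, 4, 4, 5, 4, 4],
          "new": [4, 5, 4, 4, 4, 4, 5, 4]}

counts = Counter(picks)
pct_old = 100 * counts["old"] / len(picks)

def mean(xs):
    return sum(xs) / len(xs)

# Survey means come out identical, but the choice count does not
print(f"Survey means: old={mean(survey['old']):.2f}, new={mean(survey['new']):.2f}")
print(f"Choice: {pct_old:.1f}% picked the old sheet")
```

A five-minute spreadsheet column would do the same job; the point is simply that the choice proportion, not the rating average, is the number to watch.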

02 At a glance

Intervention: not applicable
Design: single-case (other)
Population: not specified
Finding: mixed

03 Original abstract

We compared questionnaire and choice measures of acceptability while evaluating effects of staff familiarity versus unfamiliarity with the system used to monitor performance during a training program. Staff members rated both monitoring formats equally favorably on the questionnaire, whereas when given a choice, they frequently chose the familiar format and never chose the unfamiliar format. These results suggest that traditional questionnaire evaluations may not be sufficiently sensitive measures of acceptability relative to choice measures.

Journal of Applied Behavior Analysis, 1995 · doi:10.1901/jaba.1995.28-95