Producing mands in concurrent operant environments
When manding is stuck, run a two-choice concurrent setup and test a prompt or a richer reinforcement schedule first; each jump-started new requests in roughly half the children tested.
01 Research in Context
What this study did
Seaver et al. (2020) tested four quick ways to boost manding when a child has two choices. They set up two buttons or picture cards side-by-side. Each side paid off with snacks or toys on its own timer.
One by one they added a prompt, a richer reinforcement schedule, a motivating-operation manipulation, or all three combined, and watched which tweak made each of eight children press or point more.
What they found
Every child learned to ask for at least one new item. Prompts worked for four children, schedule changes for three, and motivating-operation manipulations for two. The combined package worked for the one child who received it.
No single fix helped everyone, but most kids responded to at least one.
How this fits with other research
Seaver et al. (2023) ran the same two-choice setup with three boys with autism and replicated the prompt-plus-schedule effect. Because the 2020 paper did not report diagnoses, the 2023 study extends the finding to children on the spectrum.
Allen et al. (2016) also aimed for varied mands, but they used colored placemats and a lag schedule instead of concurrent choices. Both teams produced new requests, showing that different routes can reach the same goal.
Griffith et al. (2012) taught mands by interrupting a favorite activity and prompting a request for the missing item. That interrupted-chain method and the 2020 two-choice method both rely on well-timed prompting, giving you two tools for the same job.
Why it matters
If a client stalls at one mand, slide two options in front of them and try a quick prompt or a denser reinforcement schedule first. About half the time that is enough to produce new requests, and the test runs in any natural setting with minimal prep.
Place two preferred items on separate cards, give a gentle verbal prompt toward one, and set each card on its own VI schedule; count whether new mands appear within ten trials.
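If you want to reason about how the two cards' payoff timers interact before running live trials, the arrangement can be sketched as a toy simulation. This is a hypothetical illustration only: the function name, parameter values, and the modeling of a prompt as a simple response bias are all assumptions, not details from Seaver et al. (2020).

```python
import random

def run_concurrent_vi(trials=10, vi_left=4.0, vi_right=8.0,
                      prompt_bias=0.7, seed=1):
    """Toy simulation of a two-choice concurrent VI arrangement.

    Each card pays off after a variable interval (mean vi_left / vi_right
    seconds); a verbal prompt toward the left card is modeled crudely as
    a response bias. All names and values here are illustrative.
    """
    rng = random.Random(seed)
    # Draw the next payoff time for each card from its VI distribution.
    next_left = rng.expovariate(1 / vi_left)
    next_right = rng.expovariate(1 / vi_right)
    clock = 0.0
    tally = {"left": 0, "right": 0, "reinforced": 0}
    for _ in range(trials):
        clock += 1.0  # one response opportunity per second, for simplicity
        choice = "left" if rng.random() < prompt_bias else "right"
        tally[choice] += 1
        # A response is reinforced only if that card's interval has elapsed.
        if choice == "left" and clock >= next_left:
            tally["reinforced"] += 1
            next_left = clock + rng.expovariate(1 / vi_left)
        elif choice == "right" and clock >= next_right:
            tally["reinforced"] += 1
            next_right = clock + rng.expovariate(1 / vi_right)
    return tally
```

In practice you would record a child's actual responses rather than simulate them; the sketch just shows why the richer (shorter-interval) side tends to collect more reinforced responses across a ten-trial block.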
02 At a glance
03 Original abstract
This study examined strategies designed to increase the rate of targeted, low-probability mands in a concurrent operant environment. We examined the effects of schedule manipulations, prompt manipulations, motivating operation manipulations, and combined behavioral process manipulations. Increases in target mands were produced for all 8 participants. Schedule manipulations were effective in producing increased rates of targeted mands for 3 of 8 participants, whereas prompt manipulations were effective for 4 of 8 participants. Motivating operation manipulations were effective in producing increased rates of targeted mands for 2 of 8 participants and combined behavioral process manipulations were effective for the single participant exposed to the combination.
Journal of Applied Behavior Analysis, 2020 · doi:10.1002/jaba.592