Service Delivery

Applications of operant demand to treatment selection III: Consumer behavior analysis of treatment choice

Gilroy et al. (2022) · Journal of the Experimental Analysis of Behavior
★ The Verdict

Community buzz beats data in caregiver treatment choices—package your EBP with visible social support to outcompete fads.

✓ Read this if: you're a BCBA who writes treatment plans for kids with autism and markets them to parents
✗ Skip if: you're a clinician in a fully payer-controlled system where caregivers never see a treatment menu

01 Research in Context

01

What this study did

Gilroy et al. (2022) asked caregivers to pick autism treatments from a menu. Each option showed two numbers: how well it worked (utilitarian reinforcement) and how much community support it had (informational reinforcement).

The team built a simulated web store and recruited caregivers through Amazon Mechanical Turk. Caregivers clicked the treatments they would buy, and the store varied the two numbers so the same treatment looked better or worse across trials.
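As a rough illustration (not the authors' actual task or values), a menu item in a purchase task like this can be modeled as a pair of scores, with a choice rule that over-weights informational reinforcement, which is the pattern the results suggest. The treatment names and weights below are hypothetical:

```python
# Hypothetical sketch of a Hypothetical Treatment Purchase Task (HTPT) menu.
# The weights are illustrative assumptions, not values from the study.

def choose(menu, w_ur=0.3, w_ir=0.7):
    """Pick the treatment with the highest weighted score.

    menu: dict mapping treatment name -> (utilitarian, informational),
          each scored 0-100. Setting w_ir > w_ur mimics caregivers
          over-weighting community support (IR) relative to efficacy
          data (UR).
    """
    return max(menu, key=lambda t: w_ur * menu[t][0] + w_ir * menu[t][1])

menu = {
    "evidence_based": (90, 20),   # strong data, little community buzz
    "popular_fad":    (30, 95),   # weak data, strong community buzz
}
print(choose(menu))  # prints "popular_fad": heavy IR weighting favors the fad
```

Flipping the weights (`w_ur=0.7, w_ir=0.3`) makes the evidence-based option win, which is one way to frame the study's point: which treatment "sells" depends on which source of reinforcement dominates.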

02

What they found

Caregivers tended to choose treatments with strong community buzz, even when the data said they worked poorly. On balance, social reinforcement outweighed hard evidence.

This explains why non-EBPs stay popular. A chatty Facebook group can outsell a stack of peer-reviewed charts.

03

How this fits with other research

Schreck et al. (2016) saw the same pattern in BCBAs. Both studies show that ease and peer talk trump data on both sides of the supply chain.

van der Miesen et al. (2024) found that treatments for self-injurious behavior (SIB) can work well at home. Yet Gilroy's caregivers still under-weight exactly that kind of utilitarian proof: the meta-analysis supplies the very numbers they ignore.

Cox (2024) argues payers will soon reward social value. Gilroy’s data say social value already drives choices, so value-based care should count community buzz as a real asset, not noise.

04

Why it matters

You can't beat Instagram hype with a bar graph. Package your EBP with visible parent meet-ups, closed-group Facebook Live sessions, and peer testimonials. Make the social reinforcement as loud as the data.

Free CEUs

Want CEUs on This Topic?

The ABA Clubhouse has 60+ free CEUs — live every Wednesday. Ethics, supervision & clinical topics.

Join Free →
→ Action — try this Monday

Add a parent-peer video testimonial to your next treatment proposal packet

02 At a glance

Intervention
not applicable
Design
hypothetical purchase tasks (online survey)
Sample size
104
Finding
caregivers favored informational reinforcement (community support) over utilitarian reinforcement (efficacy)

03 Original abstract

Behavior analysts and psychologists advocate for the use of therapies and strategies based on credible, scientific evidence. Researchers and clinicians regularly advocate for Evidence-based Practices (EBPs) over questionable "alternatives" because caregivers seldom choose interventions based on scientific evidence alone. This study applied methods and concepts from Consumer Behavior Analysis to conduct a reinforcer-based evaluation of the consequences that influence treatment choices. Hypothetical Treatment Purchase Tasks (HTPTs) were designed to evaluate how utilitarian (UR; i.e., the efficacy of treatment) and informational sources of reinforcement (IR; i.e., community support for treatment) jointly influence treatment-related choices. A total of 104 caregivers were recruited using the Amazon Mechanical Turk (MTurk) framework to complete two HTPTs. Results indicated that caregivers overall favored treatments with greater IR over those with greater UR, suggesting that indirect contingencies for treatment choices exerted greater overall influence than the direct contingencies of treatment choices (i.e., efficacy). This finding extends the literature on treatment choice by providing a reinforcer-based perspective on why 'fad', questionable, and pseudoscientific practices can achieve and maintain high levels of adoption by caregivers. This work concludes with a discussion of Consumer Behavior Analysis and how reinforcer-based interpretations of choice can be used to improve efforts to support and advocate for evidence-based child behavior treatments.

Journal of the Experimental Analysis of Behavior, 2022 · doi:10.1002/jeab.758