Assessment & Research

Evaluation of publication bias in response interruption and redirection: A meta‐analysis

Dowdy et al. (2020) · Journal of Applied Behavior Analysis
★ The Verdict

RIRD helps with nonsocially-maintained problem behavior, but the glow is slightly brighter in print—dig for unpublished data before you bank on it.

✓ Read this if you're a BCBA running RIRD for vocal stereotypy or SIB in clinics or schools.
✗ Skip if you only treat socially-maintained behavior.

01 Research in Context

01

What this study did

Dowdy and colleagues gathered every RIRD study they could find, published and unpublished, and compared effect sizes across the two groups.

They wanted to see if journals print only the success stories.
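At its core, the bias check boils down to comparing summary effects from the published pool against the unpublished (gray-literature) pool. A minimal sketch with made-up effect sizes, not the paper's data:

```python
# Hypothetical effect sizes; a positive gap suggests the published
# literature paints a rosier picture than the full evidence base.
published = [0.9, 0.85, 0.95, 0.8]
unpublished = [0.7, 0.6, 0.75]

def mean(xs):
    return sum(xs) / len(xs)

gap = mean(published) - mean(unpublished)
print(round(gap, 3))
```

The real analysis uses formal effect size metrics and significance tests rather than raw means, but the logic is the same: a nonzero gap in favor of published studies is the signature of publication bias.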

02

What they found

RIRD still works for nonsocially maintained problem behavior in autism.

Published studies show slightly bigger gains than the unpublished ones. The gap is small, and it shows up on two of the three effect size metrics the authors used.

03

How this fits with other research

Campbell (2003) and Heyvaert et al. (2014) already showed behavioral tactics help in autism. Dowdy zooms in on one tactic and asks, "Are we over-hyping it?"

Kok et al. (2026) used the same math on single-case data. Both teams found treatments work, yet effects shrink when you pool everything.

Tromans et al. (2018) warned that most autism trials are tiny. Dowdy’s bias check adds a new reason to hunt for unpublished files before you trust the mean effect.

04

Why it matters

Before you tell a parent "RIRD will work," search for gray-literature posters or theses on the same procedure. If you can’t find any, quote the smaller, more cautious effect size. It keeps expectations real and saves you from over-promising.

→ Action — try this Monday

Add one quick search for conference posters or dissertations on RIRD before you write the next treatment plan.

02 At a glance

Intervention
response interruption and redirection (RIRD)
Design
meta-analysis
Population
autism spectrum disorder
Finding
positive

03 Original abstract

Publication bias is the disproportionate representation of studies with large effects and statistically significant findings in the published research literature. If publication bias occurs in single-case research design studies on applied behavior-analytic (ABA) interventions, it can result in inflated estimates of ABA intervention effects. We conducted an empirical evaluation of publication bias on an evidence-based ABA intervention for children diagnosed with autism spectrum disorder, response interruption and redirection (RIRD). We determined effect size estimates for published and unpublished studies using 3 metrics, percentage of nonoverlapping data (PND), Hedges' g, and log response ratios (LRR). Omnibus effect size estimates across all 3 metrics were positive, supporting that RIRD is an effective treatment for reducing problem behavior maintained by nonsocial consequences. We observed larger PND for published compared to unpublished studies, small and nonsignificant differences in LRR for published compared to unpublished studies, and significant differences in Hedges' g for published compared to unpublished studies, with published studies showing slightly larger effect. We found little, if any, difference in methodological quality between published and unpublished studies. While RIRD appears to be an effective intervention for challenging behavior maintained by nonsocial consequences, our results reflect some degree of publication bias present in the RIRD research literature.

Journal of Applied Behavior Analysis, 2020 · doi:10.1002/jaba.724
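The three effect size metrics named in the abstract can be sketched for a single reduction-focused case. This is a toy illustration with hypothetical session counts; note that meta-analysts use adjusted variants of Hedges' g for single-case designs, while the classic between-groups formula is shown here for clarity:

```python
import math

def pnd(baseline, treatment):
    """Percentage of nonoverlapping data for a reduction target:
    share of treatment points below the lowest baseline point."""
    floor = min(baseline)
    return 100 * sum(t < floor for t in treatment) / len(treatment)

def log_response_ratio(baseline, treatment):
    """LRR: natural log of the ratio of phase means (negative = reduction)."""
    mb = sum(baseline) / len(baseline)
    mt = sum(treatment) / len(treatment)
    return math.log(mt / mb)

def hedges_g(baseline, treatment):
    """Classic Hedges' g: standardized mean difference with a
    small-sample correction (positive = reduction here)."""
    n1, n2 = len(baseline), len(treatment)
    m1, m2 = sum(baseline) / n1, sum(treatment) / n2
    v1 = sum((x - m1) ** 2 for x in baseline) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in treatment) / (n2 - 1)
    s_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    correction = 1 - 3 / (4 * (n1 + n2) - 9)
    return correction * (m1 - m2) / s_pooled

# Hypothetical stereotypy counts per session
baseline = [12, 15, 11, 14]
treatment = [6, 4, 7, 3, 5]

print(pnd(baseline, treatment))  # 100.0: every treatment point beats the baseline floor
print(round(log_response_ratio(baseline, treatment), 2))  # negative = behavior dropped
print(round(hedges_g(baseline, treatment), 2))
```

Publication bias shows up when these numbers, averaged over the published studies, run systematically larger than the same numbers from unpublished theses and posters.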