Assessment & Research

Quantitative integration of single-subject studies: Methods and misinterpretations.

Kollins et al. (1999) · The Behavior Analyst, 1999
★ The Verdict

Pooling single-subject studies with numbers is OK if you treat it as integration, not meta-analysis.

✓ Read this if: you're a BCBA writing literature reviews or grant proposals that need numbers.
✗ Skip if: you're a practitioner who only runs one-client programs and never reads reviews.

01 Research in Context

01

What this study did

Kollins et al. (1999) wrote a position paper. They argued that you can pool single-subject data quantitatively, and that doing so is not the same as conventional meta-analysis.

The paper is theoretical; no new data were collected. It answers critics who argue that quantitative summaries betray the spirit of behavior analysis.

02

What they found

The authors conclude that quantifying effects across single-case studies is legitimate, and that the technique preserves the field's emphasis on individual control.

They add a warning: call it quantitative integration, not meta-analysis. The label matters for staying true to behavior analysis.

03

How this fits with other research

Gaily et al. (1998) came first and had already endorsed PND-based pooling. Kollins et al. (1999) built on that vote of confidence.

Nasr et al. (2000) pushed back a year later, arguing that any averaging turns single cases into a group comparison. This looks like a fight, but the disagreement is about labels and appropriate use, not about right versus wrong.

DeHart et al. (2019) offered a newer path: mixed-effects models keep each subject's curve while still yielding inferential statistics. Their work updates the debate without discarding the original idea.
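To make the mixed-effects intuition concrete, here is a simplified stand-in for that approach: fit each subject's own trend first, then summarize the slopes across subjects. A full mixed-effects model would estimate these jointly; the subject names and data below are invented for illustration.

```python
import statistics

# session-by-session responding, one series per subject (hypothetical data)
subjects = {
    "S1": [2, 4, 5, 7, 9],
    "S2": [1, 1, 2, 2, 3],
    "S3": [3, 5, 8, 10, 12],
}

def slope(ys):
    """Ordinary least-squares slope of ys against session index 0..n-1."""
    n = len(ys)
    xs = range(n)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# each subject keeps its own curve...
slopes = {name: slope(ys) for name, ys in subjects.items()}
# ...and one pooled summary is still available across subjects
pooled = statistics.mean(slopes.values())
print(slopes, pooled)
```

The point of the two-step structure is the same as in the debate above: the per-subject fits preserve individual analysis, and the pooled value is an explicit, separate summary rather than a replacement for the individual data.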

04

Why it matters

You can scan graphs and still run numbers across studies; the paper gives you permission to do both. When you write a review, say "quantitative integration," not "meta-analysis," and keep the individual plots in an appendix. Monday move: add a PND column to your next literature matrix.

→ Action — try this Monday

Add a PND column to your current literature matrix to see effect patterns at a glance.
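Filling in that PND column is a short calculation. Here is a minimal sketch of computing percentage of non-overlapping data for one study; the function name and the sample phase data are illustrative assumptions, not from the paper.

```python
def pnd(baseline, treatment, direction="increase"):
    """Percent of treatment-phase points that do not overlap the baseline range."""
    if direction == "increase":
        # target behavior should rise above the highest baseline point
        cutoff = max(baseline)
        non_overlap = [x for x in treatment if x > cutoff]
    else:
        # target behavior should fall below the lowest baseline point
        cutoff = min(baseline)
        non_overlap = [x for x in treatment if x < cutoff]
    return 100.0 * len(non_overlap) / len(treatment)

# Example: baseline responding vs. intervention-phase responding
print(pnd([2, 3, 4, 3], [5, 6, 4, 7, 8]))  # 80.0
```

One value per study, entered in the matrix, is enough to see effect patterns at a glance while the individual graphs stay available for visual analysis.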

02 At a glance

Intervention
not applicable
Design
theoretical
Finding
not reported

03 Original abstract

Derenne and Baron (1999) criticized a quantitative literature review by Kollins, Newland, and Critchfield (1997) and raised several important issues with respect to the integration of single-subject data. In their criticism they argued that the quantitative integration of data across experiments conducted by Kollins et al. is a meta-analysis and, as such, is inappropriate. We reply that Kollins et al. offered behavior analysts a technique for integrating quantitative information in a way that draws from the strengths of behavior analysis. Although the quantitative technique is true to the original spirit of meta-analysis, it bears little resemblance to meta-analyses as currently conducted or defined and offers behavior analysts a potentially useful tool for comparing data from multiple sources. We also argue that other criticisms raised by Derenne and Baron were inaccurate or irrelevant to the original article. Our response highlights two main points: (a) There are meaningful quantitative techniques for examining single-subject data across studies without compromising the integrity of behavior analysis; and (b) the healthiest way to refute or question findings in any viable field of scientific inquiry is through empirical investigation.

The Behavior Analyst, 1999 · doi:10.1007/BF03391992