Assessment & Research

The good, the bad, and the aggregate.

Critchfield et al. (2000) · The Behavior Analyst, 2000
★ The Verdict

Pooling single-subject findings is not selling out—it is how you spot the behavior principles that hold for most clients.

✓ Read this if you're a BCBA who writes literature reviews, grant proposals, or practice guidelines.
✗ Skip if you're a clinician looking for a new intervention protocol.

01Research in Context

01

What this study did

Critchfield et al. (2000) wrote a think-piece, not an experiment. They asked: is it okay for behavior analysts to add up results from many single-subject studies?

The authors say yes. They argue that counting and averaging findings is the only way to see which behavior laws hold across people, places, and times.

02

What they found

The paper reports no new data. Instead it offers a road map: gather studies, extract numbers, look for patterns, and still keep single-subject values in view.

They claim this blend keeps the soul of behavior analysis while borrowing the muscle of meta-analysis.

03

How this fits with other research

Reid et al. (1999) made the same point one year earlier. Both papers defend aggregating single-subject data, but the 1999 team warns: call it "quantitative integration," not meta-analysis. Critchfield et al. (2000) widen the plea to any kind of literature review.

Slaton et al. (2018) later showed the idea in action. Their review pooled 80 functional-analysis studies that used synthesized conditions. The paper shows behavior analysts already aggregate data when it helps.

Halstead (2002) went further and counted every citation in ABA journals. That bibliometric study is a live example of the 2000 paper’s call: add up the evidence to see the field’s true shape.

04

Why it matters

If you stick to one-client graphs alone, you may miss big trends. Try a quick count in your next literature search: tally how often a procedure works across the last ten studies. That tiny step is quantitative synthesis, and this paper says it is allowed.
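The "quick count" suggested above can be sketched in a few lines. This is a hypothetical tally with made-up outcomes, not data from the paper; the numbers are placeholders for whatever your own literature search turns up.

```python
# Hypothetical tally: did the procedure work in each of the last
# ten studies reviewed? (1 = effect shown, 0 = not shown)
outcomes = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]

# Proportion of studies showing the effect: the simplest form
# of quantitative synthesis the paper defends.
proportion = sum(outcomes) / len(outcomes)
print(f"Procedure effective in {sum(outcomes)}/{len(outcomes)} studies "
      f"({proportion:.0%})")
```

Even this crude proportion is an aggregation decision: you are pooling across nonidentical studies, which is exactly the move the paper argues is defensible.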

→ Action — try this Monday

After you graph this week’s client data, open a spreadsheet, list the last five similar cases, and note whether the same contingency produced the same change.
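If a spreadsheet feels like overkill, the same Monday exercise fits in a short script. The case labels and rates below are invented for illustration; "improved" here simply means the treatment-phase rate fell below baseline.

```python
# Hypothetical case log: same contingency applied across five similar cases.
# Each entry: (case label, baseline rate, treatment rate). Illustrative numbers.
cases = [
    ("Case A", 12.0, 3.5),
    ("Case B", 9.5, 4.0),
    ("Case C", 15.0, 2.0),
    ("Case D", 8.0, 8.5),
    ("Case E", 11.0, 5.0),
]

# Note which cases showed the same change: a drop from baseline.
improved = [label for label, base, tx in cases if tx < base]
print(f"Same change in {len(improved)} of {len(cases)} cases: {improved}")
```

Running a check like this across your own caseload is the small-scale version of the cross-study aggregation the paper endorses.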

02At a glance

Intervention
not applicable
Design
theoretical
Finding
not reported

03Original abstract

To evaluate progress and focus goals, scientific disciplines need to identify relations that are robust across many situations. One approach is the literature review, which characterizes generality across studies. Some writers (e.g., Baron & Derenne, 2000) claim that quantitative literature reviews, but not narrative reviews, violate the methodological precepts of behavior analysis by pooling data from nonidentical studies. We argue that it is impossible to assess generality without varying the context in which relationships are studied. Properly chosen data-aggregation strategies can reveal which behavior-environment relations are general and which are procedure dependent. Within behavior analysis, reluctance to conduct quantitative reviews may reflect unsupported assumptions about the consequences of aggregating data across studies. Whether specific data-aggregation techniques help or harm a research program is an empirical issue that cannot be resolved by unstructured discussion. Some examples of how aggregation has been used in identifying behavior-environment relations are examined.

The Behavior Analyst, 2000 · doi:10.1007/BF03392005