Assessment & Research

Results reporting in single case experiments and single case meta-analysis.

Vannest et al. (2018) · Research in Developmental Disabilities, 2018
★ The Verdict

Copy the authors’ sentence starters so your single-case paper can join tomorrow’s meta-analysis.

✓ Read this if: you're a BCBA who writes or reads single-case reports.
✗ Skip if: you're a clinician who only consumes group-design studies.

01 Research in Context

01

What this study did

Vannest et al. (2018) built a fill-in-the-blanks checklist for writing up single-case studies.

The list tells you exactly what to say about design, graphs, stats, and safety rules.

Goal: every paper can enter a future meta-analysis without missing pieces.

02

What they found

The checklist is ready to copy-paste into your next report.

No numbers were tested; this is a how-to guide, not an experiment.

03

How this fits with other research

Weeden et al. (2010) looked back at 14 years of self-injury FA papers and found most skipped safety details. Vannest et al. provide the sentences that could fix that gap.

Cymbal et al. (2022) counted OBM studies and saw only 1 in 4 reported procedural integrity. Again, Vannest et al. show where to drop that line.

Aydin et al. (2022) added a new effect-size tool (PCES) for meta-analysis. Vannest et al. do not pick metrics, but their clear labels let any metric plug in later.

Davis et al. (2018) warned that different effect-size formulas can flip conclusions. Vannest et al. do not settle the math fight, yet full reporting makes the fight visible, as the toy sketch below shows.
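
To make the Davis et al. point concrete, here is a minimal Python sketch comparing two standard nonoverlap effect sizes, PND (Percentage of Non-overlapping Data) and NAP (Nonoverlap of All Pairs), on the same phase data. The numbers are invented for illustration, and these are common single-case metrics, not necessarily the exact formulas Davis et al. compared:

```python
# Toy single-case data (invented for illustration): one outlier
# session inflates the baseline maximum.
baseline = [2, 3, 9, 3, 2]
treatment = [6, 7, 8, 8, 7, 8]

# PND: percent of treatment points above the highest baseline point.
pnd = 100 * sum(t > max(baseline) for t in treatment) / len(treatment)

# NAP: proportion of all baseline-treatment pairs in which the
# treatment point is higher (ties count as half).
pairs = [(b, t) for b in baseline for t in treatment]
nap = sum((t > b) + 0.5 * (t == b) for b, t in pairs) / len(pairs)

print(f"PND = {pnd:.0f}%")  # 0% -- the one baseline spike masks the change
print(f"NAP = {nap:.2f}")   # 0.80 -- a clear effect by pairwise overlap
```

Same graph, opposite verdicts: PND calls it a null result while NAP calls it a solid effect. Reporting the raw session data and naming the metric used, as the scaffold asks, is what lets a later meta-analyst catch that disagreement.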

04

Why it matters

Next time you write a single-case study, paste the scaffold into your method section. Add the missing lines on session limits, integrity checks, and visual rules. Your paper will slide into future meta-analyses and keep participants safer.

→ Action — try this Monday

Open your last single-case draft and add the scaffold headings: design, visual analysis, stats, safety.

02 At a glance

Intervention: not applicable
Design: methodology paper
Finding: not reported

03 Original abstract

Single Case Experimental Design is a discipline grounded in applied behavior analysis where the needs of individual clients and the application of scientific inquiry are fundamental tenets. These two principles remain tantamount in the conduct of research using this methodology and the expansion of the method into evidence-based practice determinations. Although recommendations for quality indicators are widespread, implementation is not. Concurrent to the rise of quality indicators is an increasing interest in analysis methodology. Visual analysis has a history of application and validity, newer forms of analysis less so. While some argue for concordance between the two, it may be the differences that are worth exploration in understanding characteristics of trend and variability in much of the published literature. Design choice and visual analysis decisions are rarely fully articulated. Statistical analyses are likewise inadequately justified or described. Recommendations for the explicit language of reporting as derived from prior meta-analysis and a current review of two leading journals provides a scaffold consistent with existing guidelines but additive in detail, exemplars, and justification. This is intended to improve reporting of results for individual studies and their potential use in future meta-analytic work.

Research in Developmental Disabilities, 2018 · doi:10.1016/j.ridd.2018.04.029