Just How Effective is Direct Instruction?
Talk about Direct Instruction in "half of failing kids now pass" terms, not decimal effect sizes, and schools will buy in faster.
Research in Context
What this study did
Mason et al. (2021) wrote a think-piece about Direct Instruction.
They did not run new experiments. They pulled old data and showed it in a new way.
The goal was to help teachers see how big the DI effect really is.
What they found
When you show DI results in a binomial effect-size display (a simple two-by-two table comparing success rates with and without the program), about half of the kids who were failing move into the passing zone.
That picture is easier for principals and parents to grasp than a decimal called "effect size."
The authors say we should talk about DI in "kids moved to success" language, not in statistics jargon.
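To make the idea concrete, here is a minimal sketch of how a binomial effect-size display is typically built from a reported effect size, using the standard Rosenthal-and-Rubin conversions (Cohen's d to a correlation r, then success rates of 0.5 plus or minus r/2). The effect size passed in below is illustrative, not a figure taken from Mason et al.:

```python
import math

def besd(d):
    """Turn Cohen's d into a binomial effect-size display.

    Standard conversions: r = d / sqrt(d^2 + 4), then
    "success" rates of 0.5 + r/2 (treatment) vs. 0.5 - r/2 (comparison).
    """
    r = d / math.sqrt(d**2 + 4)
    return 0.5 + r / 2, 0.5 - r / 2

# Illustrative effect size (an assumption for this sketch):
treated, comparison = besd(0.6)
print(f"Passing with DI: {treated:.0%}; passing without: {comparison:.0%}")
```

The two rates always sum to 100%, which is what makes the display easy to read off a single slide: one number for "with the program," one for "without."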
How this fits with other research
Williams (1996) made a similar move for fluency-based teaching. That paper said speed plus accuracy locks skills in. Mason’s paper does the same job for DI: it repackages old numbers so schools will listen.
Sönmez et al. (2025) actually tested a tiny fluency package with three high-school students. They got fast gains in math facts. Mason would call that a micro-example of what DI does at scale.
van Schrojenstein Lantman-de Valk et al. (2006) showed you can test a reading fix in under 30 minutes. Mason’s point: once you find the best package, DI can deliver it to a whole class with the same clarity.
Why it matters
Stop quoting effect sizes in IEP meetings. Say "With DI, about half of the kids who are behind catch up." Bring the binomial picture on one slide. Principals get it, parents get it, and you walk out with approval to start the program Monday.
Make a one-slide bar graph: a red bar for "kids failing" and a blue bar for "kids passing after DI," and show it at your next staffing.
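That slide takes a few lines of matplotlib. The percentages below are placeholders to show the layout, not data from the paper; swap in the numbers from your own binomial display:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Placeholder percentages (assumptions for illustration only)
labels = ["Kids failing", "Kids passing after DI"]
values = [36, 64]

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(labels, values, color=["red", "blue"])
ax.set_ylabel("Percent of students")
ax.set_ylim(0, 100)
ax.set_title("Where the struggling kids end up")
fig.savefig("di_slide.png", dpi=150)
```

One figure, two bars, no statistics jargon on the slide.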
Original abstract
Despite overwhelming evidence in support of Direct Instruction, this research-validated curriculum has not been widely embraced by teachers or school administrators. The Direct Instruction model, developed and refined by Engelmann and colleagues over the past 50 years, has been the focus of numerous research studies, systematic reviews, and meta-analyses. Although its efficacy cannot be doubted, the significance of Direct Instruction’s impact may be misunderstood. We attempt to clarify the importance of Direct Instruction with help from the binomial effect-size display. Binomial effect-size displays allow for intuitive and informative data-based decision making by clearly conveying the real-world importance of treatment outcomes through a juxtaposition of the relative proportions of success. The limitations of analyzing effect sizes in absolute terms are discussed. Using the binomial effect-size display as a framework, we present a series of dichotomies in an attempt to answer the question: Just how effective is Direct Instruction?
Perspectives on Behavior Science, 2021 · doi:10.1007/s40614-021-00295-x