Service Delivery

The Use of Evaluation in Treatment Programs for Children with Autism

Miller (2017) · Behavior Analysis in Practice
★ The Verdict

Track two service-wide outcomes and share them each quarter to prove your ABA program works beyond single-client graphs.

✓ Read this if you're a BCBA who runs or supervises an autism clinic and needs to keep funders happy.
✗ Skip if you're an RBT who only runs 1:1 sessions and never sees the bigger data picture.

01 Research in Context

01

What this study did

Miller (2017) wrote a how-to paper, not an experiment.

He surveyed behavioral providers in California and Texas and searched the behavioral literature.

He asked: why do so few check if the whole program works?

He then built a simple checklist any clinic can use.

02

What they found

Most clinics only graph each child’s data.

They rarely step back to see if the whole service is improving.

Miller shows a fix: list your stakeholders, pick two or three big outcomes, and write a short report every three months.

03

How this fits with other research

Ruppel et al. (2021) used the exact plan. They tracked parent stress and problem behavior across kids and sent quarterly summaries to funders.

Anonymous (2023) did the same with electronic notes. Goal success rose 9.7% after they started the quarterly reports.

Eskow et al. (2015) ran a statewide check before Miller wrote the guide. Their matched wait-list design showed the power of program-level data, proving the idea works even without the new checklist.

04

Why it matters

You already take data on each client. Add one hour a month to average those graphs into two program-level numbers—like “percentage of kids who cut problem behavior by 20 %” or “average wait time from referral to first session.” Share the number with your boss and payors every quarter. This small step can keep insurance authorizations flowing and show why your clinic deserves referrals.
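The two program-level numbers above boil down to simple arithmetic over your existing client data. Here is a minimal sketch of that calculation; the client values, the 20% reduction threshold, and the function names are illustrative assumptions, not anything from Miller's paper:

```python
# Sketch: turn per-client data into two program-level quarterly numbers.
# All data below is made up for illustration.

def percent_meeting_reduction(baselines, currents, threshold=0.20):
    """Share of clients whose problem behavior dropped by >= threshold
    relative to their own baseline."""
    met = sum(
        1 for base, now in zip(baselines, currents)
        if base > 0 and (base - now) / base >= threshold
    )
    return 100 * met / len(baselines)

def average_wait_days(referral_to_start_days):
    """Average days from referral to first session."""
    return sum(referral_to_start_days) / len(referral_to_start_days)

# Example quarterly data for five clients:
baseline_rates = [10, 8, 12, 6, 9]   # problem behaviors per hour at intake
current_rates  = [7, 8, 6, 5, 9]     # same measure this quarter
waits          = [14, 30, 21, 10, 25]  # days from referral to first session

print(f"{percent_meeting_reduction(baseline_rates, current_rates):.0f}% "
      "of clients met the 20% reduction goal")
print(f"Average referral-to-start wait: {average_wait_days(waits):.1f} days")
```

With the sample numbers, two of the five clients cleared the 20% reduction bar (40%), and the average wait is 20 days. Those are exactly the kind of headline figures a quarterly stakeholder report needs.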

→ Action — try this Monday

Open your last 12 client graphs, count how many hit their problem-behavior goal, and email the percentage to your director with the subject line “Q2 Program Outcome.”

02 At a glance

Intervention: not applicable
Design: methodology paper
Population: autism spectrum disorder
Finding: not reported

03 Original abstract

Program evaluation is the use of planned activities to monitor process, outcomes, and impact of a health program or intervention. The application of program evaluation to behavioral analytic treatment programs for children with autism is a useful and necessary activity to inform practitioners and other stakeholders of the efficacy of these programs and to promote adherence to best-practice treatments. A brief survey of behavioral providers in California and Texas and search of the behavioral literature suggest that the practice of program evaluation is underutilized among providers of behavioral services. Current organizational practices primarily involve reporting on individualized consumer goals. The purpose of this paper is to provide an introduction to evaluation processes and procedures to promote the implementation of some or all of these components. Areas discussed include defining the population served and program stakeholders, describing the program and intervention, selecting evaluation goals and objectives, ethical considerations, and reporting.

Behavior Analysis in Practice, 2017 · doi:10.1007/s40617-016-0130-3