Building Sustainable Behavior Plans in Schools: Treatment Integrity, Staff Training, and Systems-Level Support

Source & Transformation

This guide draws in part from “Behavior Plans that Stick: Considerations for the School Setting” by Kristina Friedrich, M.Ed., BCBA, LBA, CTP (BehaviorLive), and extends it with peer-reviewed research from our library of 27,900+ ABA research articles. Citations, clinical framing, and cross-links below are synthesized by Behaviorist Book Club.

View the original presentation →
In This Guide
  1. Overview & Clinical Significance
  2. Background & Context
  3. Clinical Implications
  4. Ethical Considerations
  5. Assessment & Decision-Making
  6. What This Means for Your Practice

Overview & Clinical Significance

Every school-based behavior analyst has had the experience: a carefully designed behavior intervention plan sits in a binder on a teacher's shelf, partially implemented on good days and ignored on stressful ones. The plan is sound. The functional assessment was thorough. The replacement behavior is appropriate. Yet the student's behavior has not changed because the plan was never implemented with enough consistency to allow the behavioral contingencies to take effect.

Kristina Friedrich's training on behavior plans that stick addresses this implementation gap head-on. Treatment integrity, the degree to which an intervention is implemented as designed, is the mediating variable between a well-designed behavior plan and improved student outcomes. Without adequate treatment integrity, even the most evidence-based intervention cannot produce its intended effects. When outcomes are poor and integrity data are absent, it becomes impossible to determine whether the intervention failed or whether the intervention was never truly tested.

In school settings, treatment integrity faces unique threats. The implementing agents are typically teachers and paraprofessionals who have limited training in behavioral principles, competing demands for their attention, large numbers of students, rigid schedules, and minimal ongoing support from the behavior analyst who designed the plan. The behavior analyst often functions as a consultant who visits periodically rather than a direct implementer who is present throughout the day.

This creates a fundamental challenge: the person who understands the behavioral principles is not the person carrying out the procedures, and the person carrying out the procedures may not fully understand why each component matters. Friedrich's training addresses this gap by focusing on practical strategies for training school staff, building systems that support consistent implementation, and identifying and removing barriers to fidelity.

The training's emphasis on collaboration with educators and paraprofessionals reflects an important recognition: treatment integrity is not solely a function of the implementer's knowledge or motivation. It is a function of the system in which the implementer operates. When systems support implementation through clear expectations, adequate training, ongoing feedback, and reasonable demands, fidelity improves. When systems fail to provide these supports, even the most dedicated implementer will struggle.

Your CEUs are scattered everywhere. Between what you earn here, your employer, conferences, and other providers — it adds up fast. Upload any certificate and just know where you stand.
Try Free for 30 Days

Background & Context

The concept of treatment integrity has been recognized in behavior analytic literature for decades, yet its measurement and promotion remain inconsistent in practice. Published research repeatedly demonstrates the correlation between treatment integrity and client outcomes, and also documents how rarely treatment integrity data are collected in applied settings.

In school-based ABA, the treatment integrity problem is structural. The consultation model that dominates school-based practice means that behavior analysts design interventions for others to implement. This introduces a translation layer between the plan and its execution. The behavior analyst understands the contingencies and the rationale for each component. The teacher receives a written plan and perhaps a brief training session. The paraprofessional may receive instructions secondhand from the teacher. By the time the plan reaches the person interacting with the student most frequently, critical details may have been lost or misunderstood.

Common barriers to treatment integrity in schools include insufficient training time, with in-service sessions often lasting 30 minutes or less for plans that require significant behavior change from the implementer. Competing priorities pull attention away from behavioral procedures, as academic instruction, safety management, and administrative demands all compete for the same finite attention. Staff turnover creates a revolving door of implementers who each need to be trained. Physical and resource constraints may make certain procedures impractical in the classroom. And the absence of performance feedback means that implementers receive no data on how well they are implementing the plan.

The research on training school staff in behavioral procedures consistently identifies several effective components: clear written descriptions of procedures, modeling of the target behaviors by the trainer, opportunities for rehearsal and practice, performance feedback with both positive and corrective components, and ongoing support after the initial training session. Brief, focused training combined with in-vivo coaching produces better outcomes than lengthy didactic presentations.

Friedrich's training builds on this research base by providing practical systems that school-based behavior analysts can implement within the real constraints of their settings. The focus is not just on training individual implementers but on building organizational systems that sustain implementation over time, survive staff turnover, and self-correct when drift occurs.

Clinical Implications

When treatment integrity is low, the clinical implications cascade through every aspect of the student's programming. The most immediate consequence is that the intervention does not produce the expected outcomes. Reinforcement that is delivered inconsistently or non-contingently loses its effectiveness. Antecedent modifications that are applied sometimes but not others fail to establish the stimulus control necessary for the student to discriminate appropriate from inappropriate conditions. Planned ignoring that breaks down during escalation inadvertently reinforces more intense behavior.

Beyond the direct behavioral effects, poor treatment integrity corrupts the data that drive clinical decisions. If a behavior analyst reviews outcome data showing no improvement and does not have integrity data to contextualize those outcomes, they may conclude that the intervention is ineffective and change the plan. The new plan is implemented with similarly poor integrity, also fails to produce results, and the cycle continues. The student accumulates a history of "failed" interventions when in fact no intervention was ever adequately tested. This is both a clinical failure and an ethical one.

Treatment integrity also affects the behavior of the implementers themselves. When a teacher implements a plan inconsistently, they experience inconsistent results. Sometimes the student's behavior improves (when integrity was high) and sometimes it worsens (when integrity was low). This intermittent schedule of reinforcement for the teacher's plan-following behavior is precisely the kind of schedule that sustains inconsistent implementation. The teacher concludes that the plan "sometimes works and sometimes doesn't" rather than recognizing that implementation itself was the variable.

For the student, inconsistent implementation creates confusion about the contingencies in effect. A student who is sometimes reinforced for using a replacement behavior and sometimes reinforced for using problem behavior (because the planned ignoring procedure was not followed) is experiencing a concurrent schedule that may maintain problem behavior indefinitely.

The school setting adds layers of complexity because behavior plans often require coordination across multiple adults. A student who moves between a general education classroom, a resource room, specials, and lunch encounters different implementers in each setting. If the plan is implemented well in one setting but poorly in others, the student must discriminate where the contingencies are in effect. For students who already struggle with generalization, this inconsistency is particularly detrimental.

Measuring treatment integrity does not require complex procedures. Direct observation using a checklist of plan components, self-report measures, and permanent product measures can all provide useful data. The key is that some form of integrity measurement occurs regularly and that the data inform clinical decisions.

FREE CEUs

Get CEUs on This Topic — Free

The ABA Clubhouse has 60+ on-demand CEUs including ethics, supervision, and clinical topics like this one. Plus a new live CEU every Wednesday.

60+ on-demand CEUs (ethics, supervision, general)
New live CEU every Wednesday
Community of 500+ BCBAs
100% free to join
Join The ABA Clubhouse — Free →

Ethical Considerations

Treatment integrity sits at the intersection of multiple ethical obligations in the BACB Ethics Code for Behavior Analysts.

Code 2.01 (Providing Effective Treatment) is compromised whenever an intervention is not implemented with sufficient integrity to produce its intended effects. A behavior analyst who designs evidence-based plans but does not establish systems for ensuring implementation fidelity is not, in practice, providing effective treatment. The ethical obligation extends beyond plan design to encompass the implementation support systems necessary for the plan to function.

Code 2.14 (Selecting, Designing, and Implementing Behavior-Change Interventions) requires behavior analysts to select interventions based on the best available evidence and to implement them effectively. Implementation includes ensuring that those who carry out the plan are adequately trained and supported. A plan that is evidence-based in design but poorly implemented in practice does not meet this standard.

Code 2.13 (Selecting, Designing, and Implementing Assessments) applies to treatment integrity measurement itself. Assessment of treatment integrity should be part of the behavior analyst's standard assessment protocol. Without integrity data, clinical decisions about whether to modify or continue an intervention are made on incomplete information.

Code 4.05 (Using Evidence-Based Supervision and Training Practices) is directly relevant when behavior analysts are responsible for training school staff to implement behavior plans. The training methods used should themselves be evidence-based: modeling, rehearsal, performance feedback, and ongoing coaching. Handing a teacher a written plan with a brief verbal explanation does not constitute evidence-based training.

Code 2.09 (Involving Clients and Stakeholders) supports the collaborative approach that Friedrich's training emphasizes. Teachers and paraprofessionals are key stakeholders in school-based behavior programming. Their input on what is feasible, what barriers they anticipate, and what support they need should inform the plan design. Involvement is not just ethical window dressing; it improves the likelihood that the plan will be implemented because stakeholders who helped design the plan have greater understanding of and investment in its success.

There is also an ethical dimension to the behavior analyst's response when integrity data reveal poor implementation. The appropriate response is to diagnose and address the barriers to implementation, not to blame the implementer. If a teacher is not following a behavior plan, the first question should be whether the plan was feasible, whether the training was adequate, whether the feedback and support systems are in place, and whether the organizational context supports implementation. Attributing poor integrity to implementer motivation or character without assessing environmental variables violates the fundamental behavioral principle that behavior is a function of its environment.

Assessment & Decision-Making

Assessing and promoting treatment integrity requires a multi-level approach that addresses individual implementer skills, dyadic interactions between the behavior analyst and implementer, and organizational systems.

At the individual level, assessment begins with a skills assessment of the implementer. Can the teacher or paraprofessional accurately describe the plan's components? Can they demonstrate the procedures in a role-play? Do they understand the rationale for each component? Identifying skill deficits before implementation begins allows for targeted training that addresses gaps rather than delivering generic information.

Direct observation is the most valid method for assessing treatment integrity. The behavior analyst observes the implementer during a typical session or instructional period and records whether each plan component was delivered as specified. A treatment integrity checklist derived from the behavior plan provides a structured observation tool. Each component of the plan is listed, and the observer records whether it was implemented correctly, partially implemented, or not implemented.
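To make the checklist approach concrete, here is a minimal scoring sketch. The component names, the partial-credit weighting, and the helper name `integrity_percentage` are all illustrative assumptions for this example, not part of any published protocol; a real checklist would list the actual components of the student's plan.

```python
# Hypothetical sketch: scoring a treatment integrity checklist.
# Credit values are an assumption (full = 1.0, partial = 0.5, none = 0.0);
# some teams instead count only fully implemented components.

FULL, PARTIAL, NONE = 1.0, 0.5, 0.0  # credit per observed component

def integrity_percentage(observations: dict) -> float:
    """Percent of plan components implemented, with partial credit."""
    if not observations:
        raise ValueError("checklist is empty")
    return 100 * sum(observations.values()) / len(observations)

# One observed session (example components, not a real plan)
session = {
    "precorrection delivered at transition": FULL,
    "reinforcer delivered within 5 s of replacement behavior": PARTIAL,
    "planned ignoring maintained during escalation": NONE,
    "break card available and within reach": FULL,
}

print(f"Integrity: {integrity_percentage(session):.0f}%")  # Integrity: 62%
```

Graphing these session-by-session percentages alongside the student's outcome data makes the integrity-outcome relationship visible at each consultation visit.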

Self-report measures, while less valid than direct observation, provide useful supplementary data and can be collected more frequently. A brief daily checklist that the implementer completes, indicating which plan components they used and any difficulties encountered, creates a data stream that the behavior analyst can review at each consultation visit.

Permanent product measures offer another assessment option. If the plan requires data collection by the implementer, the completeness and accuracy of the data sheets serve as a proxy for implementation quality. Consistent data collection does not guarantee that all plan components were implemented correctly, but absent or incomplete data sheets often correlate with poor implementation overall.

Decision-making based on integrity data follows a structured protocol. When integrity is high and outcomes are positive, maintain the current plan and continue monitoring. When integrity is high but outcomes are not improving, the plan itself may need modification because the intervention has been adequately tested and found insufficient. When integrity is low and outcomes are poor, prioritize improving integrity before modifying the plan. When integrity is low but outcomes are somehow positive, investigate whether the student's improvement is attributable to other variables.
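The four-cell protocol above can be sketched as a simple lookup. The 80% integrity threshold and the function name `next_step` are assumptions for illustration only; the cutoff separating "high" from "low" integrity should be set clinically for each plan.

```python
# Illustrative sketch of the integrity-by-outcome decision matrix.
# The 80% threshold is an assumption, not a published standard.

def next_step(integrity_pct: float, outcomes_improving: bool,
              threshold: float = 80.0) -> str:
    """Map one integrity/outcome cell to the recommended action."""
    high_integrity = integrity_pct >= threshold
    if high_integrity and outcomes_improving:
        return "maintain plan; continue monitoring"
    if high_integrity and not outcomes_improving:
        return "modify the plan; it was adequately tested"
    if not high_integrity and not outcomes_improving:
        return "improve integrity before changing the plan"
    return "investigate other variables behind the improvement"

print(next_step(92, True))   # high integrity, improving outcomes
print(next_step(55, False))  # low integrity, poor outcomes
```

The point of the sketch is the ordering of questions: integrity is checked before the plan itself is ever blamed.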

The most common decision error in school-based practice is modifying an intervention when integrity has not been assessed. This leads to plan churn where the student cycles through multiple interventions, none of which were given an adequate test. Building integrity assessment into the decision-making protocol prevents this error and produces more efficient, effective clinical practice.

What This Means for Your Practice

If you are a school-based behavior analyst, treatment integrity measurement and promotion should be a non-negotiable component of your practice, not an occasional add-on.

Build treatment integrity checklists into every behavior plan you write. When you design the plan, simultaneously design the integrity tool. Each component of the plan should correspond to an observable, measurable item on the checklist. Train yourself to never present a behavior plan without the accompanying integrity monitoring tool.

Restructure your staff training approach. Replace lengthy in-service presentations with brief, focused training sessions that include modeling of each procedure, opportunity for the implementer to practice with feedback, and a scheduled follow-up observation within one week. The follow-up observation is critical because it closes the loop between training and implementation, providing the performance feedback that shapes accurate implementation.

Develop tiered support systems that match the level of support to the implementer's needs. New staff or staff implementing complex plans need more frequent observation and feedback. Experienced staff implementing familiar plans may need only periodic integrity checks. This tiered approach makes the most efficient use of your consultation time.

When integrity data reveal problems, investigate before intervening. Observe the implementation context and talk to the implementer about barriers. Is the issue a skill deficit (they do not know how to implement a component), a performance deficit (they know how but something prevents them from doing it), or a contextual barrier (the plan does not fit the environment)? Each cause requires a different response: additional training for skill deficits, motivational or environmental modifications for performance deficits, and plan revision for contextual barriers.

Advocate within your organization for the time and resources needed to support treatment integrity. If your caseload prevents you from observing implementation and providing feedback, that is an organizational barrier that affects every student on your caseload. Document the relationship between your consultation time, integrity levels, and student outcomes to make the case for adequate resource allocation.

Earn CEU Credit on This Topic

Ready to go deeper? This course covers this topic in detail with structured learning objectives and CEU credit.

Behavior Plans that Stick: Considerations for the School Setting — Kristina Friedrich · 1 BACB Ethics CEU · $15

Take This Course →

Research: Explore the Evidence

We extended this guide with research from our library — dig into the peer-reviewed studies behind the topic, in plain-English summaries written for BCBAs.

Social Cognition and Coherence Testing

280 research articles with practitioner takeaways

View Research →

Measurement and Evidence Quality

279 research articles with practitioner takeaways

View Research →

Symptom Screening and Profile Matching

258 research articles with practitioner takeaways

View Research →
CEU Buddy

No scramble. No surprises.

You earn CEUs from a dozen different places. Upload any certificate — from here, your employer, conferences, wherever — and always know exactly where you stand. Learning, Ethics, Supervision, all handled.

Upload a certificate, everything else is automatic
Works with any ACE provider
$7/mo to protect $1,000+ in earned CEUs
Try It Free for 30 Days →

No credit card required. Cancel anytime.

Clinical Disclaimer

All behavior-analytic intervention is individualized. The information on this page is for educational purposes and does not constitute clinical advice. Treatment decisions should be informed by the best available published research and individualized assessment, and made with the informed consent of the client or their legal guardian. Behavior analysts are responsible for practicing within the boundaries of their competence and adhering to the BACB Ethics Code for Behavior Analysts.