These answers draw in part from “Quality FBAs in Schools: Practical Steps for Understanding and Supporting Student Behaviors” by Kristina Friedrich, M.Ed., BCBA, LBA, CTP (BehaviorLive), and extend it with peer-reviewed research from our library of 27,900+ ABA research articles. Clinical framing, BACB ethics code references, and cross-links below are synthesized by Behaviorist Book Club.
View the original presentation →

A functional behavior assessment is a broad term that encompasses all methods used to identify the function of behavior, including interviews, direct observation, record review, and hypothesis development. A functional analysis is a specific experimental procedure in which environmental conditions are systematically manipulated to test hypotheses about behavioral function. In school settings, full functional analyses are rarely conducted due to practical constraints including time, the need to create test conditions that may involve evoking challenging behavior, and the complexity of the classroom environment. Most school-based assessments rely on descriptive FBA methods, which use indirect and direct observation data to develop hypotheses without experimental manipulation.
A comprehensive school-based FBA involves a multidisciplinary team. The behavior analyst or school psychologist typically leads the assessment process and is responsible for synthesizing data and developing the functional hypothesis. Classroom teachers provide critical information about the behavior's occurrence across the school day and participate in developing the BIP. Parents or guardians contribute information about the student's behavior at home and across settings. The student, when developmentally appropriate, provides their own perspective. Administrators ensure that the process meets legal requirements and that resources are available for implementation. Related service providers such as speech-language pathologists or occupational therapists contribute when their areas of expertise are relevant.
A functional hypothesis is supported when data from multiple sources converge on the same function. If teacher interviews, direct observations, and ABC data all suggest that the behavior occurs primarily during academic demands and is followed by removal from the task, the escape hypothesis is well-supported. If different data sources suggest different functions, additional data collection is needed. You can also test your hypothesis through a brief function-based intervention: if an intervention designed for the hypothesized function produces improvement, this provides confirmatory evidence. If it does not, the hypothesis should be revisited. No hypothesis is ever certain without experimental verification, but a well-supported hypothesis based on comprehensive data collection is sufficient for most school-based intervention planning.
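The convergence check described above can be sketched in code. This is a minimal illustration, not a published standard: the function codes (`escape`, `attention`, etc.) and the 80% agreement threshold are assumptions a team would set for itself.

```python
from collections import Counter

def dominant_function(abc_records, threshold=0.8):
    """Return the most frequent hypothesized function across a set of
    coded ABC (antecedent-behavior-consequence) records, the share of
    records supporting it, and whether that share meets the team's
    convergence threshold. The coding scheme and 0.8 threshold are
    illustrative assumptions, not a published standard.
    """
    counts = Counter(r["function"] for r in abc_records)
    function, n = counts.most_common(1)[0]
    share = n / len(abc_records)
    return function, share, share >= threshold

# Hypothetical records pooled from teacher interview codes and direct observation:
records = (
    [{"function": "escape"}] * 9       # demand presented, task removed after behavior
    + [{"function": "attention"}] * 1  # adult attention followed behavior
)
func, share, converges = dominant_function(records)
print(func, round(share, 2), converges)  # escape 0.9 True
```

If the dominant function falls below the threshold, that is the signal described above to collect more data rather than to commit to a hypothesis.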
The most common mistakes include defining the target behavior too broadly or subjectively, relying on a single data source rather than triangulating across methods, assuming the function based on the behavior's form rather than its environmental context, developing BIPs that focus solely on consequences without teaching replacement behaviors, and failing to assess implementation fidelity after the BIP is in place. Another frequent error is conducting the FBA as a one-time event rather than an ongoing process. If the initial intervention is not effective, the team should revisit the assessment rather than simply adding more consequences.
This is a common and important finding. When the FBA reveals that challenging behavior is escape-maintained and the academic demands exceed the student's current skill level, the most effective intervention involves modifying the academic demands to match the student's instructional level while simultaneously building the skills needed to access the general curriculum. This may require collaboration with the special education team to adjust the student's Individualized Education Program. Teaching the student to request help or request a modified task provides a replacement behavior while the academic skill gap is being addressed. Ignoring the skill deficit and focusing only on the behavioral response to it is unlikely to produce lasting improvement.
Write the BIP in clear, specific, action-oriented language that tells the teacher exactly what to do, what to say, and when. Avoid behavioral jargon. Instead of writing "implement DRA on a VR3 schedule," write "when the student raises their hand and waits, respond within 30 seconds with specific praise and help with the task." Limit the number of strategies to what the teacher can reasonably manage. Design interventions that fit within existing classroom routines rather than requiring entirely new procedures. Provide hands-on training and modeling before expecting implementation. Check in regularly to troubleshoot barriers and provide positive feedback. The most technically elegant BIP is worthless if it sits in a binder because the teacher could not understand or implement it.
Students should be involved to the greatest extent that their developmental level and communication skills allow. For verbal students, this might include an interview where the student shares their perspective on what triggers their behavior, what they wish were different, and what supports would help. For younger or less verbal students, observing their choices and preferences provides indirect information about their perspective. Including the student affirms their dignity, provides valuable assessment data that adults may not have, and increases the likelihood that the student will engage with the resulting BIP. The student's perspective should be documented as part of the assessment record and considered during intervention planning.
The BIP should be reviewed at predetermined intervals based on the severity of the behavior and the pace of expected progress. For severe or safety-related behaviors, weekly data review is appropriate, with formal team meetings at least monthly. For less severe behaviors, biweekly data review with team meetings every four to six weeks is typically sufficient. Decision rules should be established in advance: if the target behavior has not decreased by a specified amount within a specified timeframe, the team reconvenes. The BIP should be formally revised whenever data indicate that the current plan is not effective, when significant changes occur in the student's environment or circumstances, or at least annually as part of the IEP review process.
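A predetermined decision rule like the one above can be written down unambiguously. The sketch below is illustrative only: the 30% reduction target and four-week window are placeholder values, since real criteria are set in advance by the team.

```python
def needs_review(weekly_counts, baseline_mean, target_reduction=0.3, window=4):
    """Flag the BIP for a team review if the mean of the most recent
    `window` weeks of incident counts has not dropped by at least
    `target_reduction` (a proportion) relative to the baseline mean.
    The 30% / 4-week values are illustrative placeholders.
    """
    recent = weekly_counts[-window:]
    recent_mean = sum(recent) / len(recent)
    return recent_mean > baseline_mean * (1 - target_reduction)

# Baseline of 10 incidents/week; first student is on track, second is not:
print(needs_review([8, 7, 6, 5], baseline_mean=10))   # False (on track)
print(needs_review([9, 10, 9, 9], baseline_mean=10))  # True (reconvene the team)
```

Writing the rule this explicitly removes ambiguity about when "not effective" triggers a revision.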
When initial data do not reveal a clear pattern, several approaches can help. First, ensure that the behavior is operationally defined precisely enough that all observers are recording the same behavior. Inconsistent data sometimes reflect inconsistent observation rather than genuinely variable behavior. Second, collect more data across a wider range of conditions. Brief observations may miss important patterns. Third, consider whether the behavior may serve multiple functions across different contexts, which requires the BIP to address each function. Fourth, look for setting events or motivating operations that may be modulating the relationship between antecedents, behavior, and consequences. Factors like sleep quality, medication changes, or home stressors can make the same antecedent more or less likely to evoke the behavior on different days.
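One concrete way to look for the patterns described above is to compare how often the behavior occurs under each coded antecedent condition. This sketch assumes a simple interval-coded data set; the antecedent labels are hypothetical.

```python
from collections import defaultdict

def behavior_rate_by_antecedent(abc_records):
    """For each coded antecedent condition, compute the proportion of
    recorded intervals in which the target behavior occurred. A large
    difference between conditions suggests a pattern; near-equal rates
    suggest the operational definition or the coding may need tightening.
    Antecedent codes here are illustrative assumptions.
    """
    totals = defaultdict(int)
    hits = defaultdict(int)
    for r in abc_records:
        totals[r["antecedent"]] += 1
        if r["behavior_occurred"]:
            hits[r["antecedent"]] += 1
    return {a: hits[a] / totals[a] for a in totals}

records = (
    [{"antecedent": "demand", "behavior_occurred": True}] * 6
    + [{"antecedent": "demand", "behavior_occurred": False}] * 4
    + [{"antecedent": "free_play", "behavior_occurred": True}] * 1
    + [{"antecedent": "free_play", "behavior_occurred": False}] * 9
)
rates = behavior_rate_by_antecedent(records)
print(rates["demand"], rates["free_play"])  # 0.6 0.1
```

If the rates come out roughly equal across all conditions, that is consistent with the possibilities listed above: inconsistent observation, an unmeasured setting event, or multiple functions.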
Equity concerns should be actively considered throughout the FBA process. During the referral stage, examine whether the referral reflects a genuine behavioral concern or whether it may be influenced by cultural bias in behavior expectations. During assessment, consider whether the student's cultural background may affect the interpretation of their behavior. During BIP development, ensure that interventions are culturally responsive and do not require the student to suppress culturally normative behavior. Advocate for FBAs to be used proactively as a supportive tool rather than only as a response to disciplinary actions. Track referral and outcome data disaggregated by race, ethnicity, and disability status to identify and address systemic patterns.
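Disaggregated tracking, the last step above, reduces to a simple per-group rate computation. The group labels and the referrals-per-enrolled-student metric below are illustrative assumptions; teams may also disaggregate suspensions, BIP outcomes, or assessment timelines.

```python
from collections import Counter

def referral_rates(referrals, enrollment):
    """FBA referral rate per enrolled student, disaggregated by group.
    `referrals` is a list of dicts with a 'group' key; `enrollment`
    maps each group label to its enrolled-student count. Labels and
    metric are illustrative, not a mandated reporting format.
    """
    counts = Counter(r["group"] for r in referrals)
    return {g: counts[g] / enrollment[g] for g in enrollment}

referrals = [{"group": "A"}] * 4 + [{"group": "B"}] * 2
print(referral_rates(referrals, {"A": 40, "B": 40}))  # {'A': 0.1, 'B': 0.05}
```

A twofold difference in rates between groups of equal size, as in this toy example, is the kind of systemic pattern the team would then examine.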
The ABA Clubhouse has 60+ on-demand CEUs including ethics, supervision, and clinical topics like this one. Plus a new live CEU every Wednesday.
Ready to go deeper? This course covers this topic with structured learning objectives and CEU credit.
Quality FBAs in Schools: Practical Steps for Understanding and Supporting Student Behaviors — Kristina Friedrich · 1 BACB Ethics CEU · $10
Take This Course →

We extended these answers with research from our library — dig into the peer-reviewed studies behind the topic, in plain-English summaries written for BCBAs.
You earn CEUs from a dozen different places. Upload any certificate — from here, your employer, conferences, wherever — and always know exactly where you stand. Learning, Ethics, Supervision, all handled.
All behavior-analytic intervention is individualized. The information on this page is for educational purposes and does not constitute clinical advice. Treatment decisions should be informed by the best available published research and individualized assessment, and made with the informed consent of the client or their legal guardian. Behavior analysts are responsible for practicing within the boundaries of their competence and adhering to the BACB Ethics Code for Behavior Analysts.