Supervision CSI: Frequently Asked Questions on Investigative Supervision in ABA

Source & Transformation

These answers draw in part from “Supervision CSI: Investigate, Analyze, Solve” by Nicole Stewart, MSEd, BCBA, LBA-NY/NJ (BehaviorLive), and extend it with peer-reviewed research from our library of 27,900+ ABA research articles. Clinical framing, BACB ethics code references, and cross-links below are synthesized by Behaviorist Book Club.

View the original presentation →
Questions Covered
  1. How do the seven dimensions of ABA function as diagnostic tools in supervision?
  2. What data sources should supervisors collect to assess trainee performance accurately?
  3. How should supervisors structure feedback when using a data-based supervision model?
  4. What is the science-practitioner model and why does it matter for ABA supervisors?
  5. How does investigative supervision differ from traditional mentorship-style supervision?
  6. How should supervisors handle trainees who resist corrective feedback?
  7. What role does the BACB task list play in structuring investigative supervision?
  8. How do you design a supervision agenda using the CSI model?
  9. Can the CSI model be applied to group supervision settings?
  10. What documentation practices support data-based supervision?

1. How do the seven dimensions of ABA function as diagnostic tools in supervision?

Each dimension identifies a distinct class of performance deficit. A trainee whose interventions lack social validity has an applied deficit. Inconsistent data collection points to a behavioral dimension deficit. Poor experimental reasoning under naturalistic conditions suggests an analytic weakness. Vague or incomplete procedure descriptions indicate a technological deficit. Failure to connect procedures to core ABA principles reflects a conceptually systematic gap. Slow or absent skill acquisition points to an effectiveness deficit. Failure to generalize skills across settings or clients reveals a generality problem. Naming the deficit dimensionally tells the supervisor exactly what kind of training will remediate it — for example, generality deficits require explicit programming across multiple exemplars, not re-instruction on the same task.

2. What data sources should supervisors collect to assess trainee performance accurately?

Effective supervisory assessment draws from at least three sources: direct observation during service delivery, review of permanent products (session notes, data sheets, BSPs), and structured probes during supervision meetings such as case conceptualizations or role plays. Each source captures different facets of the trainee's repertoire. Direct observation reveals real-time implementation fidelity. Permanent products expose the quality of the trainee's clinical reasoning and documentation. Verbal behavior probes assess whether the trainee can apply conceptual knowledge to novel cases — a critical competence that routine implementation data cannot capture. Relying on any single source risks missing important deficits or misidentifying their nature.

3. How should supervisors structure feedback when using a data-based supervision model?

Feedback should be anchored to specific, observable data rather than general impressions. Rather than telling a trainee "your reinforcement delivery needs improvement," the supervisor references specific observation data: "In the 30-minute session I observed on Tuesday, there were 14 opportunities for differential reinforcement and you delivered it in 6 instances — that's 43% fidelity against the target of 80%." This precision makes correction actionable. The trainee knows exactly what behavior needs to change, in what context, and by how much. Behavioral Skills Training then provides the structure for remediation: re-state the target behavior, model correct performance, have the trainee practice, and deliver immediate feedback. This cycle repeats until criterion is met.
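The fidelity figure in the example above is simple arithmetic (6 deliveries across 14 opportunities, roughly 43%), but supervisors who track it across sessions often find a small helper useful. This is an illustrative sketch, not part of the presentation; the function name and the 80% criterion are assumptions drawn from the example.

```python
# Illustrative sketch: computing implementation fidelity from observation
# data, as in the differential-reinforcement feedback example above.

def fidelity(delivered: int, opportunities: int) -> float:
    """Percentage of observed opportunities in which the target behavior occurred."""
    if opportunities == 0:
        raise ValueError("No opportunities observed; fidelity is undefined.")
    return 100 * delivered / opportunities

# Tuesday's observation: 6 deliveries across 14 opportunities.
observed = fidelity(delivered=6, opportunities=14)  # ~42.9%
meets_criterion = observed >= 80  # False -- below the 80% target, so BST remediation follows
```

Anchoring feedback to a number like this is what makes the correction actionable: the trainee can see the gap between 43% and 80% and watch it close across the BST cycle.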

4. What is the science-practitioner model and why does it matter for ABA supervisors?

The science-practitioner model holds that effective clinicians function as applied scientists — generating hypotheses about behavior, testing those hypotheses through systematic observation, and revising practice based on data. In supervision, this means treating trainee behavior as the subject of analysis rather than simply commenting on it. The supervisor observes trainee behavior, forms hypotheses about why performance deficits occur, designs interventions (training strategies), implements them, and evaluates outcomes. This model matters because it makes supervision empirically accountable. Progress is not assumed based on time spent or topics covered; it is measured through pre-post data on the trainee's target behaviors. This approach aligns directly with BACB requirements for individualized, outcome-focused supervision.

5. How does investigative supervision differ from traditional mentorship-style supervision?

Traditional mentorship supervision tends to be advice-based, reactive, and relationship-centered. The experienced clinician shares wisdom, the trainee absorbs it, and growth is expected to follow. Investigative supervision is hypothesis-driven, proactive, and data-centered. The supervisor does not wait for the trainee to report a problem — they systematically look for performance gaps before they surface as clinical failures. The relationship remains important (working alliance affects how trainees receive feedback), but relationship quality is not a substitute for systematic assessment. In investigative supervision, the supervisor asks: what do the data show about this trainee's repertoire? In mentorship supervision, the supervisor asks: what advice can I offer from my experience? The first question generates better outcomes for trainees and clients.

6. How should supervisors handle trainees who resist corrective feedback?

Resistance to feedback is itself a behavior with a function. Before attributing it to poor attitude or motivation, the supervisor should assess functionally: Is the feedback being delivered in a way that triggers aversive reactions (e.g., during group meetings, in front of peers, using evaluative language)? Does the trainee have the prerequisite skills to perform the target behavior, or is feedback being delivered on a skill the trainee cannot yet execute? Is there a history of punishment associated with supervisory feedback that is suppressing disclosure of errors? Once the function is hypothesized, the supervisor adjusts the feedback context, delivery, or content accordingly. Code 5.06 also requires supervisors to provide feedback in a timely and constructive manner — which means attending to the conditions under which feedback is received, not just its content.

7. What role does the BACB task list play in structuring investigative supervision?

The task list provides the operational taxonomy for supervisory assessment. Each item represents a specific behavior-analytic competency that can be observed, probed, and measured. By organizing supervisory data collection around task list items rather than informal impressions, supervisors create a comprehensive, defensible record of trainee development. Gaps become visible early — if a trainee has never been assessed on functional analysis methodology, that is a supervisory data gap that needs to be addressed before the trainee sits for their exam or practices independently. The task list also gives trainees a clear roadmap: they know what competencies are being assessed, how those assessments will be conducted, and what criterion-level performance looks like. This transparency reduces anxiety and increases trainee buy-in to the supervisory process.

8. How do you design a supervision agenda using the CSI model?

The agenda should be built from data collected since the last meeting, not from a predetermined topic list. Before each supervision session, the supervisor reviews recent observation notes, session documentation, and any flags from the trainee's caseload. These data generate the agenda: which skill gaps need remediation, which recently trained skills need fidelity checks, which cases need clinical decision-making consultation. Each agenda item should include a specific trainee behavior target, the data source that generated it, the planned training activity (review, modeling, rehearsal, discussion), and a criterion for success. This structure ensures that each supervision hour produces measurable progress toward specific competency targets rather than general professional development.
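The four-part agenda item described above (behavior target, data source, training activity, success criterion) can be sketched as a simple record. This is a hypothetical illustration, not a structure prescribed by the CSI model; all field names and example values are assumptions.

```python
# Hypothetical sketch of the four-part supervision agenda item described
# above. Field names and example values are illustrative only.
from dataclasses import dataclass

@dataclass
class AgendaItem:
    behavior_target: str    # specific trainee behavior to address
    data_source: str        # observation, permanent product, or probe that generated it
    training_activity: str  # review, modeling, rehearsal, or discussion
    criterion: str          # measurable definition of success

item = AgendaItem(
    behavior_target="Differential reinforcement delivery during sessions",
    data_source="Direct observation, Tuesday session (43% fidelity)",
    training_activity="Model correct delivery, then rehearsal with immediate feedback",
    criterion="At least 80% fidelity across two consecutive observed sessions",
)
```

Writing every agenda item in this shape forces each supervision hour to close the loop: a target, the evidence behind it, the training planned, and the data that will show whether it worked.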

9. Can the CSI model be applied to group supervision settings?

Yes, with modifications. Group supervision allows for peer observation, shared case conceptualizations, and group role plays that expose multiple trainees' reasoning simultaneously — all of which generate diagnostic data. The supervisor can present a case scenario and ask each trainee to independently write a functional hypothesis or select an intervention, then use the variation in responses to identify who has solid conceptual understanding and who needs additional work. The limitation is individualization: group supervision data must be supplemented with individual observation and one-on-one probes to accurately assess each trainee's specific competency profile. BACB requirements also specify that individual supervision contacts cannot be fully replaced by group formats, so the CSI model should span both contexts.

10. What documentation practices support data-based supervision?

Supervisors should maintain three types of records: observation logs with specific behavioral data (not just impressionistic notes), competency tracking matrices that map each trainee's assessed skills against the BACB task list, and meeting notes that record what was discussed, what training activities were completed, what criterion was set, and what data will be collected before the next meeting. These records serve multiple purposes. They fulfill BACB documentation requirements for supervision. They provide the supervisor with running data on trainee progress. And they give trainees explicit feedback on their development trajectory. When trainees can see their own competency data over time, self-monitoring improves and they become more active partners in identifying their own skill gaps.
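The competency tracking matrix described above — each trainee's assessed skills mapped against task list items — can be kept as a simple table. The sketch below is illustrative only: the item codes, statuses, and helper function are assumptions, not a format the BACB or the presentation prescribes.

```python
# Illustrative sketch of a competency tracking matrix: task list items
# mapped to each trainee's assessment status. Item codes, statuses, and
# dates are hypothetical examples, not official BACB content.
matrix = {
    "Trainee A": {
        "Conduct a functional analysis": "met criterion (90%, observed twice)",
        "Use differential reinforcement procedures": "in progress (43% fidelity)",
        "Design a generalization plan": "not yet assessed",
    },
}

def unassessed(trainee: str) -> list[str]:
    """Task list items with no supervisory data yet -- the data gaps to schedule."""
    return [item for item, status in matrix[trainee].items()
            if status == "not yet assessed"]

gaps = unassessed("Trainee A")  # items to assess before independent practice
```

A query like `unassessed()` is what makes gaps visible early: any item with no data is flagged for assessment before the trainee sits for the exam or practices independently.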

FREE CEUs

Get CEUs on This Topic — Free

The ABA Clubhouse has 60+ on-demand CEUs including ethics, supervision, and clinical topics like this one. Plus a new live CEU every Wednesday.

60+ on-demand CEUs (ethics, supervision, general)
New live CEU every Wednesday
Community of 500+ BCBAs
100% free to join
Join The ABA Clubhouse — Free →

Earn CEU Credit on This Topic

Ready to go deeper? This course covers this topic with structured learning objectives and CEU credit.

Supervision CSI: Investigate, Analyze, Solve — Nicole Stewart · 1.5 BACB Supervision CEUs · $15

Take This Course →
📚 Browse All 60+ Free CEUs — ethics, supervision & clinical topics in The ABA Clubhouse

Research: Explore the Evidence

We extended these answers with research from our library — dig into the peer-reviewed studies behind the topic, in plain-English summaries written for BCBAs.

Social Cognition and Coherence Testing

280 research articles with practitioner takeaways

View Research →

Measurement and Evidence Quality

279 research articles with practitioner takeaways

View Research →

Symptom Screening and Profile Matching

258 research articles with practitioner takeaways

View Research →

Related Topics

CEU Course: Supervision CSI: Investigate, Analyze, Solve

1.5 BACB Supervision CEUs · $15 · BehaviorLive

Guide: Supervision CSI: Investigate, Analyze, Solve — What Every BCBA Needs to Know

Research-backed educational guide with practice recommendations

Decision Guide: Comparing Approaches

Side-by-side comparison with clinical decision framework

CEU Buddy

No scramble. No surprises.

You earn CEUs from a dozen different places. Upload any certificate — from here, your employer, conferences, wherever — and always know exactly where you stand. Learning, Ethics, Supervision, all handled.

Upload a certificate, everything else is automatic
Works with any ACE provider
$7/mo to protect $1,000+ in earned CEUs
Try It Free for 30 Days →

No credit card required. Cancel anytime.

Clinical Disclaimer

All behavior-analytic intervention is individualized. The information on this page is for educational purposes and does not constitute clinical advice. Treatment decisions should be informed by the best available published research and individualized assessment, and made with the informed consent of the client or their legal guardian. Behavior analysts are responsible for practicing within the boundaries of their competence and adhering to the BACB Ethics Code for Behavior Analysts.