By Matt Harrington, BCBA · Behaviorist Book Club · April 2026 · 12 min read
Discrete trial teaching (DTT) remains one of the most widely used instructional methods in applied behavior analysis for individuals with autism spectrum disorder. Its structured format — clear discriminative stimulus, response opportunity, consequence — provides a teaching context with a high degree of control over learning variables, making it particularly effective for establishing new skills under conditions where incidental learning is insufficient. However, the implementation of DTT has evolved considerably from its earliest manualized forms, and a growing body of research distinguishes between conventional, protocol-driven approaches and progressive, learner-responsive approaches that make in-the-moment adjustments based on the learner's current behavior.
This session presents a direct comparison of conventional and progressive DTT frameworks, with particular attention to how in-the-moment assessment and clinical judgment function as controlling variables for the interventionist in progressive approaches. The clinical significance lies in the practical question every ABA practitioner faces: when does following a predetermined protocol produce adequate learning, when does it constrain clinical effectiveness, and how can that determination be made reliably?
The conventional versus progressive distinction is not merely academic — it has direct implications for how BCBAs design and supervise DTT programs, how they train RBTs to implement those programs, and how they interpret data when learning stalls. BCBAs who understand both approaches and the conditions under which each is most effective are better equipped to build individualized, responsive teaching systems rather than applying a single DTT model indiscriminately.
This training is positioned within the broader conversation about what evidence-based ABA practice looks like when client individuality is fully considered — a conversation that is increasingly central to the field's evolution.
DTT was formalized in the behavior analysis literature through the work of O. Ivar Lovaas and colleagues in the 1970s and 1980s, providing a systematic instructional model grounded in operant conditioning principles. The conventional DTT format established specific procedural parameters: a designated instructional SD, a fixed response requirement, a predetermined prompting hierarchy, and reinforcement delivery contingent on correct responding or prompt compliance. These parameters were initially operationalized in protocol manuals to ensure implementation consistency across interventionists.
Research over the following decades identified limitations of rigid protocol adherence. When the protocol fails to respond to the learner's current motivational state, attention, or learning history with specific prompts, it can produce rote responding, prompt dependence, and failure to generalize. Learners with variable motivational states or complex behavioral histories are particularly likely to benefit from in-the-moment responsiveness that a fixed protocol cannot provide.
Progressive DTT approaches emerged from research demonstrating that allowing interventionist behavior to be controlled by learner behavior — rather than exclusively by protocol specifications — could improve learning efficiency, reduce problem behavior during instruction, and support more natural-looking skill acquisition. These approaches incorporate real-time assessment of the learner's responding, dynamic adjustment of prompt levels and reinforcement procedures, and clinical decision-making that draws on the interventionist's knowledge of the individual learner.
The tension between standardization and individualization that conventional versus progressive DTT represents is not unique to DTT — it appears across the ABA clinical landscape wherever manualized protocols meet complex individual learners. Understanding this tension in the context of DTT provides a concrete case study applicable to broader questions of how BCBAs balance fidelity to evidence-based procedures with responsiveness to individual client needs.
The clinical implications of this comparison center on three practical questions: how to select the appropriate approach for a given learner and learning target, how to train and supervise DTT implementation that incorporates progressive elements, and how to interpret data from progressive approaches where implementation variability is inherent rather than a fidelity failure.
For learner-approach matching, the clinical decision involves assessing the learner's current reinforcer potency and variability, the history of prompt dependence or specific prompt failures with conventional approaches, the complexity of the learning target, and the learner's current behavioral engagement. Learners with highly variable motivational states, strong histories of specific prompt dependencies, or learning targets that require naturalistic contextual integration are typically better served by progressive approaches. Learners who respond efficiently to structured protocols and whose learning data reflect consistent progress may not require the additional clinical overhead of progressive implementation.
For training and supervision, progressive DTT implementation places greater demands on the interventionist's clinical judgment, requiring BCBAs to train RBTs not just in the mechanics of trial delivery but in the decision rules that govern in-the-moment adjustments. This requires more sophisticated supervision models — including live observation, video review, and structured debriefing — than protocol-only training. BCBAs must be able to articulate the decision rules clearly enough that RBTs can apply them consistently without collapsing into arbitrary moment-to-moment deviation.
For data interpretation, progressive DTT produces implementation variability that must be documented and interpreted rather than treated as error variance. Session notes should capture the specific in-the-moment decisions made — prompt level adjustments, reinforcer changes, pacing modifications — to allow data to be interpreted in context.
The ethical considerations specific to conventional versus progressive DTT implementation center on Code 2.09 (Effective Treatment) and Code 2.01 (Providing Services with Competence). Code 2.09 requires BCBAs to recommend and implement treatments that are effective for their individual clients. When conventional DTT is producing inadequate learning — demonstrated by stalling acquisition data, emerging problem behavior during sessions, or generalization failures — continuing to use it unchanged because it is the specified protocol is not consistent with the obligation to provide effective treatment.
Code 2.01 requires competence in the methods being used. Progressive DTT implementation requires clinical judgment competencies — real-time behavioral assessment, prompt adjustment decision-making, reinforcer calibration — that go beyond procedural DTT competency. BCBAs who supervise progressive DTT implementation without adequate training in these clinical judgment skills are practicing beyond their competence in that specific domain, even if they are generally competent in DTT.
Code 2.12 (Advocating for Client Access and Services) is relevant when protocol constraints imposed by organizational policies or payer requirements are preventing implementation of a more clinically appropriate progressive approach. BCBAs have an obligation to advocate for their clients' right to effective treatment, which may require making a documented clinical case for implementation flexibility within their organization's or payer's approval processes.
Code 2.17 (Delivering Behavioral Services in Applied Settings) applies to the supervision of RBTs delivering DTT: BCBAs are responsible for ensuring that the supervision model supports the level of clinical judgment required by the implementation approach. Supervising progressive DTT through spot-check observation without structured debriefing is insufficient — the clinical decision points require observation and feedback to develop correctly.
Decision-making about conventional versus progressive DTT begins with a careful assessment of the current learning context. The key variables to assess are: Is the learner making consistent progress on the current acquisition target with the current conventional protocol? If not, what is the pattern of errors — random, systematic, or prompt-dependent? How much does the learner's motivational state vary within sessions, and how does that variation relate to learning rate?
For learners showing systematic prompt dependence under conventional protocols, progressive approaches that probe for more independent responding prior to prompting — and that adjust prompt type and intensity based on real-time responding — can directly address this pattern. For learners showing within-session motivational variability, progressive approaches that adjust reinforcement procedure and pacing in response to engagement signals can maintain the motivational conditions for learning more effectively than a fixed reinforcement protocol.
In-the-moment assessment in progressive DTT involves reading behavioral indicators of engagement, motivation, and response confidence, and making calibrated adjustments based on decision rules developed through clinical experience and supervision. These rules should be made explicit — formalized as written decision guides — to support consistent implementation across sessions and across interventionists.
Data systems for progressive DTT should capture not just trial-level responding but session-level implementation notes that document the specific adjustments made. This provides the interpretive context needed to determine whether data changes reflect learner progress, implementation variability, or interactions between the two — and to make informed decisions about when to modify the progressive approach versus when to hold it stable.
Understanding the conventional versus progressive DTT distinction provides BCBAs with a more nuanced and practically useful instructional repertoire. Rather than applying DTT in a single form, practitioners who understand both approaches can select and calibrate their implementation based on the specific learning profile and current response pattern of each client — which is the clinical standard that individualized, effective ABA practice requires.
The session also reinforces the importance of treatment monitoring — the ongoing data-based review that determines when a current instructional approach is producing adequate learning and when clinical adjustment is warranted. BCBAs who monitor learning data systematically and apply clear decision rules for when to modify DTT implementation are providing higher quality clinical oversight than those who maintain protocols unchanged regardless of learner response.
For supervision practice, this content is a prompt to invest in training RBTs in clinical judgment, not just procedural execution. RBTs who understand the rationale for in-the-moment adjustments — not just the mechanics of conventional DTT — are better equipped to implement progressive procedures and to recognize when learner responses signal a need for adjustment that should be escalated to the supervising BCBA.
Ready to go deeper? This course covers this topic in detail with structured learning objectives and CEU credit.
Conventional Versus Progressive DTT | Learning | 1 Hour — Autism Partnership Foundation · 1 BACB General CEU · $0
Take This Course →

All behavior-analytic intervention is individualized. The information on this page is for educational purposes and does not constitute clinical advice. Treatment decisions should be informed by the best available published research and individualized assessment, and made with the informed consent of the client or their legal guardian. Behavior analysts are responsible for practicing within the boundaries of their competence and adhering to the BACB Ethics Code for Behavior Analysts.