The impact of structured assessment attempts and instructor feedback on clinical competency development in dental assistant training.
Three quick practice attempts, each followed by instructor feedback, carried every trainee to mastery.
01 Research in Context
What this study did
Ferguson et al. (2025) tested a simple rule: let dental assistant trainees try each skill three times and get instructor feedback after every attempt.
They ran the cycle like this: attempt, feedback, attempt, feedback, attempt, final feedback. No control group; each trainee's first attempt served as the baseline to beat.
What they found
Every trainee jumped from low scores to mastery level. The class ended with a 100% pass rate on the final exam.
The gains were large and steady across all measured skills.
How this fits with other research
A meta-analysis of 55 studies by Moss et al. (2009) already pointed the same way: pair in-service training with on-the-job feedback and you get the biggest staff gains. Ferguson's three-attempt loop is a clean way to do exactly that.
Romani et al. (2023) applied the same idea to psychiatric staff writing progress notes. Feedback added to lecture turned poor notes into accurate ones for three of four staff. Same recipe, new task.
Blackman et al. (2022) looked like a clash at first. They showed that watching and scoring alone rarely lifted staff performance. The twist: they never added instructor feedback. Ferguson's strong results suggest the missing piece was feedback, so the studies actually agree.
Why it matters
You can borrow the three-attempt rule tomorrow. Pick one skill your RBTs struggle with, give them three rapid trials, and deliver brief feedback after each. The dental data suggest you will see fast mastery without extra hours of lecture.
Pick one protocol, run three consecutive practice trials, and give one piece of feedback after each trial.
02 At a glance
03 Original abstract
Background
This research investigated the effectiveness of a structured three-attempt formative assessment model combined with instructor feedback in improving clinical competency among first-semester dental assistant diploma students.
Methods
Ten dental assistant students underwent repeated assessment attempts across eight operative and endodontic procedures during the Fall 2023 semester of the Dental Assistant Diploma program. After each attempt, students were given performance feedback reinforcing elements they performed well, and guiding their improvement in areas that were less well executed. Outcomes across the three attempts were then compared to determine if repeated feedback was associated with improved performance over time.
Results
Student progress with this structured feedback approach was associated with significant and consistent improvement of performance scores across the three assessments (F = 577.74; p < 0.000), along with universally high logbook grades across students and a 100% pass rate in the final competency exam. Statistical analysis revealed a strong correlation (Pearson's r = 0.94, p < 0.000) between logbook scores and final competency exam scores, which is notable given these two assessment types were conducted by independent examiners.
Conclusions
These findings suggest that structured feedback-driven assessment plays a pivotal role in competency-based education for dental assistants. In addition, this research highlights specific design elements that may be fundamental in the success of such an approach.
2025 · doi:10.1186/s12909-025-07842-z