Continuous Performance Feedback: Investigating the Effects of Feedback Content and Feedback Sources on Performance, Motivation to Improve Performance and Task Engagement
A real person delivering quick tips alongside data keeps staff accurate, eager, and on task better than any screen.
01 Research in Context
What this study did
The team put 36 college students through a tedious computer task. Each click earned points only if it met a hidden rule.
Students got feedback after every 10 clicks. The feedback varied in two ways: numbers only (quantitative) or numbers plus tips (qualitative), and it came from a person, a computer pop-up, or nowhere at all.
The researchers tracked how many correct clicks students made, how much students said they wanted to improve, and how long they stayed on task.
What they found
Face-to-face feedback beat every other setup. Students who heard comments from a real person scored higher, tried harder, and stuck with the task longer.
Qualitative feedback gave an extra boost. When the person added short tips like “Try clicking faster,” motivation and engagement jumped even more than with numbers alone.
Computer pop-ups helped a little, but not as much as a human voice. Students who got no feedback at all declined on all three measures.
How this fits with other research
Findley et al. (1965) showed the same pattern with chimpanzees. Mid-ratio responses that produced a flashing light kept the animals working; Giamos shows the same rule holds for adult humans when the "light" is a supervisor's voice.
Al-Nasser et al. (2019) removed feedback entirely and still hit high fidelity with picture packets. That seems to clash with Giamos, but the tasks differ: packets teach one-off skills, while continuous feedback keeps a repeated task alive. Both papers agree—when you want steady performance, add ongoing input.
Knopp et al. (2023) found telehealth and in-person DTT worked equally well for kids with autism. Giamos tilts the other way: humans beat screens for staff feedback. The difference is social function; kids learn labels from either channel, but adults read social meaning into a live voice and try harder.
Why it matters
Stop emailing dashboard graphs to RBTs. Walk over, smile, and give one sentence of numbers plus one tip. This tiny habit can lift accuracy, enthusiasm, and session stamina in one shot. Try it for one week—track correct responses and note how long staff stay on task. You should see the same lift Giamos found, without buying new software.
Add one short tip to every piece of in-person feedback you give—no extra time, bigger payoff.
02 At a glance
03 Original abstract
ABSTRACT
Organizations are increasingly replacing performance ratings with continuous feedback systems. The current study assesses how people react to continuous performance feedback in terms of its content and sources concerning their performance, motivation to improve, and task engagement. A task-based experiment was conducted with 36 participants who received continuous feedback. The participants were divided into two groups, receiving either quantitative or qualitative feedback content. Feedback was delivered through computer-mediated, person-mediated, or no source. The results highlight that person-mediated feedback, regardless of content, positively influenced performance, motivation, and task engagement. On the other hand, quantitative feedback only showed a positive association with performance. These findings suggest that qualitative feedback is more effective, enhancing motivation and engagement. Managers should prioritize person-mediated feedback to optimize performance, as it yields superior outcomes compared to computer-mediated feedback. However, further research is required to comprehensively understand the effectiveness of continuous performance feedback and its specific characteristics.

KEYWORDS: Continuous performance feedback; feedback content; feedback sources; motivation; performance management; task engagement; performance

Disclosure statement
No potential conflict of interest was reported by the authors.

Notes
1. The sample size for our analyses on performance and motivation to improve performance following feedback is 36. Our sample size for our analysis on task engagement was reduced to 32 because of low-quality recordings for some participants.
2. Please consult Tables 2–5 for descriptive and inferential statistics tables.

Funding
The current project received master's funding from the Social Sciences and Humanities Research Council (SSHRC) and the Fonds de Recherche du Québec – Société et Culture (FRQSC).
Journal of Organizational Behavior Management, 2024 · doi:10.1080/01608061.2023.2238029