Service Delivery

Autism and emotion recognition technologies in the workplace.

Katirai (2025) · Autism: The International Journal of Research and Practice, 2025
★ The Verdict

Emotion-recognition surveillance can misread autistic faces and penalize typical autistic expressions, so demand evidence of testing with autistic users before purchase.

✓ Read this if you are a BCBA writing job-support plans for autistic adults or consulting on workplace tech.

✗ Skip if you only serve early-childhood clients.

01 Research in Context

01

What this study did

Katirai (2025) looked at emotion-recognition cameras and software used in offices.

The review asked: do these tools hurt autistic workers?

It pulled together policy papers, tech reports, and workplace studies.

02

What they found

The tech often labels autistic facial expressions as angry or sad when they are not.

This can lower job ratings or trigger discipline.

The paper says: test the tool with autistic staff before you buy it.

03

How this fits with other research

Solomon (2020) already showed that employer bias, not skill, blocks autistic hiring. Katirai adds a new barrier: biased software.

Emerson et al. (2023) found autistic applicants scored lower on video interviews. Katirai argues the same bias is now baked into code.

Vassos et al. (2023) report that autistic workers speak up about company problems. Katirai warns that emotion cameras could silence this strength by making staff afraid to look different.

04

Why it matters

If you help autistic adults find or keep jobs, vet any emotion-surveillance tool first. Ask vendors for evidence that the tool was tested with autistic users. If none exists, push for a pilot run with autistic employee input. Protecting their privacy keeps their talent in the workplace.

→ Action: try this Monday

Email HR to see if any emotion-recognition cameras are planned; if yes, request pilot data with autistic staff.

02 At a glance

Intervention
not applicable
Design
narrative review
Population
autism spectrum disorder
Finding
not reported

03 Original abstract

The use of emotion recognition technologies in the workplace is expanding. These technologies claim to provide insights into internal emotional states based on external cues like facial expressions. Despite interconnections between autism and the development of emotion recognition technologies as reported in prior research, little attention has been paid to the particular issues that arise for autistic individuals when emotion recognition technologies are implemented in consequential settings like the workplace. This article examines recent literature on autism and on emotion recognition technologies to argue that the risks of the use of emotion recognition technologies in the workplace are heightened for autistic people. Following a brief overview of emotion recognition technologies, this argument is made by focusing on the issues that arise through the development and deployment of emotion recognition technologies. Issues related to the development of emotion recognition technologies include fundamental problems with the science behind the technologies, the underrepresentation of autistic individuals in data sets and the problems with increasing this representation, and annotation of the training data for the technologies. Issues related to implementation include the invasive nature of emotion recognition technologies, the sensitivity of the data used, and the imposition of neurotypical norms on autistic workers through their use. The article closes with a call for future research on the implications of these emergent technologies for autistic individuals.

Lay abstract

Technologies using artificial intelligence to recognize people's emotional states are increasingly being developed under the name of emotion recognition technologies. Emotion recognition technologies claim to identify people's emotional states based on data, like facial expressions.
This is despite research providing counterevidence that emotion recognition technologies are founded on bad science and that it is not possible to correctly identify people's emotions in this way. The use of emotion recognition technologies is widespread, and they can be harmful when they are used in the workplace, especially for autistic workers. Although previous research has shown that the origins of emotion recognition technologies relied on autistic people, there has been little research on the impact of emotion recognition technologies on autistic people when it is used in the workplace. Through a review of recent academic studies, this article looks at the development and implementation processes of emotion recognition technologies to show how autistic people in particular may be disadvantaged or harmed by the development and use of the technologies. This article closes with a call for more research on autistic people's perception of the technologies and their impact, with involvement from diverse participants.

Autism: The International Journal of Research and Practice, 2025 · doi:10.1177/13623613241279704