Autism & Developmental

Fully robotic social environment for teaching and practicing affective interaction: Case of teaching emotion recognition skills to children with autism spectrum disorder, a pilot study

Soleiman et al. · Frontiers in Robotics and AI, 2023
★ The Verdict

Watching two robots act out emotions can teach children with autism to read happy, sad, angry, and scared faces on real people.

✓ Read this if: you're a BCBA running social-skills groups for young autistic clients.
✗ Skip if: your team lacks robot hardware or tablet access.

01 Research in Context

01

What this study did

Two small robots acted out short chats in a lab. They showed happy, sad, angry, and scared faces while talking to each other.

Children with autism watched the robot pair. Afterward, the children named the same feelings on real people. The team used an A-B-A reversal design to check whether the robot demonstrations caused the gains.

02

What they found

Every child learned to spot the four emotions. They kept the skill one month later and used it with new adults.

The A-B-A reversal showed that the robot demos, not extra teaching, drove the change.

03

How this fits with other research

Kumazaki et al. (2019) first showed two robots can measure social skills. Soleiman et al. (2023) took the same two-robot setup and turned it into a lesson instead of a test.

A meta-analysis by Zhou et al. (2025) finds that robot-based tools yield medium benefits, but the evidence base is still thin. The new reversal data adds one strong single-case brick to that wall.

Yoshikawa et al. (2023) moved robot modeling online for adult job interviews. Together these studies trace a line: robot demos can teach social cues from childhood to adulthood.

04

Why it matters

You now have a cheap, repeatable script: let two robots chat and show faces, then test the child on real people. No extra staff, no fancy VR headset. Try it in your clinic next week: run a few robot clips, then probe emotion naming with photos or staff faces. Track correct responses in an A-B-A reversal to confirm it works for each client.

→ Action — try this Monday

Play a 2-minute clip of two robots showing feeling faces, then ask the child to name the same emotions on staff photos—graph correct answers.
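The tracking in the action step can be sketched in a few lines. Everything below is hypothetical for illustration: the phase labels, the probe data, and the `percent_correct` helper are not from the study.

```python
# Hypothetical sketch: summarizing emotion-naming probe data
# across reversal phases (all data below is made up).

def percent_correct(trials):
    """Percent of correct responses in a list of True/False probe outcomes."""
    return 100.0 * sum(trials) / len(trials)

# Made-up probe results: True = child correctly named the emotion
# shown on a staff photo during that trial.
phases = {
    "A1 baseline":    [False, False, True, False],
    "B intervention": [True, True, True, False],
    "A2 withdrawal":  [True, False, True, False],
}

for phase, trials in phases.items():
    print(f"{phase}: {percent_correct(trials):.0f}% correct")
```

Plotting these per-session percentages with phase-change lines (e.g., in any graphing tool) gives the standard single-case graph used to judge the reversal effect.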

02 At a glance

Intervention: robot-mediated modeling
Design: A-B-A reversal
Population: autism spectrum disorder
Finding: positive

03 Original abstract

The 21st century brought along a considerable decrease in social interactions, due to the newly emerged lifestyle around the world, which became more noticeable recently because of the COVID-19 pandemic. On the other hand, children with autism spectrum disorder have further complications regarding their social interactions with other humans. In this paper, a fully Robotic Social Environment (RSE), designed to simulate the needed social environment for children, especially those with autism is described. An RSE can be used to simulate many social situations, such as affective interpersonal interactions, in which observational learning can take place. In order to investigate the effectiveness of the proposed RSE, it has been tested on a group of children with autism, who had difficulties in emotion recognition, which in turn, can influence social interaction. An A-B-A single case study was designed to show how RSE can help children with autism recognize four basic facial expressions, i.e., happiness, sadness, anger, and fear, through observing the social interactions of two robots speaking about these facial expressions. The results showed that the emotion recognition skills of the participating children were improved. Furthermore, the results showed that the children could maintain and generalize their emotion recognition skills after the intervention period. In conclusion, the study shows that the proposed RSE, along with other rehabilitation methods, can be effective in improving the emotion recognition skills of children with autism and preparing them to enter human social environments.

Frontiers in Robotics and AI, 2023 · doi:10.3389/frobt.2023.1088582