The DREAM Dataset: Supporting a data-driven study of autism spectrum disorder and robot enhanced therapy.
A free 300-hour treasure chest of synced motion, gaze, and ADOS data is ready for your next robot-vs-human analysis.
01 Research in Context
What this study did
Billing et al. (2020) built a large open dataset. They filmed 61 autistic kids during both robot-led and therapist-led ABA sessions.
Three RGB cameras and two Kinect depth cameras caught every hand flap, eye shift, and step in 3-D. Body motion, head pose, and eye gaze were synced in one frame of reference and paired with each child's ADOS scores.
What they found
The paper does not give win-loss numbers; it is a dataset release, not an outcome study. It simply hands you 3000-plus sessions and 300-plus hours of ready-to-code behavior.
How this fits with other research
Rakhymbayeva et al. (2021) mined similar robot footage and found kids stay more engaged when tasks feel familiar. The new data lets you test that idea on a much bigger pile of clips.
Whiteside et al. (2022) sent a dancing robot into living rooms and saw no extra gain from fancy multi-role moves. Their home clips echo the lab data now sitting in the DREAM bank, so you can compare living-room vs clinic movement patterns.
Boudreau et al. (2015) saw more smiling with a Keepon robot yet no jump in thinking skills. The fresh dataset keeps the same robot-vs-human frame but adds gaze and motion layers, letting future studies dig deeper into why smiles do or do not turn into learning gains.
Why it matters
If you ever wished for clean, long recordings of autistic kids in both robot and human therapy, the work is done. Download, slice, and run machine-learning scripts to spot what boosts engagement or reduces stereotypy. Use the coded sessions to train staff or to power your next grant.
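As a starting point, here is a minimal sketch of the kind of script you might run on the released 3-D motion data. It assumes each session exports as a JSON file containing per-frame 3-D coordinates for a tracked joint; the field names (`skeleton`, `right_hand`) are hypothetical, so check the actual DREAM release for its real schema before using this.

```python
import json
import math

def hand_path_length(session_path):
    """Sum the frame-to-frame 3-D displacement of one tracked hand:
    a crude proxy for overall motor activity (e.g. hand flapping).

    NOTE: field names below are hypothetical placeholders, not the
    DREAM dataset's actual schema.
    """
    with open(session_path) as f:
        session = json.load(f)
    # Assumed layout: a list of [x, y, z] positions, one per frame.
    frames = session["skeleton"]["right_hand"]
    total = 0.0
    for p1, p2 in zip(frames, frames[1:]):
        total += math.dist(p1, p2)  # Euclidean distance between frames
    return total
```

From there you could compare the per-session totals between the robot and therapist groups, or track how they change across a child's sessions.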
Download five sample clips, pick one target behavior, and practice reliability coding with your team.
02 At a glance
03 Original abstract
We present a dataset of behavioral data recorded from 61 children diagnosed with Autism Spectrum Disorder (ASD). The data was collected during a large-scale evaluation of Robot Enhanced Therapy (RET). The dataset covers over 3000 therapy sessions and more than 300 hours of therapy. Half of the children interacted with the social robot NAO supervised by a therapist. The other half, constituting a control group, interacted directly with a therapist. Both groups followed the Applied Behavior Analysis (ABA) protocol. Each session was recorded with three RGB cameras and two RGBD (Kinect) cameras, providing detailed information of children’s behavior during therapy. This public release of the dataset comprises body motion, head position and orientation, and eye gaze variables, all specified as 3D data in a joint frame of reference. In addition, metadata including participant age, gender, and autism diagnosis (ADOS) variables are included. We release this data with the hope of supporting further data-driven studies towards improved therapy methods as well as a better understanding of ASD in general.
PLoS ONE, 2020 · doi:10.1371/journal.pone.0236939