Assessment & Research

Automated measurement of repetitive behavior using the Microsoft Kinect: a proof of concept

Maharaj et al. (2020) · Behavioral Interventions 2020
★ The Verdict

A $100 Kinect camera counts repetitive movements with 92% accuracy: no wearables, no coding degree required.

✓ Read this if: You're a BCBA who runs baseline or treatment-probe sessions outside clinic hours or with clients who resist wearing sensors.
✗ Skip if: You already own a high-accuracy wearable system or need more than 92% precision for legal documentation.

01 Research in Context

01

What this study did

The team parked a $100 Microsoft Kinect camera in a quiet room. One adult at a time stood in front of it and performed any repetitive movement they felt like—hand flapping, body rocking, finger waving—for three short trials.

Two trained humans counted every movement on video. A simple computer script did the same using only the Kinect’s depth map. The study asked: does the robot count match the people count?
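The paper doesn't publish its MATLAB script, but the idea behind an automated counter like this is simple: track one body point over time and count the oscillations. A minimal sketch, assuming the script counts each rising crossing of a motion threshold as one repetition (the threshold value and signal here are hypothetical):

```python
# Hypothetical sketch of automated repetition counting from a 1-D motion
# signal (e.g., one joint's depth value over time). Not the study's actual
# MATLAB pipeline -- an assumption about how such a script might work.

def count_movements(signal, threshold=0.5):
    """Count repetitions: each rising crossing above `threshold` is one."""
    count = 0
    above = False
    for value in signal:
        if value > threshold and not above:
            count += 1        # rising edge: one new repetition
            above = True
        elif value <= threshold:
            above = False     # signal dropped back; await next rise
    return count

# Simulated hand-flap trace with three clear oscillations:
trace = [0.0, 0.8, 0.1, 0.9, 0.2, 0.7, 0.0]
print(count_movements(trace))  # 3
```

A threshold counter like this naturally under-counts (a shallow flap that never clears the threshold is missed), which matches the study's pattern of small false negatives.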

02

What they found

The Kinect tally agreed with human observers 92% of the time. It slightly under-counted, missing about 1 in 12 movements, but the errors were small and consistent.

No special calibration or fancy math was needed. Plug it in, press record, and you get usable data in minutes.

03

How this fits with other research

Gilchrist et al. (2018) got comparable accuracy (upwards of 80%) with a $20 accelerometer taped to the wrist or chest. Kinect trades the sticker for a camera, giving hands-free tracking when clients won't wear anything.

Lotfizadeh et al. (2020) pushed wearables further, topping 94% accuracy for self-injury in kids with ASD. Their extra precision came from individual machine-learning models—more setup than the one-size-fits-all Kinect script.

Takahashi et al. (2023) later used wall-mounted motion sensors and claimed high accuracy for 14 classroom behaviors. Their lab-grade gear costs more and needs careful room layout, while the Kinect works in an empty corner with no install.

04

Why it matters

If you need night, weekend, or in-home counts but staff can't stay, park a Kinect and walk away. The 92% accuracy is good enough for baseline, progress, or fading checks without adding wearables that kids might rip off. Start with a short 5-minute probe, compare it to your own count, and decide if the slight under-count still tells the story you need.

→ Action — try this Monday

Borrow a Kinect, aim it at the client during one usual session, and compare its auto-count to your live tally—if the gap is small, you just gained an extra observer.
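To put a number on "the gap is small," you could score the auto-count against your tally with total-count IOA (smaller divided by larger, times 100), the standard frequency agreement formula in behavior analysis. The counts below are hypothetical:

```python
# Total-count IOA: the standard agreement check for two frequency counts.

def total_count_ioa(auto_count, human_count):
    """Return percent agreement: (smaller / larger) * 100."""
    if auto_count == human_count:
        return 100.0
    smaller, larger = sorted((auto_count, human_count))
    return smaller / larger * 100

# e.g., the Kinect logs 22 flaps while you tally 24 in the same session:
print(round(total_count_ioa(22, 24), 1))  # 91.7
```

Anything near the study's 92% suggests the camera is performing as published; a much lower figure means your room setup or the behavior topography needs a second look.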

02 At a glance

Intervention
not applicable
Design
other
Finding
positive

03 Original abstract

The Microsoft Kinect is a motion‐sensing device that enables users to interact with a computer through body movements. The Kinect was initially developed for video game users (e.g., Microsoft Xbox) although another important use may be as an automated system for measuring human behavior. To demonstrate the potential utility of the Kinect for behavioral measurement, we asked adults to perform various repetitive behaviors while in view of the Kinect sensor. Data collected from Kinect was analyzed via the Matrix Laboratory (MATLAB) program generating frequency of occurrences of repetitive behavior. To assess validity of the automated recording measurement, Kinect data were compared to frequency data obtained via direct human observation. Overall, there was a high‐level of agreement (92%) between measurement procedures (automated vs. human), although the automated system tended to result in slightly lower frequencies (false negatives) than was captured via human observation. Based on these findings, the Kinect technology showed solid potential as a tool for the automatic measurement of behavior. Suggestions for methodical refinement and future evaluation are discussed.

Behavioral Interventions, 2020 · doi:10.1002/bin.1746