Design of a mobile application based on artificial intelligence to identify pain in non-communicating individuals with cerebral palsy.
A stakeholder-designed phone app that reads facial pain cues in nonverbal CP is coming; stay tuned for proof that it works.
01 Research in Context
What this study did
Sabater-Gárriz et al. (2025) asked parents, nurses, and doctors what pain looks like in people with cerebral palsy who cannot speak.
They pooled every study they could find on facial pain cues.
The team then built a free phone app that uses AI to read those cues in real time.
What they found
The prototype won thumbs-up from all stakeholder groups.
No validation data yet; the app is now ready for field testing.
How this fits with other research
Rabin et al. (2019) already showed that computer face-scoring (FACET) tracks emotion in autism, so the tech leap to pain in CP is logical.
Adams et al. (2021) proved a plain 2-D camera and free OpenPose code can score body imitation with lab-level accuracy—good news for Álvaro’s plan to keep the new app cheap and phone-based.
Mosalmannejad et al. (2025) complicate the picture: they found that alexithymia, not autism, blocks pain-face reading. Their study warns that any AI must be trained on faces of people who truly cannot label pain, not just any neurodivergent group.
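Adams et al.'s approach rests on a simple idea: once a 2-D camera (plus software such as OpenPose) has reduced a body to a handful of (x, y) joint coordinates, "scoring imitation" becomes a geometric comparison between two keypoint sets. The sketch below is a minimal illustration of that idea with made-up keypoints; the joint values and the `pose_similarity` helper are our own, not code from the study or from OpenPose itself.

```python
import math

# Hypothetical 2-D keypoints (x, y) for four joints; real OpenPose output
# has 25 body points, each with a confidence value.
model_pose  = [(0.50, 0.20), (0.50, 0.40), (0.35, 0.45), (0.65, 0.45)]
client_pose = [(0.52, 0.21), (0.51, 0.41), (0.37, 0.44), (0.66, 0.46)]

def pose_similarity(a, b):
    """Cosine similarity between two flattened, mean-centered keypoint sets.

    Mean-centering removes gross position offsets, so the score reflects
    pose shape rather than where the person stands in the frame.
    """
    ax = [v for pt in a for v in pt]
    bx = [v for pt in b for v in pt]
    ma, mb = sum(ax) / len(ax), sum(bx) / len(bx)
    ax = [v - ma for v in ax]
    bx = [v - mb for v in bx]
    dot = sum(p * q for p, q in zip(ax, bx))
    na = math.sqrt(sum(v * v for v in ax))
    nb = math.sqrt(sum(v * v for v in bx))
    return dot / (na * nb)

score = pose_similarity(model_pose, client_pose)
print(round(score, 3))  # close to 1.0 for a good imitation
```

A production system would also normalize for body size and use the per-joint confidence values, but the core comparison stays this cheap, which is what makes a phone-based app plausible.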
Why it matters
If the app passes validation, you could finally know when a nonverbal client’s face says “hurt” before problem behavior starts. Until then, track the pilot studies and keep your low-tech pain checklists handy.
Follow Álvaro et al. on social media so you can join the first validation trial and test the beta app with one client.
02 At a glance
03 Original abstract
INTRODUCTION: Pain assessment in individuals with cerebral palsy (CP), particularly those unable to self-report, is a significant challenge. Pain is the most common comorbidity in CP, yet current evaluation methods are often subjective and unreliable. An AI-based facial recognition system integrated into a mobile application could provide an objective, reliable tool for pain assessment in this population.
OBJECTIVES:
METHODS: Three approaches were employed:
RESULTS: A systematic review identified seven studies on automated facial recognition systems for pain detection. However, only one of these systems (ePAT/PainCheck) has been developed into a functional mobile application for clinical use, though not specific to individuals with cerebral palsy. This underscores the novelty of the current initiative. The feasibility of our proposed app was confirmed, and key technical and functional requirements were outlined, including intuitive design, dual local/cloud processing, and mechanisms for system improvement. Stakeholders emphasized ease of use, and suggested incorporating features such as accuracy estimation, offline functionality, multi-language support, and open communication fields.
CONCLUSIONS: This novel and feasible app represents a significant advance in pain assessment for CP, with potential applications in other neurological conditions with communication impairments and unique facial expressions.
Research in developmental disabilities, 2025 · doi:10.1016/j.ridd.2025.105058
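The abstract names "dual local/cloud processing" and "offline functionality" as requirements, and one common way to satisfy both is a local-first pipeline: score the frame on-device, fall back to a heavier cloud model only when the local model cannot, and degrade gracefully when offline. The sketch below is a minimal illustration under those assumptions; the stub models, the quality threshold, and the dummy scores are invented for the example and are not the authors' implementation.

```python
def local_inference(frame):
    # Stand-in for an on-device pain-face model; returns None when the
    # frame is too low quality for it to score (our invented criterion).
    if frame.get("quality", 0) < 0.5:
        return None
    return 0.8  # dummy pain score in [0, 1]

def cloud_inference(frame):
    # Stand-in for a heavier server-side model that handles harder frames.
    return 0.6  # dummy pain score

def assess_pain(frame, online=True):
    """Local-first scoring with a cloud fallback.

    Offline, a frame the local model cannot score returns None, which the
    app could flag for manual review with a low-tech pain checklist.
    """
    score = local_inference(frame)
    if score is None and online:
        score = cloud_inference(frame)
    return score

print(assess_pain({"quality": 0.9}))                # 0.8 (local model)
print(assess_pain({"quality": 0.2}))                # 0.6 (cloud fallback)
print(assess_pain({"quality": 0.2}, online=False))  # None (offline, deferred)
```

The design choice here matches the stakeholders' asks: local processing keeps latency low and works offline, while the cloud path supports the "mechanisms for system improvement" by letting a bigger model catch what the phone misses.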