Scientists have developed an artificial intelligence system that allows robots to interact with autistic children undergoing therapy. People with autism see, hear and feel the world differently from other people, which affects how they interact with others. This makes communication-centered activities quite challenging for children with autism spectrum conditions (ASC).
To address this challenge, therapists recently began to use humanoid robots in therapy sessions. However, existing robots lack the ability to autonomously engage with children, which is vital for improving the therapy. The fact that people with ASC have atypical and diverse styles of expressing their thoughts and feelings makes the use of such robots even more challenging. Researchers from the Massachusetts Institute of Technology in the US created a personalized machine learning framework for robots used during autism therapy.
The framework, described in the journal Science Robotics, helps robots automatically perceive the affect (facial, vocal and gestural behavior) and engagement of the children they interact with. In developing it, the researchers recognized that for children with ASC, one size does not fit all. They therefore personalized the framework to each child using demographic data, behavioral assessment scores and other characteristics unique to that child.
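One simple way to picture this kind of personalization is a shared, population-level model whose output is re-calibrated with a handful of per-child parameters fitted to that child's own annotated sessions. The sketch below is illustrative only; the field names, the single summary cue, and the bias/gain calibration are assumptions, not the study's actual method.

```python
from dataclasses import dataclass

@dataclass
class ChildProfile:
    """Hypothetical per-child metadata of the kind the framework draws on."""
    age: int
    culture: str             # e.g. "JP" or "RS" for the study's two cohorts
    assessment_score: float  # behavioral assessment; scale is illustrative

def shared_estimate(cue_strength: float) -> float:
    """Population-level engagement estimate from one summary cue, in [0, 1]."""
    return max(0.0, min(1.0, cue_strength))

def personalized_estimate(cue_strength: float, profile: ChildProfile,
                          bias: float, gain: float) -> float:
    """Re-scale the shared estimate with per-child parameters (bias, gain)
    that would be fitted to this particular child's annotated sessions."""
    return max(0.0, min(1.0, gain * shared_estimate(cue_strength) + bias))

child = ChildProfile(age=7, culture="JP", assessment_score=32.0)
print(personalized_estimate(0.5, child, bias=0.1, gain=0.8))
```

The point of the design is that the same raw cue can mean different things for different children: the shared model captures what generalizes, while the small per-child layer absorbs individual and cultural differences.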
The novel framework enabled the robots to automatically adapt their interpretations of children’s responses by taking into account the cultural and individual differences between them. “The challenge of creating machine learning and AI (artificial intelligence) that works in autism is particularly vexing because the usual AI methods require a lot of data that are similar for each category that is learned,” said Rosalind Picard, a professor at MIT.
“In autism where heterogeneity reigns, the normal AI approaches fail,” said Picard. The researchers tested their model on 35 children from Japan and Serbia. Aged 3 to 13, the children interacted with the robots in 35-minute sessions. The humanoid robots conveyed different emotions — anger, fear, happiness, and sadness — by changing the color of their eyes, the tone of their voice and the position of their limbs.
As it interacted with a child, the robot would capture video of their facial expressions, movements, and head pose, as well as audio recordings of their tone of voice and vocalizations. A monitor on each child’s wrist also provided the robot with data on their body temperature, heart rate, and skin sweat response.
This data was used to extract the child's various behavioral cues, which were then fed into the robot's perception module. Using deep learning models, the robot then estimated the child's affect and engagement from the extracted cues. The results were used to modulate the child-robot interaction in subsequent therapy sessions.
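The pipeline above can be sketched as a small network that maps a vector of extracted cues to two outputs, affect and engagement. Everything in this snippet is an assumption for illustration: the feature dimensions, the architecture, and the random weights, which in the real system would be learned from expert-annotated session recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 16-dim feature vector of behavioral cues for one time window
# (facial expression scores, head pose, vocal pitch statistics, heart rate,
# skin conductance, body temperature). Dimensions are illustrative.
features = rng.normal(size=16)

# Tiny feed-forward "perception module" with two output heads.
# Weights are random placeholders, not trained parameters.
W1 = rng.normal(size=(16, 32)) * 0.1
W_affect = rng.normal(size=(32, 2)) * 0.1   # head for (valence, arousal)
W_engage = rng.normal(size=(32, 1)) * 0.1   # head for engagement

def perceive(x):
    h = np.maximum(0.0, x @ W1)                     # ReLU hidden layer
    affect = np.tanh(h @ W_affect)                  # valence, arousal in [-1, 1]
    engagement = 1.0 / (1.0 + np.exp(-(h @ W_engage)))  # sigmoid, in [0, 1]
    return affect, float(engagement[0])

affect, engagement = perceive(features)
print("affect (valence, arousal):", affect)
print("engagement:", engagement)
```

A continuous engagement score like this is what lets the robot adjust its behavior between sessions rather than relying on a fixed script.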
Audiovisual recordings of the therapy sessions were also reviewed by human experts. Their assessments of the children's responses showed a 60 percent correlation with the robots' perceptions, a higher level of agreement than the human experts achieved with one another. The study's results suggest that trained robots could play an important role in autism therapy in the future.
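Agreement of this kind is typically measured by correlating the two raters' scores over the same session windows. The sketch below computes a Pearson correlation between a robot's engagement estimates and a human expert's ratings; the numbers are made up for illustration and do not come from the study.

```python
import math

# Illustrative ratings over six session windows (values are invented).
robot  = [0.2, 0.5, 0.7, 0.4, 0.9, 0.6]
expert = [0.3, 0.4, 0.8, 0.4, 0.8, 0.5]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(robot, expert)
print(f"robot-expert correlation: {r:.2f}")
```

A coefficient near 1 means the robot's perception tracks the expert's judgment closely; the 60 percent figure reported in the article corresponds to a moderate but meaningful level of agreement.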