
Novel Robot May Help Improve Social Behaviors for Children with ASD

Robots and humans socialize frequently in pop fiction — think of Wall-E and Star Trek: The Next Generation. Now, a UT Dallas researcher is giving the fantasy of robotic friends a practical edge with a robot that teaches social skills to children with Autism Spectrum Disorder (ASD).

Dr. Pamela Rollins, associate professor in the School of Behavioral and Brain Sciences, explained that individuals with ASD often have social anxiety. Learning social interactions via a less threatening interface — a robot — may help patients better identify emotions and use specific social skills with humans, like holding a conversation.

“Some preliminary data has shown that individuals with autism start talking to the robots when they don’t talk to other people,” Rollins said.

Rollins, who conducts research at the Callier Center for Communication Disorders, is working with a team of autism experts and robotics designers at the company Robokind to create Robots4Autism. This program uses an artificially intelligent robot with a full range of facial expressions to interact with children who have ASD.

According to the autism advocacy group Autism Speaks, children with ASD may have varying degrees of difficulty engaging in typical social situations as they develop. Although they may share a connection with a person, such as a parent, they often do not show the behaviors that would demonstrate that connection, like hugging or smiling. This difficulty with accepted social norms can continue into adulthood.

When used in conjunction with traditional therapies, Robots4Autism may improve social behaviors and interactions for children with ASD.

“It’s not to replace therapy with humans, but you can deliver a social skills lesson in a less threatening way, and the robot can deliver the same lesson multiple times,” Rollins said.

During a lesson, the robot explains a social situation to the child with ASD. The two then watch a video of the described social situation together, during which the robot comments on the appropriate behaviors displayed by the actors, reinforcing the earlier explanation. As a final check, the child watches short videos showing either the correctly modeled behavior or a version with errors, and then discusses what was done well or incorrectly.

The robot can sense when a child begins to get frustrated or agitated and can react accordingly. There is even a module designed to teach children how to calm themselves down when they’re agitated. It can also progress children through lessons as they master modules focusing on different social situations, such as how to greet someone or how to interact at a birthday party.

Rollins said the next step is to begin testing the effectiveness of Robots4Autism after the final programming is finished in June.

UT Dallas alumna and speech-language pathologist Michelle N. McFarlin and Dr. Carolyn Garver, program director of the Autism Treatment Center, are working with Rollins to develop the curriculum for Robots4Autism. Rollins is a gubernatorial appointee to the Texas Council on Autism and Pervasive Developmental Disorders.
