Helping autism with deep-learning-enhanced robots

Overview of the key stages (sensing, perception, and interaction) during robot-assisted autism therapy.
Data from three modalities (audio, visual, and autonomic physiology) were recorded using unobtrusive audiovisual sensors and sensors worn on the child's wrist, providing heart rate, skin conductance (EDA), body temperature, and accelerometer data. The focus of this work is robot perception, for which the authors designed a personalized deep learning framework that can automatically estimate the level of the child's affective states and engagement. These estimates can then be used to optimize the child-robot interaction and monitor the therapy progress (see Interpretability and utility). The images were obtained using Softbank Robotics software for the NAO robot. Credit: Ognjen Rudovic et al./Science Robotics

Rosalind Picard and her team at MIT launched research in affective computing several years ago: computers that can interact with humans while taking into account the emotional state of the person interacting with them. These machines were also designed to appeal to users and stimulate interaction, with features like an expressive face, moving eyes and a modulated voice, all contributing to conveying emotion.

Now Picard’s team has published a paper reporting on their work and results in having a computer (embodied as a robot) interpret a person’s emotions and adapt its way of interacting accordingly.

The robot is equipped with deep learning algorithms that analyse various aspects of an ongoing interaction and identify the feelings and emotions of the person it is interacting with. In particular, the team experimented with children with autism. Each child is a case in itself, with different characteristics. A common trait of autism is difficulty detecting emotion in the person one is interacting with (sometimes such children show no interest at all in detecting emotions in their counterpart). At the same time, most of these children are interested in, and at ease with, computers and robots.

Hence the idea of using a robot to help these children establish an interaction that takes emotion into account. Since each child is a special “case”, the robot needs to adapt: it has to “understand” that child and adjust its behaviour to the child’s reactions. Typically, the child and the robot (see the clip) engage in some activity, and the robot shows its emotions using its eyes, lights and movements, while “watching” the child’s reaction and adapting its behaviour to help the child recognise the emotions it is trying to demonstrate. The robot interprets the child’s reactions using visual and aural cues, as well as information on the child’s heart rate, skin conductance and movements provided by sensors embedded in a wearable band on the child’s wrist.
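To make the idea concrete, the pipeline can be thought of as fusing per-modality feature vectors (audio, visual, physiology) into one representation and mapping it to an engagement estimate. The sketch below is a minimal illustration of that late-fusion idea on synthetic data, using a simple logistic-regression estimator as a stand-in for the paper's actual personalized deep network; the modality names, feature dimensions and data are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature dimensions (illustrative only):
# "physio" could hold EDA, heart rate, temperature, accelerometer summaries.
DIMS = {"audio": 8, "visual": 16, "physio": 4}

def extract_features(sample):
    """Late fusion by concatenating the per-modality feature vectors."""
    return np.concatenate([sample[m] for m in ("audio", "visual", "physio")])

def make_sample(engaged):
    """Synthetic 'recording': engaged samples are shifted in feature space."""
    shift = 0.5 if engaged else -0.5
    return {m: rng.normal(shift, 1.0, d) for m, d in DIMS.items()}, engaged

data = [make_sample(engaged=(i % 2 == 0)) for i in range(200)]
X = np.stack([extract_features(s) for s, _ in data])
y = np.array([label for _, label in data], dtype=float)

# Logistic-regression "engagement estimator" trained by gradient descent --
# a toy stand-in for the deep network described in the paper.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted engagement probability
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

accuracy = np.mean((p > 0.5) == (y == 1))
```

Personalization, in this picture, would amount to further fine-tuning `w` and `b` on data from the individual child, since each child expresses engagement differently.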

This can be used as a tool by the therapist helping the child. The first trials showed the robot correctly detecting the children's emotions about 60% of the time, higher than the 50-55% achieved by an expert therapist (the data provided by the biosensors surely help this performance).

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node, and he then headed the Industrial Doctoral School of EIT Digital up to September 2018. Previously, until December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master's course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.