Several years ago, Rosalind Picard and her team at MIT launched research in affective computing: computers that interact with humans while taking into account the emotional state of the person interacting with them. Their systems were also designed to appeal to users and stimulate interaction, with features like an expressive face, moving eyes and a modulated voice, all contributing to convey emotion.
Now Rosalind's team has published a paper reporting on their work and results in getting a computer (embodied as a robot) to interpret a person's emotions and adapt its way of interacting accordingly.
The robot is equipped with deep learning algorithms that analyse various aspects of an ongoing interaction and identify the feelings/emotions of the person it is interacting with. In particular, the team experimented with children with autism. Each child is a case in itself, with different characteristics. A common trait of autism is difficulty detecting emotion in the person one is interacting with (sometimes these children show no interest in detecting their counterpart's emotions). At the same time, most of these children are interested in, and at ease with, computers and robots.
Hence the idea of using a robot to help these children establish an interaction that takes emotion into account. Since each child is a special "case", the robot needs to adapt to that child; to do so, it has to "understand" the child and adjust its behaviour to the child's reactions. Typically the child and robot (see the clip) engage in some activity and the robot displays an emotion, using its eyes, lights and movements, then "watches" the child's reaction, adapting its behaviour to help the child recognise the emotion it is trying to demonstrate. The robot interprets the child's reaction using visual and aural cues, as well as information on the child's heartbeat, skin conductivity and movement, provided by sensors embedded in a wearable band on the child's wrist.
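To make the multimodal idea concrete, here is a minimal sketch of how estimates from different channels (camera, microphone, wearable sensors) could be fused into one emotion guess that drives the robot's next behaviour. All names, emotion labels, weights and responses are illustrative assumptions, not the team's actual method, which uses deep learning personalised to each child.

```python
# Hypothetical late-fusion sketch: each modality produces a probability
# estimate over a small set of emotions; a weighted average picks the
# most likely one and maps it to a robot behaviour. Illustrative only.

EMOTIONS = ["engaged", "bored", "frustrated", "happy"]

def fuse_modalities(scores_by_modality, weights):
    """Combine per-modality probability estimates into one distribution."""
    fused = {e: 0.0 for e in EMOTIONS}
    for modality, scores in scores_by_modality.items():
        w = weights[modality]
        for emotion, p in scores.items():
            fused[emotion] += w * p
    total = sum(fused.values())
    return {e: p / total for e, p in fused.items()}

def choose_response(fused):
    """Pick the robot's next behaviour from the most likely emotion."""
    top = max(fused, key=fused.get)
    responses = {  # assumed behaviour mapping, for illustration
        "engaged": "continue current activity",
        "bored": "switch to a livelier game",
        "frustrated": "slow down and encourage",
        "happy": "mirror the emotion with lights and sounds",
    }
    return top, responses[top]

# Example frame: camera and microphone suggest boredom, while the
# wristband (heartbeat, skin conductivity, movement) suggests calm engagement.
scores = {
    "visual":   {"engaged": 0.2, "bored": 0.5, "frustrated": 0.2, "happy": 0.1},
    "audio":    {"engaged": 0.3, "bored": 0.4, "frustrated": 0.2, "happy": 0.1},
    "wearable": {"engaged": 0.6, "bored": 0.2, "frustrated": 0.1, "happy": 0.1},
}
weights = {"visual": 0.4, "audio": 0.3, "wearable": 0.3}

fused = fuse_modalities(scores, weights)
emotion, action = choose_response(fused)
# With these example numbers, "bored" narrowly wins and the robot
# would switch activity to re-engage the child.
```

Per-child adaptation, which the paper emphasises, would amount to learning these weights (and the underlying per-modality models) for each individual child rather than fixing them.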
This can be used as a tool by therapists helping the child. The first trials showed the robot correctly detecting the children's emotions about 60% of the time, higher than the 50-55% accuracy of an expert therapist (the biosensor data surely contributes to this performance).