Scientists from the Massachusetts Institute of Technology (MIT) have developed a personalized machine learning system for use in therapy with children with autism, aimed at helping them recognize the emotional states of the people around them.
Autism is a developmental disorder often described as an intense focus on one's own inner world and a corresponding withdrawal from contact with the outside. As a result, children with autism can have trouble distinguishing, for example, between an expression of happiness and one of fear.
To address this, some therapists use a friendly robot to demonstrate these emotional differences to children, who then practice imitating them and responding appropriately.
However, the robot still needs adjustments to work more effectively, particularly in how it interprets each child's behavior, which is where the researchers' work comes in.
Rosalind Picard, a co-author and MIT professor who conducts research in affective computing, says this kind of personalization is especially important in autism therapy, and pointed out that a deep learning system uses multiple hierarchical layers of data processing to improve at its task. The therapy process with the NAO robot works as follows:
First, the therapist shows the child photos or flash cards of different faces meant to represent different emotions, and the child tries to recognize expressions such as fear, sadness or joy.
The therapist then programs the robot to display those same emotions and, while the child interacts with the robot, observes how the child responds.
These SoftBank Robotics humanoids stand about two feet tall and resemble an armored superhero or a droid. To convey the different emotions, the robot changes the color of its eyes, the movements of its limbs and the tone of its voice.
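As a rough illustration of how such emotion cues might be organized in software, here is a minimal Python sketch. It is not SoftBank's NAO API; the cue values and method names are assumptions made only to show the idea of bundling eye color, gesture and voice tone per emotion.

```python
# Hypothetical mapping of emotions to robot cues (values are illustrative only).
EMOTION_CUES = {
    "joy":     {"eye_color": "yellow", "gesture": "raise_arms", "voice_pitch": 1.2},
    "sadness": {"eye_color": "blue",   "gesture": "lower_head", "voice_pitch": 0.8},
    "fear":    {"eye_color": "purple", "gesture": "step_back",  "voice_pitch": 1.4},
}

def show_emotion(robot, emotion: str) -> None:
    """Send one bundle of cues to a (hypothetical) robot controller object."""
    cues = EMOTION_CUES[emotion]
    robot.set_eye_color(cues["eye_color"])    # hypothetical method names, not the real NAO SDK
    robot.play_gesture(cues["gesture"])
    robot.set_voice_pitch(cues["voice_pitch"])
```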
Before the therapy system was built, tests were conducted with 35 children with autism, 17 from Japan and 18 from Serbia, aged 3 to 13, in 35-minute sessions. The children reacted to the robot in very different ways: some were bored, while others applauded, jumped around the room with enthusiasm, touched the robot or laughed.
During the sessions, the researchers recorded each child's facial expressions, head and body movements, poses and gestures, and audio, along with heart rate, body temperature and sweat data from a monitor worn on the child's wrist.
With these data, the team built personalized deep learning networks for the robot. Most children in the study reacted to the robot "not just as a toy, but as a real person," especially during storytelling, when the therapists asked how NAO would feel if the children took the robot out for ice cream, explained Oggi Rudovic, a postdoctoral researcher at the MIT Media Lab.
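To make the "personalized deep learning networks" idea concrete, here is a minimal, self-contained sketch of how a per-child multimodal model over the recorded signals could look. This is not the MIT team's actual code or architecture; the feature dimensions, layer sizes, output targets and the per-child output heads are assumptions for illustration only.

```python
# Minimal sketch: shared encoders per modality, plus one small head per child
# (the "personalized" part). Dimensions and targets are illustrative assumptions.
import torch
import torch.nn as nn

class PersonalizedAffectModel(nn.Module):
    def __init__(self, n_children: int, video_dim=128, audio_dim=64, physio_dim=8, hidden=64):
        super().__init__()
        # Shared encoders: video/pose features, audio features, wrist-worn physiology.
        self.video_enc = nn.Sequential(nn.Linear(video_dim, hidden), nn.ReLU())
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, hidden), nn.ReLU())
        self.physio_enc = nn.Sequential(nn.Linear(physio_dim, hidden), nn.ReLU())
        # One output head per child, predicting (for example) valence, arousal, engagement.
        self.child_heads = nn.ModuleList(
            [nn.Linear(3 * hidden, 3) for _ in range(n_children)]
        )

    def forward(self, video, audio, physio, child_id: int):
        fused = torch.cat(
            [self.video_enc(video), self.audio_enc(audio), self.physio_enc(physio)],
            dim=-1,
        )
        return self.child_heads[child_id](fused)

# Usage with random stand-in features for one of the 35 children.
model = PersonalizedAffectModel(n_children=35)
video = torch.randn(1, 128)   # stand-in for face/pose features from session video
audio = torch.randn(1, 64)    # stand-in for features from the audio recordings
physio = torch.randn(1, 8)    # stand-in for heart rate, temperature, sweat signals
scores = model(video, audio, physio, child_id=0)
print(scores.shape)  # torch.Size([1, 3])
```

The design choice the sketch tries to convey is simply that most of the network is shared across children while a small child-specific part adapts to each individual, which is one common way to implement personalization.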
This project was funded by Japan's Ministry of Education, Culture, Sports, Science and Technology; Chubu University (Japan); and the European Union's HORIZON 2020 program.