by: Notimex – June 29, 2018, 08:04 AM
MEXICO.- In an important advance in affective computing, scientists at the Massachusetts Institute of Technology (MIT) have developed a personalized machine learning system that offers children with autism better therapeutic outcomes in learning to recognize the emotional states of the people around them.
In general, children with autism spectrum disorder have difficulty distinguishing, for example, between an expression of happiness and one of fear. To remedy this, some therapists use a friendly robot to show children these emotional differences, and to get them to imitate the emotions and respond to them appropriately.
However, this therapy works better if the robot can interpret the child's own behavior, which is what the researchers at the MIT Media Lab are working on. With their system, the robots can estimate the engagement and interest of each child during therapeutic activities, using data specific to that child.
Rosalind Picard, a co-author and MIT professor who leads research in affective computing, says that personalization is especially important in autism therapy, because each child's behavior can differ greatly from the next. Deep learning, the technique the system relies on, uses multiple hierarchical layers of data processing to improve at its tasks; until now it has been used primarily in automatic speech- and object-recognition programs.
Until now, robot-assisted therapies for children with autism have typically consisted of a therapist showing the child cards with faces designed to represent different emotions, teaching them to recognize expressions of fear, sadness or joy. The therapist then programs the robot to show those same emotions to the child, and observes the child while he or she interacts with the robot.
The MIT researchers used NAO robots from SoftBank Robotics, which are two feet tall and resemble an armored superhero or a droid. The robot conveys different emotions by changing the color of its eyes, the movement of its limbs and the tone of its voice.
To test the system, the researchers ran trials with 35 children with autism, 17 from Japan and 18 from Serbia, aged three to 13. During their 35-minute sessions the children reacted to the robots in various ways, from looking bored and sleepy to moving around the room with enthusiasm, clapping, laughing or touching the robot.
The researchers captured video of each child's facial expressions, head and body movements, poses and gestures, along with audio recordings and data on heart rate, body temperature and skin sweat response from a monitor on the child's wrist.
The robots' personalized deep-learning networks were built from these video, audio and physiological data, together with information about each child's autism diagnosis and abilities, culture and gender.
The researchers then compared their system's estimates of the children's behavior with estimates from five human experts, who coded the children's audio and video recordings on a continuous scale to determine how pleased or upset each child was, how interested, and how engaged during the session. Trained on this personalized, human-coded data and tested on data not used in training or tuning the models, the networks significantly improved the robot's automatic estimation of the child's behavior for most of the children in the study.
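The personalization idea described above, a model trained on pooled data that is then adapted to each child's own human-coded sessions and compared against held-out codings, can be sketched roughly as follows. This is an illustrative toy example, not the study's actual deep network: the features, ridge-regression model and data are all invented stand-ins for the real video, audio and physiological inputs.

```python
# Toy sketch of per-child personalization: a "generic" model is fit on
# pooled data, then a personalized model is refit on one child's own
# human-coded sessions. Agreement with held-out human codings is measured
# by correlation. All names and numbers here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Pooled data: features stand in for facial, audio and physiological cues;
# the target stands in for a human-coded engagement score.
X_pool = rng.normal(size=(200, 5))
w_pool_true = np.array([1.0, -0.5, 0.3, 0.0, 0.2])
y_pool = X_pool @ w_pool_true + rng.normal(scale=0.1, size=200)
w_generic = ridge_fit(X_pool, y_pool)

# One child's data: the same cues map to engagement a little differently.
w_child_true = w_pool_true + np.array([0.5, 0.4, -0.3, 0.2, 0.0])
X_child = rng.normal(size=(60, 5))
y_child = X_child @ w_child_true + rng.normal(scale=0.1, size=60)

# Personalize by refitting on the child's own coded sessions,
# then evaluate on sessions held out from training.
train, test = slice(0, 40), slice(40, 60)
w_personal = ridge_fit(X_child[train], y_child[train])

def corr(a, b):
    """Pearson correlation, used as a simple agreement measure."""
    return float(np.corrcoef(a, b)[0, 1])

generic_agreement = corr(X_child[test] @ w_generic, y_child[test])
personal_agreement = corr(X_child[test] @ w_personal, y_child[test])
print(f"generic model agreement:      {generic_agreement:.2f}")
print(f"personalized model agreement: {personal_agreement:.2f}")
```

The personalized model tracks the held-out human codings more closely than the generic one, which mirrors the study's finding that per-child models improved the automatic estimates for most children.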
The children in the study reacted to the robot "not only as a toy, but as a real person," especially during storytelling, when therapists asked how NAO would feel if the children took the robot out for ice cream, according to Oggi Rudovic, a postdoctoral researcher at the Media Lab.
The project was funded by Japan's Ministry of Education, Culture, Sports, Science and Technology; Chubu University in Japan; and the European Union's HORIZON 2020 program.