Designing Custom Robots for Autism Therapy




In a significant advance for affective computing, researchers at the Massachusetts Institute of Technology (MIT) have developed a personalized, deep-learning-based system intended to improve therapeutic outcomes for children with autism who are learning to recognize the emotional states of the people around them.

Children with autism spectrum disorders often have trouble distinguishing between, for example, expressions of happiness and fear. To help, some therapists use a friendly robot to demonstrate these emotional differences and ask the children to imitate the emotions and respond to them appropriately.

This therapy works best, however, when the robot can interpret the child's own behavior, and that is what the researchers at the MIT Media Lab set out to enable. With their system, robots can estimate each child's engagement and interest during therapy activities, using data specific to that child.

Rosalind Picard, a co-author and MIT professor who directs research in affective computing, says that personalization is especially important in autism therapy, because the condition presents very differently from one child to the next. The system relies on deep learning, which uses multiple hierarchical layers of data processing to improve at its task.

Deep learning has so far been used mainly in automatic speech and object recognition. Until now, robot-assisted therapy for children with autism has typically worked as follows: a therapist shows the child photos or flash cards of different faces meant to represent different emotions, teaching the child to recognize expressions of fear, sadness, or joy. The therapist then programs the robot to show those same emotions to the child, and observes the child as he or she interacts with the robot.

The MIT researchers used SoftBank Robotics' NAO humanoid robots, which stand about two feet tall and resemble an armored superhero or a droid. The robot conveys different emotions by changing the color of its eyes, the motion of its limbs, and the tone of its voice.
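As a toy illustration of how one emotion can drive several expressive channels at once, the sketch below pairs each emotion with an eye color, a limb motion, and a voice pitch. All names and values here are invented for illustration; this is not NAO's actual API or its real settings.

```python
from dataclasses import dataclass

@dataclass
class Expression:
    eye_color: str      # LED color shown in the robot's eyes
    motion: str         # named limb-movement animation (hypothetical)
    voice_pitch: float  # relative pitch of the synthesized voice

# Hypothetical emotion-to-expression mapping for the sketch.
EXPRESSIONS = {
    "happy": Expression("yellow", "arms_up", 1.2),
    "sad": Expression("blue", "head_down", 0.8),
    "afraid": Expression("red", "step_back", 1.1),
}

def express(emotion: str) -> Expression:
    # Look up the bundle of expressive cues for a given emotion.
    return EXPRESSIONS[emotion]

print(express("happy"))
```

The point of bundling the cues in one record is that every channel stays consistent with the emotion being demonstrated to the child.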

To test the system, the researchers worked with 35 children with autism, 17 from Japan and 18 from Serbia, ranging in age from 3 to 13. During their 35-minute sessions, the children reacted to the robots in various ways, from looking bored and sleepy to bouncing around the room with enthusiasm, clapping their hands, and laughing or touching the robot.

The researchers captured video of each child's facial expressions, head and body movements, poses, and gestures; audio recordings; and data on heart rate, body temperature, and skin sweat response from a monitor on the child's wrist.

The robots' personalized deep-learning networks were built from this video, audio, and physiological data, together with information about each child's autism diagnosis and abilities, culture, and gender.
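The idea of feeding multimodal and child-specific data through a hierarchy of layers can be sketched as follows. This is an illustrative toy model, not the researchers' actual network: the feature sizes, layer widths, random (untrained) weights, and the simple concatenation of a demographic vector onto the multimodal features are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def init_layer(n_in, n_out):
    # Small random weights; a real model would be trained, not random.
    return rng.normal(0.0, 0.1, (n_in, n_out)), np.zeros(n_out)

# Hypothetical per-modality feature sizes (assumptions for the sketch).
N_VIDEO, N_AUDIO, N_PHYSIO, N_DEMO = 16, 8, 4, 3

video = rng.normal(size=N_VIDEO)    # facial-expression / pose features
audio = rng.normal(size=N_AUDIO)    # vocal features
physio = rng.normal(size=N_PHYSIO)  # heart rate, temperature, skin response
demo = np.array([1.0, 0.0, 1.0])    # toy encoding of diagnosis/culture/gender

# Concatenate the modalities with the child-specific info into one input.
x = np.concatenate([video, audio, physio, demo])

# A deep network is a hierarchy of such layers stacked on one another.
W1, b1 = init_layer(x.size, 32)
W2, b2 = init_layer(32, 16)
W3, b3 = init_layer(16, 2)  # outputs: estimated engagement and affect

h1 = relu(x @ W1 + b1)
h2 = relu(h1 @ W2 + b2)
engagement, affect = h2 @ W3 + b3
print(engagement, affect)
```

Personalization here amounts to conditioning the network on data specific to one child, so that the same architecture yields different behavior estimates for different children.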

The researchers then compared their system's estimates of the children's behavior with estimates from five human experts, who coded the children's video and audio recordings on a continuous scale to determine how pleased or upset each child was, how interested, and how engaged during the session.
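Agreement between a model's continuous estimates and human experts' codes can be quantified with, for example, the Pearson correlation coefficient. The study itself may use a different agreement statistic, and the numbers below are made up, so treat this as a generic sketch of the comparison step.

```python
import numpy as np

def pearson(a, b):
    # Pearson correlation: covariance normalized by both standard deviations.
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Made-up continuous engagement codes, both on the same scale.
expert_codes = [0.2, 0.5, 0.9, 0.4, 0.7]        # mean of human expert ratings
robot_estimates = [0.25, 0.45, 0.85, 0.5, 0.65]  # model's estimates

r = pearson(expert_codes, robot_estimates)
print(f"agreement r = {r:.2f}")
```

A value of `r` near 1.0 means the robot's estimates rise and fall with the experts' ratings; values near 0 mean no linear agreement.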

These personalized, human-coded data, tested against data not used in training or tuning the models, significantly improved the robot's automatic estimation of each child's behavior for most of the children in the study.
Children in the study reacted to the robot "not only as a toy, but as a real person," especially during storytelling, when the therapists asked how NAO would feel if the children took the robot out for an ice cream, according to Oggi Rudovic, a postdoc at the Media Lab.
The project was funded by the Ministry of Education, Culture, Sports, Science and Technology of Japan; Chubu University; and the European Union's HORIZON 2020 program.