Researchers create an improved AI system for perceiving emotions




Researchers at the Massachusetts Institute of Technology (MIT) have developed a machine-learning model that allows computers to interpret human emotions with greater precision, with the goal of improving their use in clinical practice and in monitoring a person's health.

The experts explained in a statement that traditional affective-computing models follow a "one size fits all" approach: the AI systems are trained on a set of images representing different facial expressions, optimizing features and mapping them onto people's expressions.

To advance the field of affective computing, the MIT researchers improved on this by combining a technique called "mixture of experts" (a set of neural-network models) with model personalization, which allows finer-grained data to be extracted from individuals' facial expressions.

Affective Computing

In the growing field of affective computing, robots and computers are being developed to analyze facial expressions, interpret emotions and respond accordingly. Applications include monitoring a person's health and well-being, gauging students' interest in the classroom, helping to diagnose signs of certain illnesses, and developing companion robots that improve the quality of life of children and the elderly.

MIT unveiled the machine-learning model, which seeks to address the challenge of interpreting expressions across different cultures, genders and age groups, in addition to other more specific factors, in order to interpret different types of feelings. During a conference on machine learning and data mining, Oggi Rudovic, a researcher at MIT's Media Lab and a participant in the project, explained that facial expression is an important means of monitoring mood states. "If you want robots with social intelligence, you have to make them respond intelligently and naturally to our moods and emotions," he pointed out.

In the technique used to improve these systems, models are trained to specialize in a processing task. The researchers worked with individual video recordings from a database called RECOLA, a public dataset of people conversing on a video platform, designed for affective-computing applications.

The scientists shaped the model using the subjects' expressions. In the mixture, the "experts" (components of the system) were each trained on one of 18 individual video recordings from RECOLA, capturing that person's facial expressions.
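The per-person expert setup described above can be sketched as a simple mixture of experts. This is a minimal, hypothetical toy example (not MIT's actual model): each expert stands in for a model trained on one person's recording, and a softmax gate weights experts by how close a new subject's features are to each expert's reference subject. All names and numbers here are illustrative assumptions.

```python
# Toy mixture-of-experts sketch for personalized affect prediction.
# Each Expert mimics a per-person model; the gate favors experts whose
# reference subject resembles the new subject's features.
import math


def softmax(scores):
    """Turn raw gate scores into weights that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]


class Expert:
    """Per-person model: predicts an affect score (e.g. valence) from features."""

    def __init__(self, weights, bias, reference_features):
        self.weights = weights
        self.bias = bias
        self.reference = reference_features  # features of this expert's subject

    def predict(self, features):
        return sum(w * f for w, f in zip(self.weights, features)) + self.bias


def mixture_predict(experts, features):
    # Gate score: negative squared distance to each expert's reference subject,
    # so experts trained on similar-looking people contribute more.
    scores = [-sum((f - r) ** 2 for f, r in zip(features, e.reference))
              for e in experts]
    gates = softmax(scores)
    return sum(g * e.predict(features) for g, e in zip(gates, experts))


# Toy usage: two experts; the new subject is closer to the first one's reference.
experts = [
    Expert([0.5, -0.2], 0.1, reference_features=[1.0, 0.0]),
    Expert([-0.3, 0.8], 0.0, reference_features=[0.0, 1.0]),
]
result = mixture_predict(experts, [0.9, 0.1])
print(result)
```

The prediction lands between the two experts' individual outputs but much nearer to the first expert's, since the gate assigns it most of the weight. Personalization, in this framing, amounts to adapting the gate (and lightly fine-tuning experts) to a specific user.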

Possible Uses

The system, they said, can run in the background on a computer or a mobile device to track a user's video conversations and learn subtle changes in facial expression across different contexts. "It can power smart mobile apps or websites that can tell how people feel and recommend ways to cope with stress or pain," said Michael Feffer, a co-author of the work. It could also help detect depression or dementia. "By passively monitoring our facial expressions, over time we can personalize these models to users and track how much they deviate each day from their average level of facial expressiveness, and use that as an indicator of well-being and health," Rudovic added.

A version of the system has already been used to help robots better interpret the moods of children with autism.
