They created a game to show how webcams ‘read’ human emotions




The emojify.info web game aims to show how a computer can “read” human emotions through the camera, and that our emotions are not directly tied to our facial gestures.

Does machine emotion recognition powered by artificial intelligence (AI) really work? That is the question posed by a team of researchers at the University of Cambridge in the United Kingdom. Their starting premise: tech companies don’t just want to identify people using facial recognition, they also try to read their emotions using AI.

For many scientists, however, claims that computers can understand emotions are mistaken. That is why researcher Alexa Hagerty, of the Leverhulme Centre for the Future of Intelligence at the University of Cambridge and the Centre for the Study of Existential Risk (CSER), designed a browser-based web game that tries to expose the limits of this kind of AI.

The purpose of the emojify.info site is to show how a computer can “read” human emotions using the webcam. The game challenges players to produce six different emotions (happiness, sadness, fear, surprise, disgust and anger) for the AI to try to identify.
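As a rough illustration of the kind of pipeline being critiqued here (not the emojify.info code, which the article does not describe), the sketch below assumes the open-source Python packages `opencv-python` and `fer` are installed and classifies a single webcam frame into expression categories:

```python
# Minimal sketch of a typical webcam emotion-recognition pipeline.
# Assumptions: the third-party `fer` and `opencv-python` packages are installed;
# this is NOT the emojify.info implementation, only an illustration of the approach.
import cv2
from fer import FER

detector = FER()  # pre-trained facial-expression classifier

cap = cv2.VideoCapture(0)   # open the default webcam
ret, frame = cap.read()     # grab a single frame
cap.release()

if ret:
    # For each detected face, the model returns a score per expression label
    # (angry, disgust, fear, happy, sad, surprise, neutral).
    for face in detector.detect_emotions(frame):
        print(face["emotions"])
```

Systems like this only score the geometry of a facial expression in a frame; they have no access to what the person is actually feeling, which is precisely the gap the game sets out to demonstrate.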

emojify.info challenges players to produce six emotions: happiness, sadness, fear, surprise, disgust and anger, for the AI to try to identify (Infobae)

However (spoiler alert), players will find the software’s readings inaccurate, with even euphoric expressions sometimes labeled “neutral.” Hagerty wants to show that the assumption that facial movements are directly related to changes in feeling is wrong.

“The premise of these technologies is that our faces and inner feelings are correlated in a very predictable way,” the researcher told the site The Verge. So if a person smiles, are they happy? If they frown, are they angry? “The APS (Association for Psychological Science) conducted a thorough review of the evidence in 2019 and found that people’s inner emotional states cannot be easily deduced from their facial movements.”

In the game, you can move your face and gesture quickly to try to express six different emotions. “But the point is, you haven’t felt six different things internally, one after the other,” says the researcher.

Another mini-game on the site asks users to spot the difference between a wink and a blink, something computers cannot do. “You can close your eyes, and it can be an involuntary action or a meaningful gesture,” says Hagerty.

Emotion recognition technology is developing rapidly, and these systems are already used to screen job applicants, for example, or to detect possible terrorists at airports. Additionally, as with facial recognition systems, emotion-sensing AI often carries racial and even gender bias. “The dangers are multiple,” warns Hagerty.

Months ago, the advisory committee of the Council of Europe’s Data Protection Convention recommended that “certain uses of facial recognition technologies” be prohibited in order to “avoid any risk of discrimination.” The recommendation was issued to mark International Data Protection Day on January 28.

The committee published guidelines in which it spoke out against the use of these technologies to infer skin color, religious belief, sex, race, ethnic origin or health status. It also noted that private companies should not be allowed to use facial recognition “in uncontrolled environments, such as shopping malls,” for marketing or private security purposes.

According to the committee, such use should only be authorized “in controlled environments for authentication or categorization,” such as access to a concert with VIP tickets.



