Amazon Says It Can Detect Fear on Your Face. Are You Scared?




On Monday, Amazon announced a discovery by its artificial intelligence experts: its algorithms can now read fear on your face, at a cost of $0.001 per image, or less if you process more than a million images.

The news has drawn attention because Amazon is at the center of a political debate over the accuracy and regulation of facial recognition. Amazon sells a facial recognition service, part of a suite of image analysis features called Rekognition, to clients that include police departments. Another Rekognition feature tries to determine the gender of faces in photos. The company said Monday that the gender feature had been improved, apparently a response to research showing it was much less accurate for people with darker skin.

Rekognition rates the emotions of faces on a sliding scale across seven categories: "happy," "sad," "angry," "surprised," "disgusted," "calm," and "confused." Fear, added Monday, is the eighth.
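To make that output concrete, here is a minimal sketch of parsing a Rekognition-style face analysis response in Python. The field layout mirrors the shape of the public DetectFaces API, but the sample face and its confidence scores below are invented for illustration, not real output:

```python
# Invented sample data shaped like a Rekognition DetectFaces response:
# each detected face carries a list of emotion labels with confidence scores.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 62.1},
                {"Type": "FEAR", "Confidence": 21.4},
                {"Type": "SURPRISED", "Confidence": 9.0},
            ]
        }
    ]
}

def top_emotion(face_detail):
    """Return the (label, confidence) pair with the highest score for one face."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

for face in sample_response["FaceDetails"]:
    label, score = top_emotion(face)
    print(f"{label}: {score:.1f}")  # prints "CALM: 62.1"
```

Note that even the top-scoring label here is a ranking of the service's own categories, not a measurement of what the person actually feels, which is exactly the distinction critics press on below.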

Amazon is not the first company to offer developers access to algorithms that supposedly detect emotions. Microsoft has offered something similar since 2015; its service looks for a comparable list of emotions, adding "contempt" but omitting confusion. Google has offered its own similar service since 2016.

Amazon declined to detail how customers use emotion recognition. Rekognition's online documentation says the service "does not determine the person's internal emotional state and should not be used in this way." Elsewhere, however, the company suggests that stores could incorporate live images of customers into its facial analysis tools to track the emotional and demographic trends of different outlets over time.

Even as Amazon, Google, and Microsoft tout the perceptiveness of their algorithms, psychologists warn that trying to read emotions from facial expressions is fundamentally flawed.

A study published in February by researchers at the University of California, Berkeley, found that to correctly read another person's emotions in a video, one must pay attention not only to their face but also to their body language and surroundings. The software offered by technology companies typically analyzes each face in isolation.

Another study, released last month, took more direct, and more devastating, aim at emotion detection software. Psychologists reviewed more than 1,000 published findings on facial expressions and emotions and concluded that there is no evidence that facial expressions alone reliably communicate emotions, undermining the fundamental assumption of emotion detection software.

"It is not possible to confidently infer happiness from a smile, anger from a scowl, or sadness from a frown, as much of current technology tries to do when applying what are mistakenly believed to be the scientific facts," the authors wrote.

An online demonstration of Google's cloud image analysis service shows how its artificial intelligence software tries to identify objects in photos and read facial expressions to discern emotions.

Getty Images; Google

Rumman Chowdhury, who leads responsible artificial intelligence work at Accenture, said the situation exemplifies an industry that has not paused to think about the limitations of its technology. Even if software could read faces accurately, the idea of reducing the richness of human feeling into a handful of categories, for all people and all contexts, does not make much sense, she says. But hype around the power of AI has led many people, both inside and outside the tech sector, to be overconfident about what computers can do.

"For most programmers, as long as the result is reasonable and the accuracy is satisfactory, it is considered correct," she says. Customers, Chowdhury adds, have been led to believe that artificial intelligence is more powerful than it is.

As with facial recognition, easier access to emotion recognition algorithms appears to be spreading the technology more widely, including into law enforcement.

In July, Oxygen Forensics, which sells software that the FBI and others use to extract data from smartphones, added facial recognition and emotion detection to its product. Lee Reiber, Oxygen's operations director, says the features were added to help investigators sort through the hundreds or even thousands of images that often turn up when collecting digital evidence.

Agents can now search for a specific face in a piece of evidence, or group images of the same person. They can also filter faces by race or age group, and by emotions such as "joy" and "anger." According to Reiber, the visual tools can help investigators work faster even if they are not perfect, and leads generated during an investigation are always checked in multiple ways. "I want to take as many pieces as I can and put them together to paint a picture," he says.

The number of commercial emotion detection programs is growing, but they do not appear to be widely used. Oxygen Forensics added facial recognition and emotion detection using software from Rank One, a startup with law enforcement contracts. But when WIRED contacted Brendan Klare, Rank One's CEO, he was unaware that Oxygen Forensics had implemented emotion detection in addition to facial recognition.

Klare says emotion detection has yet to prove itself. "The market is pretty limited at the moment, and we do not know if it will be profitable," he says. "It's not something that's so big at the moment."

The shifting focus of emotion recognition startup Affectiva illustrates the challenge. The company grew out of an MIT project in 2009 that aimed to help people with autism understand those around them. It has raised money from investors including advertising giant WPP, and has launched products to help marketers measure audience reactions to advertising and other content. More recently, the company has focused on improving car safety, for example by using its technology to detect drowsy or angry drivers. Affectiva announced $26 million in funding earlier this year, with auto parts maker Aptiv as lead investor. The company declined to comment.

At least one big tech company seems to have decided that emotion recognition is not worth it. IBM competes with Amazon and Microsoft in cloud computing and facial recognition but does not offer emotion detection. An IBM spokesperson said the company has no plans to offer such a service.

Google does not offer facial recognition, a decision it says followed an internal ethics review that raised concerns the technology could be used to violate privacy. But the company's cloud services will detect and analyze faces in photos, estimating age, gender, and four emotions: joy, sorrow, anger, and surprise.
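Unlike Amazon's percentage scores, Google's face analysis reports each of those four emotions as a coarse likelihood level. The sketch below parses an annotation shaped like the Cloud Vision API's face results; the field names follow the public REST API's conventions, but the sample annotation is invented for illustration:

```python
# Likelihood levels as ordered by the Cloud Vision API, weakest to strongest.
LIKELIHOODS = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
               "POSSIBLE", "LIKELY", "VERY_LIKELY"]

# Invented sample data shaped like one Vision face annotation.
sample_annotation = {
    "joyLikelihood": "VERY_LIKELY",
    "sorrowLikelihood": "VERY_UNLIKELY",
    "angerLikelihood": "UNLIKELY",
    "surpriseLikelihood": "POSSIBLE",
}

def likely_emotions(annotation, threshold="LIKELY"):
    """Return emotion names whose reported likelihood meets the threshold."""
    cutoff = LIKELIHOODS.index(threshold)
    return [
        key.replace("Likelihood", "")
        for key, value in annotation.items()
        if LIKELIHOODS.index(value) >= cutoff
    ]

print(likely_emotions(sample_annotation))  # prints "['joy']"
```

The coarse five-step scale is a design choice worth noting: it avoids implying the false precision of a decimal score, though the underlying question of whether the expression reflects a real emotion remains the same.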

Google says its emotion detection features went through the same review process as facial recognition. The company has also decided it is acceptable to apply the technology to its users' personal photos.

Searching for "happiness," "surprise," or "anger" in the Google Photos app will surface images with the corresponding facial expressions. It will also search for "fear."

