Your devices will know more about you than you know about yourself
To learn more about technologies that can read human emotions, listen to the first episode of Should This Exist?, which discusses the impact of emerging technologies on humanity.
When we sit across from someone at a café table and ask about their day, we often get a polite answer like "I'm fine." But if that person is not telling the truth, we can usually tell from their expression, the tone of their voice, and the tension in their posture and muscles, because most of our communication is not in the words themselves.
People read these implicit cues, the unspoken evidence, to get at the truth and to dig beneath what others say for their real meaning. Today, with so much of our conversation happening through electronic text, our messages carry less of that unspoken subtext than ever before.
Rana el Kaliouby, co-founder of Affectiva, a company specializing in emotion analysis, is working to improve the tools we use and restore that lost quality of conversation. "A great deal of our communication, some 93 percent of it, is lost in cyberspace," she said in an interview with Quartz.
In her view, people today suffer from "emotional blindness," which is why there is less empathy in the world. The solution, from her point of view, is not to stop using the technology that is robbing us of our humanity, but to develop tools capable of genuinely human understanding.
– Emotion Monitoring Technology
El Kaliouby's company builds tools to bridge the gap between language and meaning.
Technically, el Kaliouby and her colleagues are assembling a database of facial expressions from around the world to get a complete picture of human communication. To date, the team has collected 7.7 million faces from 87 countries, along with five million video frames. The idea is to prove that machines able to read our unspoken subtext will be able to meet our needs better, and in many different situations.
> Monitoring student responses in online classes. Take e-learning, for example. Imagine attending a class online and feeling lost partway through. The computer could detect that confusion through the camera, reading your bewilderment and loss of attention, and prompt the system to adapt the material to your needs, offering more examples or easier problems to solve, much as a teacher in a classroom changes activities or tactics based on how students react to the subject.
If machines can read this implicit subtext, they will be able to meet our needs better, and in many more situations.
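To make the idea concrete, here is a minimal sketch of such a feedback loop. The `estimate_confusion` model, the threshold, and the lesson fields are hypothetical placeholders for illustration, not Affectiva's actual API.

```python
from dataclasses import dataclass

@dataclass
class LessonState:
    difficulty: int = 3      # 1 = easiest, 5 = hardest
    extra_examples: int = 0  # extra examples offered so far

def estimate_confusion(frame) -> float:
    """Placeholder for an emotion-analysis model: returns a confusion score in [0, 1]."""
    raise NotImplementedError

def adapt_lesson(state: LessonState, confusion: float, threshold: float = 0.7) -> LessonState:
    """If the learner looks lost, offer another example and ease the difficulty."""
    if confusion > threshold:
        state.extra_examples += 1
        state.difficulty = max(1, state.difficulty - 1)
    return state

# Usage: each webcam frame updates the lesson plan.
# state = LessonState()
# for frame in webcam_stream():          # webcam_stream() is hypothetical
#     state = adapt_lesson(state, estimate_confusion(frame))
```

The point is only the shape of the loop: a per-frame emotion score drives a small adjustment to the material, just as a teacher would adjust in class.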
> Emotion analysis for autism. Applications of el Kaliouby's work are already under way in important areas. Automated emotion analysis helps people with autism, who often have difficulty reading the emotional subtext of a conversation, to understand it better by interpreting the other person's cues and giving them meaning.
A device resembling a pair of glasses can signal to its wearer when they are missing important unspoken cues, so they do not have to rely on speech alone to judge a given situation.
– The feelings of Internet users
> Gauging the attention of webinar participants. El Kaliouby also uses her tools to measure how engaged an audience is during online seminars. Normally, a speaker addressing a group over the network cannot tell whether the listeners are concentrating on what is being said.
With technical help, however, the speaker can see how attentive the audience is and deliver the message more effectively. El Kaliouby added that displaying this information on the speaker's screen, and reading the audience's level of engagement from their expressions, helps produce a better presentation.
Advertisers have also used the tool to test audience responses to a prospective advertising campaign. While the audience watched the advertisement, Affectiva analyzed their facial expressions, giving marketers a better sense of an ad's chances of success by measuring the audience's unspoken reactions in real time.
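A rough sketch of this kind of aggregation, assuming some emotion-analysis model has already produced per-viewer, per-frame scores such as "smile" and "attention" (the field names and weights here are invented for illustration):

```python
from statistics import mean
from typing import Dict, List

Frame = Dict[str, float]  # e.g. {"smile": 0.8, "attention": 0.6}

def engagement_score(frames_per_viewer: Dict[str, List[Frame]]) -> float:
    """Average each viewer's attention and smiling over time, then average across viewers."""
    per_viewer = []
    for viewer, frames in frames_per_viewer.items():
        if not frames:
            continue
        attention = mean(f.get("attention", 0.0) for f in frames)
        smiles = mean(f.get("smile", 0.0) for f in frames)
        per_viewer.append(0.7 * attention + 0.3 * smiles)  # weights are arbitrary
    return mean(per_viewer) if per_viewer else 0.0

print(engagement_score({
    "viewer_1": [{"attention": 0.9, "smile": 0.2}, {"attention": 0.8, "smile": 0.5}],
    "viewer_2": [{"attention": 0.4, "smile": 0.1}],
}))
```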
> The eyes and expressions of drivers. El Kaliouby also discussed giving cars the technology to monitor the driver's gaze and facial expressions, as Kia is already doing in its cars. The car can then alert the driver when his attention drifts from the road, and could help prevent accidents before they happen simply by tracking the driver's mental state and warning him when he is distracted or drowsy.
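As a hedged illustration of that monitoring loop, the sketch below assumes a hypothetical gaze model that reports whether the driver's eyes are on the road, and raises an alert after too many consecutive frames off the road; nothing here reflects Kia's or Affectiva's actual implementation.

```python
from typing import Iterable

def eyes_on_road(frame) -> bool:
    """Placeholder for a gaze/expression model applied to one cabin-camera frame."""
    raise NotImplementedError

def monitor_driver(frames: Iterable, max_off_road_frames: int = 15) -> None:
    """Warn the driver when attention has drifted for about half a second at 30 fps."""
    off_road = 0
    for frame in frames:
        off_road = 0 if eyes_on_road(frame) else off_road + 1
        if off_road >= max_off_road_frames:
            print("ALERT: eyes off the road, please refocus on driving")
            off_road = 0
```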
El Kaliouby believes the possible uses of her technology are countless. Yet as she kept working on it, improving how it translates the unspoken information hidden in conversations, she also began reflecting on the conversations in her own private life, and on how often what people said differed from what they actually meant or did.
– The ethics of the technology
There is no doubt that emotion-sensing technology can also be put to dangerous uses. Handled irresponsibly, tools that read and analyze human feelings can be used to discriminate, to manipulate, and to exploit the data they collect.
> Misuse. El Kaliouby and her colleagues have been approached about licensing their technology for security and surveillance purposes. The team has held to its principles, rejecting lucrative licensing deals that would have come at the expense of its members' values. El Kaliouby revealed that Affectiva turns away weekly offers from investors interested in developing policing-style applications of the technology.
The emotion-technology expert keeps no secrets and wants everyone to be aware of the dangers her tools can conceal, because she believes we need to think about how they are developed, how they are used, and what they mean for the future. El Kaliouby states confidently that this is only the beginning for her tools, which will inevitably touch everyone once they are built into the many devices we use.
The ethics of the technology is not only the developers' concern; it also falls on everyone who ends up using products whose effects they do not fully understand.
> Transparency in exchange for a trove of data. Above all, el Kaliouby stresses the importance of users understanding, and agreeing to, the collection of data about their faces when they use such a tool. Companies must be fully transparent about whether they are collecting information and for what purposes, especially since data gathered today for a narrow purpose could be exploited publicly in the future.
For example, cars that currently use Affectiva's tool do not record facial data. If they did, recordings could allow insurance companies to use facial expressions to judge the credibility of an accident claim, or could assist the police in their investigations. The possible uses of data only keep multiplying.
– Collecting everyone's faces … analyzing smiles and frowns, and listening for the tone of irony
> Overcoming the technology's flaws. Reductionism, the flattening of everything and everyone into a simplified state, is another drawback of emotion analysis. A tool meant to distinguish differences is supposed to "recognize" all kinds of faces, in all places and among countless people, in order to read them correctly.
Algorithms trained on a narrow dataset, by contrast, are biased: they recognize only the kinds of faces they have seen most often, which can lead the machine to conclusions that are wrong or unfair.
But training a machine to read all kinds of faces requires collecting a great deal of data from many people and cultures, and understanding a wide range of different expressions and settings.
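One simple, standard way to surface this kind of bias is to evaluate a trained expression classifier separately on each demographic or regional subgroup and compare the accuracy gap; the sketch below uses made-up groups and labels purely for illustration.

```python
from collections import defaultdict
from typing import List, Tuple

# Each sample: (group, true_label, predicted_label)
Sample = Tuple[str, str, str]

def per_group_accuracy(samples: List[Sample]) -> dict:
    """Accuracy of the classifier computed separately for each subgroup."""
    correct, total = defaultdict(int), defaultdict(int)
    for group, truth, pred in samples:
        total[group] += 1
        correct[group] += int(truth == pred)
    return {g: correct[g] / total[g] for g in total}

accuracies = per_group_accuracy([
    ("east_asia", "smile", "smile"), ("east_asia", "smile", "neutral"),
    ("latin_america", "smile", "smile"), ("latin_america", "frown", "frown"),
])
print(accuracies)                                           # {'east_asia': 0.5, 'latin_america': 1.0}
print(max(accuracies.values()) - min(accuracies.values()))  # accuracy gap worth monitoring
```

A large gap between the best-served and worst-served groups is a sign that the training data did not cover enough kinds of faces.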
> Smiles and frowns are universal expressions. Culture shapes, to some extent, the expressions that appear on our faces. El Kaliouby and her colleagues believe that smiles and frowns are universal expressions, but that cultural influences amplify or dampen them. The team found, for example, that Latin Americans are more expressive than East Asians, and that women around the world are more expressive than men.
> Irony: a tone that is hard to track. El Kaliouby also spoke about what she calls the Holy Grail of her field: an algorithm that can detect sarcasm. Though often dismissed as a trivial form of wit, sarcasm uses tone to deliberately convey the opposite of the literal message, making it a very complex kind of signal.
You could say that irony is a kind of vocal wink, and when a tool manages to understand this layered form of communication, often accompanied by a literal wink of the eye, it will count as a triumph for machine learning. But we do not yet know how a machine will manage to understand and disambiguate this kind of expression.
Two years ago, Affectiva added voice analysis to its tools, but el Kaliouby would not say how long it will take to reach that Holy Grail. She describes the tool as good, a technique that analyzes tone and expression accurately across cultures and personalities, but it remains just short of complete success.
– "Quartz", "Tribune Media" Services