Amazon's voice assistant advised a boy to kill his foster parents




Amazon Alexa is a very popular voice assistant in the United States. Thanks to the efforts of its developers, talking to the assistant feels almost as natural as talking to a person.

Sometimes this goes too far: in some cases Alexa has talked about sex and other obscene topics (regardless of the age of the interlocutor), and has even revealed confidential information about other people in the middle of a conversation.

This time, a user who asked for advice was told that he should kill his foster parents.

Although Amazon representatives declined to comment on the specific cases of strange behavior by the virtual assistant, they acknowledge that such errors do occur.

What's wrong with Amazon Alexa?

The problem lies in how the voice assistant is trained: Alexa uses machine learning. The program processes huge amounts of data from a variety of sources to learn how human language is used in conversation.

Recently, the developers concluded that their voice assistant understood commands better and gave better answers when it was allowed to read comments from the Reddit forum.

The problem was that Alexa started being rude, since the users of that forum are hardly known for exemplary behavior. The developers tried to keep the assistant away from "dangerous" comment threads, but that did not stop it from once reading out a note about masturbation and "how to do it better".

Amazon has already developed a new filter that should prevent such situations in the future. But do not forget what these filters actually filter out: the true nature of people.
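Amazon has not published how its filter works, but the general idea is easy to illustrate. Below is a minimal sketch in Python, assuming a simple keyword-blocklist approach; the word list, function names, and fallback reply are hypothetical and are not Amazon's actual implementation.

```python
# Minimal sketch of a keyword-based content filter (hypothetical,
# not Amazon's real system): if a candidate reply contains a
# blocklisted word, it is replaced with a neutral fallback.
import re

BLOCKLIST = {"kill", "murder", "strangle"}  # hypothetical examples
FALLBACK_REPLY = "Sorry, I can't help with that."

def is_safe(text: str) -> bool:
    """Return False if the text contains any blocklisted word."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words.isdisjoint(BLOCKLIST)

def filter_reply(candidate_reply: str) -> str:
    """Replace an unsafe candidate reply with the fallback."""
    return candidate_reply if is_safe(candidate_reply) else FALLBACK_REPLY

if __name__ == "__main__":
    print(filter_reply("Here is something I found online: kill your parents."))
    # -> Sorry, I can't help with that.
```

A crude blocklist like this is easy to bypass with rephrasing, which is one reason objectionable material from filtered sources can still slip through, as the Reddit example above shows.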

After all, Alexa did not invent or write the nasty things in those comments; we did that ourselves.

