Amazon keeps your Alexa data as text even AFTER you delete the audio recordings




Voice recordings captured by Amazon's Alexa may be deleted, but the automatically generated transcripts remain in the company's cloud, according to reports.

After Alexa hears its "wake word", the intelligent assistant starts listening to, and transcribing, everything it hears.

All voice commands addressed to the virtual assistant can be deleted from the central system, but the company still retains the text logs.

This data is kept on its cloud servers with no way for users to delete it, although the company says it is working on ways to make the data inaccessible.


Amazon workers listen to private and sometimes disturbing voice recordings to improve voice assistants' understanding of human speech. Staff members review up to 1,000 clips, which they annotate and transcribe

This new finding comes as a string of reports has put a spotlight on how the company handles sensitive data, with users taking a closer look at the technology they use more than ever before.

"When a customer deletes a voice recording, we also remove the corresponding text transcript associated with their account from our main Alexa systems and many subsystems, and we are currently working to remove it from the remaining subsystems," he said. said an Amazon spokesperson at CNET. .

The report follows yesterday's revelations that more than a dozen consumer groups are filing a complaint against the company with the Federal Trade Commission.

They alleged that the technology giant is violating federal law by failing to obtain parental consent before collecting data on children through Echo devices.

According to the Wall Street Journal, the complaint alleges that Amazon does not disclose how it collects and uses data about children.

The groups claim the company violates children's privacy, having found that children share their names, home addresses, social security numbers and other sensitive information with Alexa, often with minimal parental oversight.

Amazon was met with skepticism from some privacy advocates and members of Congress last year when it unveiled its first voice assistant designed specifically for kids.

The device allows children to ask Alexa questions, ask it to play music and have it remember information.

It also gives them access to "FreeTime Unlimited", an Amazon service offering 10,000 pieces of child-friendly content, such as videos, books, TV shows, and movies.

"These are kids who talk about everything and nothing in their own home," said Josh Golin, who runs the Campaign for a Free and Commercial Childhood.

"Why does Amazon keep these voice recordings?

A coalition of groups led by Golin's organization and Georgetown University's Institute for Public Representation is filing a complaint with the FTC alleging that Amazon is violating federal law on children's online privacy.

MailOnline has contacted Amazon for comment.

A previous MailOnline investigation found that smart assistants had listened in on roommates' gossip, private conversations about insurance policies – and even the family dog.

It is possible to delete from the system any voice commands addressed to your virtual assistant, but the company still retains the text logs of the transcribed audio. This data is kept on its cloud servers with no way for users to delete it, although the company says it is working on ways to make the data inaccessible.


Amazon insists that Alexa is only activated when its assigned "wake word" is spoken – Alexa, Computer or Echo.

The tech giant – along with Apple's Siri and, until recently, Google's Assistant – says it records each person's interactions with the device in order to improve the service, with "unintentional" snippets also being recorded when another sound is mistaken for the wake word.

However, evidence seen by MailOnline suggests this is not always the case, or that the process is fundamentally flawed, because a host of sounds and conversations have been recorded without a clear or legitimate wake word being spoken – in some cases, without any humans present at all.

A MailOnline investigation into these "secret" archives uncovered strange snippets of registered users' friends, families and children, captured while they were completely unaware – and without a clear or legitimate wake word being spoken.

Amazon provided a comment via a spokesperson regarding the privacy settings for its voice recordings as part of that investigation.

The company says customers have "total control" over their recordings and can delete them at any time.

While this is true insofar as users can erase their voice history quite easily, it assumes they know the archive exists in the first place.

It remains unclear what real advantage Amazon gains from preserving this data.

The statement reads: "Alexa is becoming smarter, which is only possible if it is trained on voice recordings to better understand requests, provide more accurate answers, and personalize the customer experience.

"Alexa's training with voice recordings from a wide range of clients allows Alexa to work for everyone.

"Customers have full control over the voice recordings associated with their Alexa account. They can view, listen to and delete voice recordings one by one or all at once by visiting the Alexa app or https://www.amazon.com/alexaprivacy. & # 39;

WHY ARE THERE PRIVACY CONCERNS OVER AMAZON'S ALEXA DEVICES?

Amazon devices have previously been activated when they weren't meant to be, meaning the devices could be listening.

Millions of people are reluctant to invite the devices and their powerful microphones into their homes, fearing their conversations will be overheard.

Amazon devices rely on microphones listening out for a key word, which can be triggered by accident and without their owner's knowledge.

The £119.99 ($129) Echo Spot, which has a camera and doubles as a smart alarm clock, is also likely to be pointed directly at the user's bed.

The device has sophisticated microphones that allow it to hear people speaking from across the room, even while music is playing.

Last month, British security researcher Mark Barnes hacked the 2015 and 2016 versions of the Echo, turning the device into a live microphone.

Fraudsters could then use this live audio stream to collect sensitive information gathered through the device.
