Hey, Google, why are your subcontractors listening? – Naked Security




Your Google Home voice assistant records your conversations – sometimes triggered by mistake – and those audio clips, recorded on purpose or not, are sent to contractors working on Google Home's voice processing.

How it's supposed to work: Google Home should only activate when someone says the trigger phrases "OK, Google" or "Hey, Google". But it's not hard to set it off accidentally: if someone near you says "Google", or even a word that sounds like it, the speaker often starts recording.

The audio clips have included sounds from people's bedrooms, the voices of their children or grandchildren, payment details from transactions, medical information disclosed while searching for ailments, and much more.

All this comes from a new report by the Belgian broadcaster VRT News, which relied on input from three Google insiders.

Listening to children

With the whistleblower's help, VRT listened to some of the clips. Its reporters managed to hear enough to work out the addresses of several Dutch and Belgian Google Home users, despite the fact that some of them had never spoken the trigger phrases. One couple looked surprised and uncomfortable when the reporters played back recordings of their grandchildren.

The whistleblower who leaked the recordings was working as a Google subcontractor, transcribing the audio files for later use in improving speech recognition. They contacted VRT after learning how Amazon workers listen to what you say to Alexa, as Bloomberg reported in April.

They listen, but they don't necessarily delete: a few weeks ago, Amazon confirmed – in a letter responding to a lawmaker's request for information – that it keeps transcripts and recordings from its Alexa devices forever, unless a user asks for them to be deleted.

VRT spoke with cybersecurity expert Bavo Van den Heuvel, who pointed out the potential dangers of people listening in on our voice assistant recordings, given that the devices can be used almost anywhere: in a doctor's office, in a work meeting dealing with sensitive files, in police stations, lawyers' offices, or courts.

It's not just Belgian and Dutch contractors who listen to Google Home recordings, although those were the only recordings VRT listened to. At a press briefing, the whistleblower showed a platform containing recordings from around the world, which suggests that thousands of subcontractors are likely listening to assistant recordings. From VRT:

This employee let us take a look at the system in which employees have to listen to recordings made by Google Assistant. Worldwide, there must be thousands of employees; in Flanders and the Netherlands, about a dozen employees are likely listening to recordings from Dutch-speaking users.

"Anonymous" data?

Google knows full well that its subcontractors can listen to these recordings, and it's aware of the privacy issues that raises. To keep these subcontractors from identifying the people they're listening to, Google strips identifying information from the recordings.

Of course, it's common for data-hungry companies to point to the absence of identifying data and equate that absence with privacy protection. But in these days of Big Data, the claim has turned out to be flawed. After all, as we've noted in the past, individually harmless data points can be extremely powerful and revealing when aggregated. That is, in fact, the essence of Big Data.

Take, for example, the research done by MIT graduate students a few years ago to see how easy it would be to re-identify people from three months of anonymized credit card transaction data.

The result: with knowledge of just 10 of your transactions – easy enough to accumulate if you buy coffee from the same shop every morning, park in the same lot every day, and pick up your newspaper at the same newsstand – the researchers found they had a better than 80% chance of identifying you.
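The idea behind that kind of re-identification can be sketched in a few lines of code. This is a toy illustration of linking known data points against an "anonymized" log – not the MIT study's actual method or data, and all the merchants, pseudonyms, and record shapes here are made up:

```python
# Each record in the "anonymized" log: (pseudonym, merchant, day).
# Real names are absent, but the behavior pattern is still there.
anonymized_log = [
    ("user_a", "coffee_shop", 1), ("user_a", "parking_lot", 1),
    ("user_a", "newsstand", 1),   ("user_a", "coffee_shop", 2),
    ("user_b", "coffee_shop", 1), ("user_b", "gas_station", 1),
    ("user_b", "newsstand", 2),
    ("user_c", "parking_lot", 1), ("user_c", "coffee_shop", 2),
]

def candidates(log, known_points):
    """Return pseudonyms whose records contain every known (merchant, day) point."""
    by_user = {}
    for pseudonym, merchant, day in log:
        by_user.setdefault(pseudonym, set()).add((merchant, day))
    return sorted(u for u, points in by_user.items() if known_points <= points)

# An observer who knows only one of your transactions still sees ambiguity...
print(candidates(anonymized_log, {("coffee_shop", 1)}))
# → ['user_a', 'user_b']

# ...but a handful of mundane, everyday points narrows it to exactly one person.
print(candidates(anonymized_log,
                 {("coffee_shop", 1), ("parking_lot", 1), ("newsstand", 1)}))
# → ['user_a']
```

Each known transaction on its own is harmless; it's the intersection of several of them that singles you out of the crowd, which is why stripping names from a dataset is not the same as anonymizing it.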