Apple staff listen with Siri and hear sensitive recordings




Apple employees listen to Siri recordings and regularly hear sensitive material, including confidential medical information, drug deals, and recordings of sexual encounters, reports The Guardian. Apple allows employees to listen to a selection of Siri recordings in order to improve speech recognition.

Earlier this month, it emerged that Google also lets employees listen to recordings from its Google Assistant. Amazon does the same with recordings from its assistant Alexa. Both companies faced strong criticism because users were never informed that they might be recorded.

"Less than 1% of questions"

Nowhere does Apple tell customers that Siri recordings can be listened to by humans, even when the assistant is activated inadvertently. "A small proportion of Siri requests are analyzed to improve Siri and the dictation function," Apple said in a response.

"User requests are not linked to their Apple ID, Siri responses are reviewed in secure facilities, and all reviewers must comply with Apple's strict confidentiality requirements." Apple says that less than 1% of Siri's daily activations are listened to by employees. The recordings are selected at random and are usually only a few seconds long.

"Involuntary activation too easy"

The Guardian's anonymous source raised the alarm because Siri can often be triggered accidentally. This happens when people hold down a button on their iPhone for too long, or when they say something that Siri mistakes for the activation phrase "Hey Siri".

Apple Watch users can even activate Siri by raising the watch to their mouth, a gesture the watch can also misinterpret. According to the anonymous source, Siri is frequently activated by accident.

Google and Amazon offer users the option of opting out of having their voice recordings used for analysis. By default, however, the analysis is enabled, and users are barely informed of it. With Siri, Apple offers users no option at all to opt out of the analysis of their data.
