WhatsApp says it won’t scan your photos for child abuse





Photo: Carl Court / Staff (Getty Images)

Apple’s new tool to detect potential child abuse imagery in iPhone photos is already causing controversy. Just a day after its announcement, Will Cathcart, the head of Facebook’s messaging app WhatsApp, said Friday that the company would refuse to adopt the software on the grounds that it introduced a host of legal and privacy concerns.

“I read the information published by Apple yesterday and I am worried. I think this is the wrong approach and a setback for the privacy of people all over the world,” Cathcart tweeted. “People asked if we would adopt this system for WhatsApp. The answer is no.”

In a series of tweets, Cathcart expanded on these concerns, citing the potential for governments and spyware companies to co-opt the software and the risk that unvetted software could violate people’s privacy.

“Can this scanning software running on your phone be error-proof? Researchers have not been allowed to find out,” he wrote. “Why not? How will we know how often mistakes violate people’s privacy?”

In its software announcement Thursday, Apple said it planned the update for a late 2021 release as part of a series of changes the company intends to implement to protect children from sexual predators. As Gizmodo previously reported, the proposed tool, which would use a “neural matching function” called NeuralHash to determine whether images on a user’s device match known fingerprints of child sexual abuse material (CSAM), has already caused some consternation among security experts.
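At a high level, matching against a database of known image fingerprints can be sketched roughly as follows. This is an illustrative approximation only: Apple has not published NeuralHash’s internals, and the function and database below are placeholders. A real perceptual hash tolerates resizing and re-encoding, which the cryptographic stand-in here does not.

```python
import hashlib  # stand-in only; a real perceptual hash is not cryptographic

# Hypothetical database of fingerprints of known CSAM, of the kind
# maintained by child-safety organizations. Entries are placeholders.
KNOWN_CSAM_HASHES = {"placeholder-hash-1", "placeholder-hash-2"}

def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash.

    A real perceptual hash maps visually similar images to the same
    digest, surviving resizing and re-encoding; SHA-256 is used here
    only to make the sketch runnable and has no such property.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    # On-device comparison: only the fingerprint is checked against
    # the database; the image itself is never inspected here.
    return image_fingerprint(image_bytes) in KNOWN_CSAM_HASHES
```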

In an August 4 Twitter thread, Matthew Green, an associate professor at the Johns Hopkins Information Security Institute, warned that the tool could eventually be used to “add surveillance to encrypted messaging systems.”

“I have had independent confirmation from several people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a very bad idea,” Green tweeted. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash and report them to Apple’s servers if too many appear.”

But according to Apple, Cathcart’s characterization of the software as being used to “scan” devices is not entirely accurate. While scanning implies a result, the company said, the new software would merely run a comparison of any images a given user chooses to upload to iCloud using the NeuralHash tool. The results of that scan would be contained in a cryptographic safety voucher (essentially a bag of interpretable data bits on the device), and the contents of that voucher would need to be sent out in order to be read. In other words, Apple would not be collecting any data from individual users’ photo libraries as a result of such a scan, unless an account were amassing troves of known CSAM.
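A much-simplified illustration of that threshold behavior might look like the sketch below. The class names and the threshold value are hypothetical, and the real system reportedly relies on threshold secret sharing so that no individual voucher is readable on its own; plain booleans stand in for that cryptography here.

```python
from dataclasses import dataclass, field

MATCH_THRESHOLD = 30  # hypothetical value; Apple has not confirmed a number

@dataclass
class SafetyVoucher:
    """Toy stand-in for a cryptographic safety voucher.

    In the real design the match result is encrypted, so no single
    voucher is readable; a plain boolean replaces that here.
    """
    image_id: str
    matched: bool

@dataclass
class AccountVouchers:
    vouchers: list = field(default_factory=list)

    def upload(self, voucher: SafetyVoucher) -> None:
        self.vouchers.append(voucher)

    def ready_for_manual_review(self) -> bool:
        # Voucher contents stay opaque until enough matches accumulate;
        # only past the threshold could they be decrypted and reviewed.
        matches = sum(v.matched for v in self.vouchers)
        return matches >= MATCH_THRESHOLD
```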

According to Apple, while there is some risk of a misread, the rate of accounts falsely flagged for manual review would be less than one in one trillion per year.
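One way to sanity-check a figure like that: if each image were an independent trial with some small false-match probability, the chance of an account crossing the match threshold by accident is a binomial tail. The inputs below are purely illustrative assumptions, not Apple’s published parameters.

```python
from math import exp, lgamma, log, log1p

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """Log of the binomial probability mass, computed in log space
    so that huge binomial coefficients do not overflow a float."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log1p(-p))

def prob_account_flagged(p: float, n: int, t: int) -> float:
    """Chance that at least t of n images falsely match, treating each
    image as an independent trial with false-match probability p."""
    return sum(exp(log_binom_pmf(n, k, p)) for k in range(t, n + 1))

# Purely illustrative inputs: a 1-in-a-million per-image false-match
# rate, a 10,000-photo library, and a threshold of 30 matches.
print(prob_account_flagged(p=1e-6, n=10_000, t=30))
```

With made-up inputs like these the tail probability comes out astronomically small, which is the shape of argument behind a one-in-a-trillion claim; the actual figure depends on per-image error rates Apple has not published.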


