Apple to start checking iPhone and iCloud photos for child abuse images

The new service will convert photos on devices into an unreadable set of hashes – long strings of numbers – stored on users’ devices, the company explained at a press conference. Those numbers will be compared against a hash database provided by the National Center for Missing and Exploited Children.
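
In concept, the matching step boils down to computing a fingerprint for each photo and testing membership in the supplied hash set. The Swift sketch below illustrates only that idea; the names (PhotoHash, KnownHashDatabase, perceptualHash) and the toy hash function are invented stand-ins, not Apple’s NeuralHash or its real database format.

```swift
import Foundation

// Illustration only: PhotoHash, KnownHashDatabase and perceptualHash are
// invented stand-ins for Apple's hashing scheme and the NCMEC-supplied
// hash set, whose real formats are not public.

typealias PhotoHash = UInt64  // a compact fingerprint derived from an image

struct KnownHashDatabase {
    // Hashes provided by the National Center for Missing and Exploited Children.
    private let hashes: Set<PhotoHash>

    init(hashes: Set<PhotoHash>) {
        self.hashes = hashes
    }

    func contains(_ hash: PhotoHash) -> Bool {
        hashes.contains(hash)
    }
}

// Stand-in for a perceptual hash: a real one fingerprints visual content,
// not raw bytes as this toy function does.
func perceptualHash(of photoData: Data) -> PhotoHash {
    photoData.reduce(UInt64(5381)) { ($0 << 5) &+ $0 &+ UInt64($1) }
}

// A photo is flagged only when its hash appears in the known-image database.
func matchesKnownImage(_ photoData: Data, in database: KnownHashDatabase) -> Bool {
    database.contains(perceptualHash(of: photoData))
}
```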

By taking this step, Apple (AAPL) follows other big tech companies such as Google (GOOG) and Facebook (FB). But it is also trying to strike a balance between safety and privacy, the latter of which Apple has long promoted as a central selling point for its devices.

Some privacy advocates were quick to voice concerns about the effort.

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope creep not only in the United States, but around the world,” said Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

In a post on its website describing the updates, the company said, “Apple’s method… is designed with user privacy in mind.” Apple stressed that the tool does not “scan” user photos and that only images already in the database can be flagged as matches. (That should mean a harmless photo a parent takes of their child in the bathtub will not be reported.)

Apple also said the device creates a doubly encrypted “security voucher” – a packet of information about the match – that is uploaded to its servers along with the photo. Once an account accumulates a certain number of flagged security vouchers, Apple’s review team is alerted. The team then decrypts the vouchers, deactivates the user’s account, and alerts the National Center for Missing and Exploited Children, which can notify law enforcement. Users who believe their accounts have been flagged in error can file an appeal to have them reinstated.
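
A minimal sketch of that threshold mechanism follows, again with invented names (SecurityVoucher, AccountReviewState) and an arbitrary placeholder threshold, since Apple described only “a certain number” of matches rather than an exact figure.

```swift
import Foundation

// Sketch of the threshold idea described above, not Apple's implementation.
// The voucher contents and the threshold value are placeholders.

struct SecurityVoucher {
    let encryptedPayload: Data  // doubly encrypted information about one match
}

struct AccountReviewState {
    // Placeholder value: Apple has only said review happens after a number
    // of matching vouchers accumulate for an account.
    static let reviewThreshold = 25

    private(set) var matchedVouchers: [SecurityVoucher] = []

    // Vouchers accumulate server-side. Only once the count crosses the
    // threshold is the review team alerted and the vouchers decrypted.
    mutating func record(_ voucher: SecurityVoucher) -> Bool {
        matchedVouchers.append(voucher)
        return matchedVouchers.count >= Self.reviewThreshold
    }
}
```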

Apple’s goal is to ensure that identical and visually similar images produce the same hash, even if they have been slightly cropped, resized, or converted from color to black and white.
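
To make that property concrete: a perceptual hash maps visually similar images to identical, or nearly identical, bit patterns, so a small crop or a grayscale conversion barely moves the hash. The comparison below is generic perceptual-hash logic, not Apple’s algorithm.

```swift
// Generic perceptual-hash comparison, not Apple's NeuralHash. Hashes that
// differ in few bits come from visually similar images; an exact-match
// scheme, as described above, requires the distance to be zero.

func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

func isSameImage(_ a: UInt64, _ b: UInt64) -> Bool {
    hammingDistance(a, b) == 0
}
```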

“The reality is that the privacy and protection of children can coexist,” said John Clark, president and CEO of the National Center for Missing & Exploited Children, in a statement. “We applaud Apple and look forward to working together to make this world a safer place for children.”

The announcement is part of a broader push around child safety by the company. Apple said Thursday that a new communication tool will also warn users under the age of 18 when they are about to send or receive a message containing a sexually explicit image. The tool, which must be enabled through Family Sharing, uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. Parents of children under 13 can also turn on notifications if their child is about to send or receive a nude image. Apple said it would not have access to the messages.

This tool will be available as a future software update, according to the company.
