Apple VP Acknowledges Concerns Over New Scan Feature In Internal Memo




Apple’s forthcoming feature that will scan iOS devices for images of child abuse is an “important mission,” a company vice president of software wrote in an internal memo. First reported by 9to5Mac, the note from Sebastian Marineau-Mes acknowledges that the new protections worry some people, but says the company “will maintain Apple’s deep commitment to user privacy.”

As part of its expanded protections for children, Apple plans to scan images on iPhones and other devices before they are uploaded to iCloud. If the system finds an image that matches one in the National Center for Missing and Exploited Children (NCMEC) database, a human at Apple will examine the image to confirm whether it contains child pornography. If confirmed, NCMEC will be notified and the user’s account will be deactivated.

The announcement sparked concerns among privacy advocates, who questioned how Apple could prevent the system from being exploited by bad actors. The Electronic Frontier Foundation said in a statement that “it is impossible to create a client-side scanning system that can only be used for sexually explicit images sent or received by children” and that the system, however well-meaning, “will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

According to 9to5Mac, Marineau-Mes wrote in the memo that the project involved “a deep cross-functional commitment” across the company that “delivers tools to protect children, but also maintain Apple’s deep commitment to user privacy.”

Apple did not immediately respond to a request for comment on Friday.
