Apple Responds to Rising Alarm Over iPhone Photo Scan Feature




Apple has responded to growing alarm from privacy experts and competitors over its new iPhone photo-scanning feature.

Last week, the company announced that it would roll out new tools that can scan photos on a user’s phone and check whether they include child sexual abuse material, or CSAM.

Apple said the feature was designed with privacy in mind and that the actual scanning takes place on a person’s iPhone rather than on Apple’s systems. The company said it will only be able to see photos if they closely match an existing database of known child sexual abuse images.
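In broad strokes, that kind of on-device matching can be sketched as follows. This is a simplified illustration only, not Apple’s actual implementation, which uses its NeuralHash perceptual hash and additional cryptography; the hash function, database contents and function names below are placeholders.

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery, supplied by child
# safety organizations. In this sketch they are plain SHA-256 digests;
# Apple's real system uses NeuralHash, a perceptual hash designed to also
# match visually similar (resized, recompressed) copies of an image.
KNOWN_IMAGE_HASHES = {
    "placeholder_hash_1",
    "placeholder_hash_2",
}


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of a photo's contents."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_on_device(photos: list[bytes]) -> list[str]:
    """Compare each photo's hash against the known database, on the device.

    Only hashes that match the database are surfaced; photos that do not
    match reveal nothing about their contents.
    """
    matches = []
    for photo in photos:
        digest = image_hash(photo)
        if digest in KNOWN_IMAGE_HASHES:
            matches.append(digest)
    return matches
```

Apple’s published design layers cryptography on top of this basic idea, wrapping match results in “safety vouchers” so that even matches only become readable to Apple once an account exceeds a threshold of matching images.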

Despite these assurances, the new feature has come under heavy criticism from privacy and security activists, who say it could weaken fundamental iPhone protections and be misused to target innocent users.

Critics have suggested, for example, that governments could force Apple to add other types of images to its database, allowing despotic regimes to use the tool to hunt down dissidents. Other fears include the possibility of the system going awry and flagging other kinds of images for Apple review, effectively stripping the privacy of entirely innocent photos.

Apple responded to these criticisms in a new frequently asked questions document posted on its website, titled “Expanded Protections for Children.”

In the introduction to the document, the company acknowledged that while the features had received support from some organizations, others had “asked questions”.

The document first seeks to answer questions about the tool known as “communication safety in Messages,” which scans photos sent to children for signs of abuse. It notes that Apple never has access to these communications, that end-to-end encryption is still provided, and that children will be notified before information is shared with their parents.

It then discusses the more controversial feature, known as “CSAM detection.” In this section, Apple makes a number of commitments designed to allay concerns about the new feature.

It states that Apple will not scan all photos, only those that have been uploaded to iCloud Photos, suggesting that phones with iCloud Photos disabled will be exempt. Apple had not previously said explicitly that there would be a way to turn off this scanning.

Apple also pledges that it designed the system only to detect images of child sexual abuse, apparently in response to concerns that the scope of the feature might be expanded in the future.

The company says that if it is asked to add other types of images to the database, it will “refuse such requests.”

“We have already faced requests to create and deploy government-imposed changes that degrade user privacy, and we have firmly refused those requests,” said Apple. “We will continue to refuse them in the future. Let’s be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any request from a government to extend it.”

The document also denies that there is any way to do this without Apple’s help, by “injecting” other types of images into the database to be flagged. It states that Apple does not add new images to that database, which comes from child safety organizations, and that because the database is the same for everyone, it would not be possible to modify it for a specific user.

It also indicates that “there is no automated report to the police,” and that any report passed to the authorities would first be reviewed by Apple. “In the unlikely event that the system reports images that do not match the known CSAM images, the account will not be deactivated and no report will be filed with NCMEC,” the document says.
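As a rough illustration of that gate, the following hypothetical sketch captures the flow the FAQ describes: matches alone trigger nothing, an Apple reviewer must confirm them, and only confirmed matches result in a report to NCMEC (the US National Center for Missing & Exploited Children). The threshold value and names are illustrative, not Apple’s published parameters.

```python
MATCH_THRESHOLD = 30  # illustrative value only, not a published Apple parameter


def reporting_decision(match_count: int, reviewer_confirms: bool) -> str:
    """Sketch of the reporting gate described in the FAQ.

    Nothing goes automatically to law enforcement: accounts below the match
    threshold see no action at all, and flagged accounts are reported to
    NCMEC only if Apple's human review confirms the matches.
    """
    if match_count < MATCH_THRESHOLD:
        return "no action"
    if not reviewer_confirms:
        return "account stays active, no report filed"
    return "report filed with NCMEC"


# A flagged account whose images turn out not to be CSAM never reaches NCMEC.
print(reporting_decision(match_count=40, reviewer_confirms=False))
# -> account stays active, no report filed
```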
