Apple to reject requests to use CSAM system for surveillance




Steve Proehl | Corbis Unreleased | Getty Images

Apple on Monday defended its new system for scanning iCloud for illegal child sexual abuse material (CSAM), amid an ongoing controversy over whether the system reduces the privacy of Apple users and could be used by governments to surveil citizens.

Apple last week announced that it had begun testing a system that uses sophisticated cryptography to identify when users upload collections of known child pornography to its cloud storage service. The company says it can do this without learning anything about the content of a user’s photos stored on its servers.

Apple reiterated Monday that its system is more private than those used by companies like Google and Microsoft because it uses both Apple’s servers and software running on iPhones.

Privacy advocates and tech commentators worry that Apple’s new system, which includes software that will be installed on people’s iPhones via an iOS update, could be expanded in some countries by new laws to check for other types of images, such as photos with political content, instead of only child pornography.

Apple said in a document posted to its website on Sunday that governments cannot force it to add non-CSAM images to the hash list, the file of numbers corresponding to known child abuse images that Apple will distribute to iPhones to enable the system.
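To make the hash-list idea concrete, here is a minimal sketch of matching a photo’s digest against a set of known fingerprints. It is an illustration only, assuming a plain SHA-256 comparison; Apple’s actual system is described as using a perceptual “NeuralHash” with cryptographic blinding, and the hash values and function names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only. Apple's documented system uses a perceptual
// "NeuralHash" and blinded matching, not plain SHA-256 equality; the
// digests and names below are hypothetical placeholders.
let knownHashList: Set<String> = [
    "3f29a1c4…"   // stand-ins for NCMEC-provided image fingerprints
]

func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

func matchesKnownList(_ imageData: Data) -> Bool {
    knownHashList.contains(digest(of: imageData))
}
```

The point of the sketch is only that the hash list contains numbers derived from known images, not the images themselves, so expanding what the system flags would mean adding new entries to that list.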

“Apple will refuse such requests. Apple’s CSAM detection capability is designed only to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups,” Apple said in the document. “We have already faced requests to create and deploy government-mandated changes that degrade user privacy, and we have firmly refused those requests. We will continue to refuse them in the future.”

The document continued: “Let’s be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any request from a government to expand it.”

Some cryptographers are worried about what might happen if a country like China were to pass a law saying the system must also include politically sensitive images. Apple CEO Tim Cook has previously said the company follows the laws of every country where it does business.

Businesses in the United States are required to report CSAM to the National Center for Missing & Exploited Children and face fines of up to $300,000 when they discover illegal imagery and fail to report it.

A reputation for privacy

Apple’s reputation for protecting privacy has been cultivated for years through its actions and marketing. In 2016, Apple took on the FBI in court to protect the integrity of its on-device encryption systems as part of a mass shooter investigation.

But Apple has also faced significant pressure from law enforcement officials over the possibility of criminals “going dark,” or using privacy tools and encryption to keep messages or other information out of the reach of law enforcement.

Controversy over Apple’s new system, and whether it monitors users, threatens Apple’s public reputation for building secure, private devices, which the company has used to break into new markets in personal finance and health care.

Critics fear that the system will work partly on the iPhone itself, instead of scanning only photos that have been uploaded to the company’s servers. Apple’s competitors typically scan only photos stored on their own servers.

“It’s really disappointing that Apple has held onto its particular view of privacy so much that it has ended up betraying the fulcrum of user control: being able to trust that your device is truly yours,” tech commentator Ben Thompson wrote in a newsletter Monday.

Apple continues to champion its system as a genuine improvement that protects children and will reduce the amount of CSAM being created, all while protecting the privacy of iPhone users.

Apple said its system is significantly stronger and more private than previous systems by every privacy measure the company tracks, and that it went out of its way to build a better system to detect these illegal images.

Unlike current systems, which run in the cloud and cannot be inspected by security researchers, Apple’s system can be inspected through its distribution in iOS, an Apple representative said. By moving certain processing onto the user’s device, the company can achieve stronger privacy properties, such as being able to find CSAM matches without running software on Apple servers that checks every photo.
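As a rough illustration of what moving processing to the device means in practice, the hypothetical sketch below has the device make the match decision locally and hand the server only the upload plus an opaque result. The type and function names are illustrative, not Apple APIs; Apple’s published design wraps the result in cryptographic “safety vouchers” rather than exposing a readable flag.

```swift
import Foundation

// Hypothetical sketch of the on-device flow described above: the device,
// not the server, decides whether a photo matches the known-CSAM list,
// and the server only ever receives the upload plus an opaque result.
// `MatchVoucher` and `prepareForUpload` are illustrative names, not Apple
// APIs; the real design uses cryptographic "safety vouchers" instead of
// a readable Boolean.
struct MatchVoucher {
    let photoID: UUID
    let matchedKnownList: Bool
}

func prepareForUpload(photoID: UUID,
                      imageData: Data,
                      isKnownMatch: (Data) -> Bool) -> (payload: Data, voucher: MatchVoucher) {
    // Evaluate the photo locally, before anything leaves the device.
    let voucher = MatchVoucher(photoID: photoID,
                               matchedKnownList: isKnownMatch(imageData))
    return (imageData, voucher)
}
```

In this arrangement the `isKnownMatch` closure could be the hash-list check sketched earlier, so the photo’s content never needs to be evaluated on the server side.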

Apple said on Monday that its system does not scan private photo libraries that have not been uploaded to iCloud.

Apple has also confirmed that it will process photos that have already been uploaded to iCloud. The changes will roll out via an iPhone update later this year, after which users will be notified that Apple is beginning to check photos stored on iCloud against a list of fingerprints that correspond to known CSAM, Apple said.


