Apple said today it will deny any government request to expand its new photo scanning technology beyond the current plan to use it only to detect CSAM (child sexual abuse material).
Apple has faced days of criticism from security experts, privacy advocates, and privacy-conscious users over the plan it announced on Thursday, under which iPhones and other Apple devices would scan photos before they are uploaded to iCloud. Many critics have pointed out that once the technology is installed on consumer devices, it won’t be difficult for Apple to expand it beyond CSAM detection in response to government demands for broader surveillance. We covered how the system works in detail in an article Thursday night.
For years, governments have pressured Apple to install back doors in its end-to-end encryption systems, and Apple has acknowledged that governments are likely to make exactly the demands that security experts and privacy advocates have warned about. In an FAQ posted today under the headline “Expanded Protections for Children,” one question asks, “Could governments force Apple to add non-CSAM images to the hash list?”
Apple answers the question as follows:
Apple will refuse such requests. Apple’s CSAM detection capability is designed only to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC (National Center for Missing and Exploited Children) and other child safety groups. We have already faced demands to create and deploy government-mandated changes that degrade user privacy, and we have firmly refused those demands. We will continue to refuse them in the future. Let us be clear: this technology is limited to detecting CSAM stored in iCloud, and we will not accede to any government request to expand it. Additionally, Apple conducts a human review before making a report to NCMEC. If the system flags photos that do not match known CSAM images, the account will not be disabled and no report will be filed with NCMEC.
None of this means Apple lacks the ability to expand the technology’s uses, of course. Asked whether its photo scanning system could be used to detect things other than CSAM, Apple said the system “is designed to prevent this from happening.”
“CSAM detection for iCloud Photos is designed so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations,” Apple said. “There is no automated reporting to law enforcement, and Apple conducts a human review before making a report to NCMEC. Therefore, the system is only designed to report known CSAM photos in iCloud Photos.”
Apple says it won’t inject more photos into the database
But the system’s current design does not prevent it from being redesigned and used for other purposes in the future. The new photo scanning technology itself is a major change for a company that has used privacy as a selling point for years and calls privacy a “fundamental human right.”
Apple said the new system will roll out later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and will launch in the United States only at first. Under the current plan, Apple devices will scan user photos and flag any that match a database of known CSAM image hashes. The Apple FAQ implicitly acknowledges that hashes of other image types could be added to the list, but the document says Apple will not do so.
“Could non-CSAM images be ‘injected’ into the system to flag accounts for things other than CSAM? Our process is designed to prevent this from happening,” Apple wrote. “The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes.”
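To make the hash-matching concept concrete, here is a minimal, purely illustrative Swift sketch of checking a photo against a set of known hashes. It is not Apple’s implementation: the real system uses a proprietary perceptual hash (NeuralHash) plus on-device cryptographic matching and a review threshold, whereas this sketch uses an ordinary SHA-256 digest and a hypothetical hash list.

```swift
import CryptoKit
import Foundation

// Hypothetical hash list standing in for the database Apple says is supplied
// by NCMEC and other child safety organizations.
let knownImageHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Returns true if the photo's hash appears in the known list.
/// Note: Apple's system uses a perceptual hash that tolerates resizing and
/// re-encoding; SHA-256 here is only a stand-in and matches byte-identical
/// files only.
func matchesKnownList(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}
```

The point of the sketch is simply that matching happens against a fixed list of hashes: whatever is in that list determines what gets flagged, which is why critics focus on who controls the database rather than on the matching code itself.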
Apple also said the new feature “only affects users who have chosen to use iCloud Photos to store their photos. It does not affect users who have not chosen to use iCloud Photos.” Apple’s FAQ doesn’t say how many people use iCloud Photos, but it is a widely used feature. There are over a billion actively used iPhones around the world, and a 2018 estimate from analysts at Barclays put iCloud (counting all of its services, not just iCloud Photos) at 850 million users.
Apple memo calls privacy advocates “screeching voices of the minority”
Apple doesn’t seem to have anticipated the level of criticism its decision to scan users’ photos would receive. On Thursday evening, Apple distributed an internal memo that acknowledged the critics but dismissed them as “the screeching voices of the minority.”
That part of the memo was written by Marita Rodriguez, NCMEC’s executive director of strategic partnerships. “I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority. Our voices will be louder. Our commitment to lift up the children who have lived through the most unimaginable abuse and victimization will be stronger,” Rodriguez wrote.
The memo was obtained and published by 9to5Mac. The portion written by Apple said, “We’ve seen a lot of positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built.”
Open letter warns of expanding uses of surveillance
More than 6,000 people signed an open letter urging Apple to change course, saying that “Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates” toward making strong privacy-preserving measures the norm across the majority of consumer electronic devices and use cases.
The letter cites several security experts, including researcher Nadim Kobeissi, who wrote, “Reminder: Apple sells iPhones without FaceTime in Saudi Arabia because local regulation prohibits encrypted phone calls. This is just one example of many where Apple has bowed to local pressure. What happens when local regulation mandates that messages be scanned for homosexuality?”
The letter also quotes Johns Hopkins University cryptography professor Matthew Green, who told Wired, “The pressure is going to come from the UK, US, India, China. I’m terrified. Why would Apple want to tell the world, ‘Hey, we have this tool’?”