Apple has suddenly found itself trapped in a security and privacy nightmare, just as the iPhone 13 hit the streets. It now threatens to overshadow the next 12 months leading up to the iPhone 14, and it is starting to look like an unresolvable problem for Apple.
The run-up to this week’s Apple iPhone and iOS launches has been a mess by the company’s standards. Repeated security warnings, as Israel’s NSO allegedly exploited multiple vulnerabilities, added to Apple’s awkward and ill-conceived plan to run on-device filtering for known child abuse imagery and for explicit photos in iMessage.
Apple remains committed to catching up with its peers and adding some form of CSAM filtering to its cloud platform, but given the backlash it has “decided to take more time over the next few months to gather feedback and make improvements before releasing these extremely important child safety features.”
The sharpest criticism of Apple’s plans was the risk that it would bow to demands from the US and foreign governments to extend the reach of its on-device controls, that compliance with “local laws” would force its hand. Apple’s defense was to insist it would never comply with such demands, that it had safeguards in place. Unfortunately, just around the time of the iPhone 13’s launch, those assurances were very publicly undercut.
This time it is not China, where Apple has continually struggled to push back against government pressure, but Russia. Apple has a habit of removing apps that displease Chinese officials, and now it has done the same with a voting app that threatened to undermine the status quo in Russia.
As reported by the New York Times, “Apple and Google removed an app on Friday intended to coordinate protest voting in this weekend’s Russian elections … The decisions came after Russian authorities, who claim the app is illegal, threatened to prosecute local employees of Apple and Google.”
For Apple, this couldn’t have come at a worse time. As the Times noted, what happened in Russia was “a demonstration of the limits of Silicon Valley when it comes to resisting the crackdown on dissent around the world.” It completely undermines the assurances against scope creep that Apple pushed so hard on CSAM in response to the recent backlash.
It turns out Apple couldn’t dodge its own mantra that what happens on your iPhone stays on your iPhone. But Apple has also boxed itself in awkwardly. The company argued that filtering photos in the cloud, as performed by Google, Microsoft and others, is an invasion of privacy: “Existing techniques as implemented by other companies,” Apple said, “analyze all user photos stored in the cloud. This creates a privacy risk for all users.”
Apple used this risk to push its alternative onto the device, “providing significant privacy benefits by preventing Apple from learning about photos unless they both match known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM.” With hindsight, it should have left itself more room to maneuver.
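To make the distinction concrete, here is a minimal sketch (in Python) of the threshold-matching idea Apple described, assuming a simple hash lookup. The hash function, the known-hash database and the threshold value below are hypothetical stand-ins; Apple’s published design relies on a perceptual hash (NeuralHash) plus cryptographic safeguards rather than a plain set lookup.

```python
# Illustrative sketch only, not Apple's actual implementation: the hash
# function, the known-hash database and the threshold are hypothetical.
import hashlib
from typing import Iterable, Set

KNOWN_CSAM_HASHES: Set[str] = set()  # hypothetical database of hashes of known images
MATCH_THRESHOLD = 30                 # hypothetical per-account threshold before any flag

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; Apple's NeuralHash is designed to
    # survive resizing and re-encoding, which a plain SHA-256 would not.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photos: Iterable[bytes]) -> int:
    # Count how many of the photos destined for iCloud match the known-hash set.
    return sum(1 for photo in photos if image_hash(photo) in KNOWN_CSAM_HASHES)

def should_flag_for_human_review(photos: Iterable[bytes]) -> bool:
    # Nothing is surfaced until the per-account match count crosses the
    # threshold; only then would human review (and any report) follow.
    return count_matches(photos) >= MATCH_THRESHOLD
```

The point of the sketch is the threshold: under Apple’s description, nothing about an account is surfaced for human review until the count of matches against known CSAM crosses it, which is the privacy property the company contrasted with blanket cloud scanning.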
As I have said before, reintroducing on-device filtering will be impossible without a repeat of the backlash. Apple has already explained in detail the security and privacy measures protecting its approach, and that didn’t help last time. It’s hard to see what else it could do, other than water the measures down until they are toothless.
If Apple had any doubts as to the strength of feeling, then the well-reported protests outside Apple stores will quickly have removed them. “Organized by Fight for the Future, the Electronic Frontier Foundation and a network of volunteers, the protests demand that Apple permanently drop its dangerous proposal to install photo and message scanning malware on millions of devices.”
What Apple should have done is quietly add some form of CSAM photo filtering to iCloud, explaining that this has become the industry standard, but with some Apple-style privacy innovations to make its system look better than the others.
The only problem here would be any plan to end-to-end encrypt iCloud Photos, and some have argued that Apple’s on-device approach was laying the groundwork for exactly that. But if that’s the case, then any client-side filtering of end-to-end encrypted data would undermine that encryption anyway, as we have seen with the iMessage explicit imagery screening; and so it wouldn’t work either.
Apple has gone to such lengths to explain why cloud filtering is a bad idea from a privacy standpoint that it is now difficult for it to go back and opt for that solution without sounding hypocritical, or without adding enough new thinking to escape its own trap. Whichever way you look at it, filtering users’ photos in the cloud is a privacy risk, albeit one that users have implicitly accepted for CSAM purposes without backlash.
But here Apple has also created a new problem for itself and for others. The online debate its CSAM moves have fueled has put more detail about these filtering measures into the public domain than was openly available before. Aside from the risk that bad actors without the technical knowledge to explore the darker corners of the web now know more than they did, the larger problem is that the public can no longer dismiss this as purely robotic AI.
Apple’s confirmation that human reviewers would rule out false positives drew attention to the very existence of those false positives: “Whenever an account is flagged by the system, Apple conducts a human review before making a report …” So now we have questions for other platforms about how many false positives are triggered and then excluded by reviewers. How private are cloud photo platforms, after all?
Apple is clearly under pressure to step up its efforts to fight CSAM, having fallen behind its peers with the measures it has in place. And notwithstanding the issues it has raised around false positives and human review, it must do something.
The reality is that on-device screening will be an impossible sell. Apple can of course ignore the protests and hope they eventually go away, but the iPhone’s privacy integrity will have been compromised, and that is a one-way street. The company must find a way to sell cloud filtering that doesn’t trip over its past statements.
And it has to happen fairly quickly. Apple certainly needs this problem resolved before next fall, because if it treats this as a cornerstone of an operating system update, and therefore pushes it back to iOS 16, it won’t want its next launch damaged. If the alternative is a mid-cycle update to iOS 15, it would be the most controversial update in memory, and users are likely to hold off.
Alternatively, if Apple opts for an iCloud update, one that doesn’t require a device update and works the same way as its peers’ systems, then it needs to reassure users about the details behind it, and explain how it can walk back its previous warnings. For now, Apple is saying nothing beyond its delay statement.
It’s a nightmare for Apple. Its CSAM backlash followed by a rollback came on the heels of Facebook’s WhatsApp terms debacle and then Google’s FLoC retreat, a terrible look for a company that markets itself on privacy. Whatever happens, as soon as Apple returns with its revised proposals, the Russian app withdrawal will loom large in any response. How Apple resolves this seemingly impossible problem is crucial.