Why you should delete Google Photos on your iPhone, iPad and Mac




When it comes to cloud photo storage, Google Photos leads the pack: four trillion photos and videos for more than a billion users. Millions of Apple users have Google Photos on their iPhones, iPads, and Macs, but Apple has just issued a serious warning about Google's platform, giving its users a reason to delete the app.

It has been a dreadful few weeks for Apple on the privacy front, not what the iPhone maker needs as the iPhone 13 and iOS 15 launches approach. A week ago, the company awkwardly (though inevitably) backed away from its ill-conceived plan to screen its users' photos on their devices for known child-abuse imagery.


Screening for CSAM itself is not controversial. All the major cloud platforms, including Google Photos, have been doing it for years. "Child sexual abuse material has no place on our platforms," Google told me. "As we have described, we use a range of industry-standard analysis techniques, including hash-matching technology and artificial intelligence, to identify and remove CSAM that has been uploaded to our servers."
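To illustrate the hash-matching idea in general terms, here is a minimal Swift sketch, not Google's actual pipeline: it checks an uploaded image's digest against a hypothetical blocklist of known hashes. The `loadKnownHashDatabase` helper is invented for the example, and real services use perceptual hashes (such as PhotoDNA) so that edited or re-encoded copies still match, rather than an exact cryptographic hash as used here.

```swift
import Foundation
import CryptoKit

// Hypothetical loader for a vetted database of known-image hashes.
// In this toy example it is simply empty.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashDatabase()

// Returns true if the uploaded image's digest appears in the blocklist.
// Real systems use perceptual hashing rather than SHA-256, so near-duplicates
// of a known image still match; this sketch only catches exact copies.
func isKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```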

But Apple, it turns out, has not done the same. The company has yet to apply such filtering to iCloud Photos, and its reasoning for that seemingly surprising omission once again highlights the different privacy philosophies at play.

Apple's (now paused) decision to search for CSAM on the device rather than in the cloud was, according to the company, driven by its desire to report known images "while not learning any information about non-CSAM images." In other words, not every user should have to give up the privacy of all their content in order to catch a tiny minority.

The principle itself is quite sound. If your iPhone doesn't report any potential CSAM matches, Apple's servers can ignore all of your content. If your iPhone does report potential matches, at least 30 of them, then the server knows exactly where to look.
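As a toy illustration of that threshold principle only, and emphatically not Apple's actual protocol (which relies on private set intersection and threshold secret sharing so the server genuinely learns nothing below the threshold), the gate might be sketched like this; `MatchVoucher` and `serverCanReview` are invented names for the example.

```swift
// Placeholder for the safety voucher an on-device match would generate.
struct MatchVoucher {
    let imageID: String
}

// Apple's stated reporting threshold.
let reportingThreshold = 30

// Below the threshold the account's vouchers are ignored entirely;
// only at or above it would any review be triggered.
func serverCanReview(vouchers: [MatchVoucher]) -> Bool {
    return vouchers.count >= reportingThreshold
}
```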

The problem, however, is that despite detailed technical explanations and assurances, this concept of on-device screening did not land well. To many, "private iPhone" filtering simply looked like spyware on the device, raising the specter of scope creep, with more and more content being flagged at the behest of US and foreign governments. And so Apple has gone back to the drawing board to rethink.

But turn this around and there is an interesting conundrum for the rest of the industry. Apple has highlighted how invasive it is to search all of your photos in the cloud. A simple match against CSAM databases would be welcome, but does it end there? And what about the risks Apple's own technical documents flagged, around false matches and manual reviews? Does this mean our cloud photos on other platforms are regularly flagged and reviewed by teams of human operators?

Worse still, the real issue that holed Apple's CSAM plans below the waterline was the risk of governments pushing beyond known CSAM content, collated by child-safety organizations, to other content: political or religious dissent, other crimes, persecuted minorities in parts of the world where Apple sells its devices.

Apple explained in detail the technical safeguards it has in place to prevent this, promising it would always say no. It then said the scheme would start in the United States only, and would expand only to countries where those risks could be contained. But the agitated privacy lobby was not convinced, especially given Apple's past difficulties in "just saying no" in China, over iCloud data storage and app censorship, for example.

Obviously, you don't have to be a technical genius to understand that these same risks apply to cloud filtering and are not limited to on-device software. Yes, the jurisdiction in which cloud data is stored varies, but big tech must always abide by local laws, as it often makes clear, and the defense that compliance is not technically possible, used to protect end-to-end message encryption, for example, cannot apply here.

And so, to Google Photos. There are three reasons why Apple users should delete its apps. First, using Google Photos means giving the platform full access to your photos; it's all or nothing. Apple has a relatively new privacy-preserving tool in its Photos app to limit which photos any given app can access. But Google Photos won't accept it, insisting that you change the setting to give it access to everything if you want to use the app.
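For context, this is roughly what respecting that limited-access permission looks like for an iOS 14 or later app; a minimal sketch, with `viewController` standing in for whatever screen is currently presented.

```swift
import Photos
import PhotosUI
import UIKit

// Request photo library access while honoring the user's choice to share
// only a subset of their library, rather than demanding everything.
func requestPhotoAccess(from viewController: UIViewController) {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        DispatchQueue.main.async {
            switch status {
            case .limited:
                // The user shared only selected photos; the app sees nothing else.
                // Optionally let them adjust that selection later:
                PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController)
            case .authorized:
                // Full library access was granted.
                break
            default:
                // Denied, restricted, or not determined: work without the library.
                break
            }
        }
    }
}
```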

Second, Google Photos' privacy label is a horror show compared with Apple's alternative. As with its other major apps, Google (like Facebook) collects what it can, excusing this by saying it only uses the data when necessary. The problem is that Google ties all of this data to your identity, on top of the vast profiles associated with your Google account and other personal identifiers. Google doesn't do this as a favor to you; it is at the heart of its data-driven advertising business model. Just follow the money.

Google says these labels "show all the possible data that could be collected, but the actual data depends on the specific features you decide to use … We will collect contact information if you want to share your photos and videos … or if you decide to purchase a photo book, we collect your payment information and store your purchase history, but this data would not be collected if you chose not to share photos or make a purchase."

Google, like Facebook, will also harvest metadata from your photos and feed it into its algorithmic slot machine. "We use EXIF location data to improve the user experience in the app," the company told me. "For example, to highlight a trip in our Memories feature or to suggest a photo book from a recent trip."
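To make concrete what that metadata looks like, here is a minimal Swift sketch, assuming a local image file URL, of the EXIF/GPS dictionary any app with access to a photo can read.

```swift
import Foundation
import ImageIO

// Reads the GPS metadata block embedded in an image file, if present.
// The dictionary carries latitude, longitude, altitude, timestamp, and more.
func locationMetadata(for url: URL) -> [String: Any]? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
                            as? [String: Any] else {
        return nil
    }
    return properties[kCGImagePropertyGPSDictionary as String] as? [String: Any]
}
```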

Obviously, everyone will have their own view on how much personal data they are comfortable having pulled into Google's datasets and analyzed, and Google now offers more controls than ever to narrow what is shared. But limiting Google's access also limits its functionality. That's the basic philosophy at play.

"Your photo and video albums are full of precious moments," Apple counters, in a clear swipe at Google. "Apple devices are designed to give you control over those memories." And at the heart of that claim is the same device-versus-cloud debate that framed the CSAM controversy which hit Apple last month.

Which leads to the third problem. We already know that Google applies cloud AI to the photos it stores. Behind Apple's CSAM move was its well-established approach of analyzing data on the device. Apple uses on-device machine learning to categorize photos, for example, enabling intelligent search for objects or people. Google does this in the cloud. And where Apple's CSAM scheme tied that on-device ML to external processing, Google's cloud ML is already external, off-device, a relative black box for users.
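As an illustration of the on-device side of that contrast, here is a minimal sketch of local image classification with Apple's Vision framework; the file URL and the 0.3 confidence cut-off are arbitrary choices for the example.

```swift
import Vision

// Classifies an image entirely on the device using Vision's built-in
// classifier, the kind of local ML the article contrasts with cloud analysis.
func classifyLocally(imageURL: URL) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])          // runs locally, no server round trip

    let observations = (request.results as? [VNClassificationObservation]) ?? []
    return observations
        .filter { $0.confidence > 0.3 }     // keep only reasonably confident labels
        .map { ($0.identifier, $0.confidence) }
}
```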

When Apple says its Photos platform "is designed so that facial recognition and scene and object detection, which power features like For You, Memories, Sharing Suggestions, and the People album, happen on device rather than in the cloud … And when apps request access to your photos, you can share only the images you want, not your entire library," we know exactly who it has in mind.

When asked about CSAM in Google Photos, the company told me that it works closely with the National Center for Missing and Exploited Children and other agencies around the world to tackle this type of abuse.

But Google would not be drawn on my other questions: privacy protections in Google Photos, limitations and restrictions on its filtering, its policy on government requests (foreign or domestic), or whether it has ever been asked to expand the scope of that filtering, other than pointing me to its general content policies (which cover content, not metadata, you'll notice) and its transparency report.

Google also hasn't commented on what other AI classifiers it applies to Google Photos, how that data is collected and used, or whether it intends to revise anything in light of the backlash against Apple. There is no suggestion that Google is doing anything beyond the obvious, but that's the problem with the cloud: it really is just someone else's computer.

Just as when we exposed Facebook's harvesting of EXIF data without any transparency to users, the problem is digging beneath the terms and conditions to understand what this actually means for you. And when the scanning takes place off-device, it is completely invisible to you, unless the company chooses to share. That was rather Apple's point on CSAM.

Is there a risk here? Yes, of course there is. Apple has just told you so. We know that Google's architecture is far less privacy-friendly than Apple's in any case. And so you should engage with its apps and platforms with your eyes wide open.

Meanwhile, if you've spent more than $1,000 on your iPhone, my recommendation is to use the privacy measures it has in place. And that means ignoring Google Photos, despite the advanced search features it may offer. As always, convenience comes at a price; absent full transparency and controls, that price remains too high to pay.
