Update raises doubts about Apple’s privacy … and draws criticism from WhatsApp chief




Apple last week unveiled a system that will allow it to detect child abuse photos uploaded to iCloud storage in the United States and report them to authorities.

The company received praise from child protection advocates for the move. In a statement, John Clark, CEO of the National Center for Missing and Exploited Children – a non-profit organization created by congressional mandate – called it a “game changer.”

But the new system, which is currently being tested in the United States, has been fiercely opposed by privacy advocates, who warn it represents a slippery slope and could be further tweaked and exploited to censor other types of content on personal devices.

Apple is not unique in its efforts to rid cloud storage of illegal child abuse imagery; other cloud services do the same. Google has been using hashing technology since 2008 to identify illegal images in its services.
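The hash-matching approach described above can be sketched in a few lines. This is a deliberately simplified illustration: real systems such as Microsoft’s PhotoDNA or Apple’s NeuralHash use perceptual hashes that survive resizing and re-encoding, whereas this sketch uses an exact cryptographic digest, and the known-hash database here is a hypothetical placeholder.

```python
import hashlib

# Hypothetical database of digests of known illegal images.
# (Placeholder value only; a real database would hold perceptual
# hashes maintained by a clearinghouse such as NCMEC.)
KNOWN_HASHES = {"0" * 64}

def digest(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image content.

    Real scanning systems use a perceptual hash here, so that
    re-encoded or resized copies still match; SHA-256 is used
    purely to illustrate the matching step.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes, known: set[str]) -> bool:
    """Report whether the image's digest appears in the known-hash set."""
    return digest(image_bytes) in known

# Example: an uploaded file is checked against the database.
sample = b"example image bytes"
print(is_flagged(sample, {digest(sample)}))  # True: digest is in the set
print(is_flagged(sample, KNOWN_HASHES))      # False: unknown image
```

The key property of this design is that the service never needs to interpret the image itself, only to compare its fingerprint against a list of fingerprints of already-identified material.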

Facebook also said in 2019 that it removed 11.6 million pieces of content related to child nudity and sexual exploitation in just three months.

Apple said its system is an improvement over industry-standard methods because it uses its control of the hardware and sophisticated math to learn as little as possible about the photos on a person’s phone or cloud account while still reporting illegal material stored on its cloud servers.

But privacy advocates see the move as the start of a policy shift in which Apple could be pressured by foreign governments, for example, to repurpose the system to suppress political speech by forcing Apple to flag images of protests or other political content.

Skeptics are not objecting to how the system works today, nor defending people who collect known images of child exploitation. Rather, they worry about how it might evolve in the years to come.

“Make no mistake: if they can search for child pornography today, they can search anything tomorrow,” NSA whistleblower Edward Snowden wrote in a tweet.

The Electronic Frontier Foundation (EFF), which has supported Apple’s policies on encryption and privacy in the past, criticized the move in a blog post, calling it a “backdoor” – a system created to give governments a way to access encrypted data.

“Apple can explain in detail how its technical implementation will preserve privacy and security in its proposed backdoor, but in the end, even a well-documented and carefully considered backdoor is still a backdoor.”

Apple’s new system has also been criticized by the company’s competitors, including Facebook’s subsidiary WhatsApp, which likewise uses end-to-end encryption for its messages.

“Instead of focusing on making it easier for people to report content shared with them, Apple created software that can scan all the private photos on your phone – even photos you haven’t shared with anyone,” WhatsApp President Will Cathcart tweeted.

Privacy has long been an essential part of iPhone marketing. Apple has publicized the security architecture of its systems and is one of the strongest advocates of end-to-end encryption, meaning it doesn’t even know the content of messages or other data stored on its servers.

Apple considers this to be a win-win situation

Apple sees the new system as part of its privacy tradition – a win-win situation that protects user privacy while removing illegal content. Apple also claims that the system cannot be redirected to other types of content.

But privacy advocates feel betrayed by Apple, which has billed itself as a safe vault for your data that won’t be shared with anyone. At an electronics trade show in Las Vegas, Apple bought a giant billboard with the slogan “What happens on your iPhone stays on your iPhone.”

Apple CEO Tim Cook has referred to the “chilling effect” of knowing that what is on your device could be intercepted and examined by third parties. Cook said the lack of digital privacy could cause people to censor themselves even if the person using the iPhone wasn’t doing anything wrong.

“In a world without digital privacy, even if you have done nothing wrong other than think differently, you begin to censor yourself,” Cook said in his 2019 commencement speech at Stanford University.
