Apple sold out children's privacy yesterday under the pretext of "screen time" apps

I am far from an Apple hater – I applaud many of the company's product decisions and the more polished state in which it tends to ship them (keyboards excepted, apparently). But yesterday, amid the flurry of WWDC announcements, Apple quietly caved on a controversy and seriously muddled its own message about privacy.

Last year, after Apple introduced built-in parental controls in iOS, the company began removing similar – though not functionally identical – apps from the App Store. It did so on what I consider perfectly fair grounds: these apps all used VPNs or iOS's MDM (an enterprise device management system) explicitly to monitor a person's activity on their iOS device. Some of that monitoring is genuinely innocent, such as legitimate screen time apps simply meant to enforce parents' rules about their children's phone use. Others, however, do exactly what you would expect from an app indistinguishable from one designed for malicious, silent spying: full-time location tracking, geofencing, alerts for explicit text and image content, web browsing history (and filtering), and more.

One of the most terrifying of these solutions is Bark, which presents itself as a modern, privacy-friendly parenting tool but actually uses AI (and probably, to some degree, humans) to alert parents whenever a text, photo, or website containing potentially "harmful" content is detected. Looking at all the praise the app has drawn from the media and from parents, you might be inclined to think it's OK – after all, Bark doesn't let parents read the full chat log, only the messages that trigger red flags. But the fact that an app like Bark can even exist while Apple claims privacy is a priority on iOS is just ridiculous. How is an app that can extract images, private texts, emails, web history, call logs, and location data, send it all to an untrusted third-party cloud for analysis, and then report back to parents so they learn of it without their child's knowledge, consistent with that ethos? In reality, Bark is just a spun and polished version of a cruder pitch: spy on your kids – now easier and faster with AI! (Bark does suggest telling your kids that you're spying on them, for what it's worth.)

And while all of these apps' developers claim they only want to ensure kids have a safe online experience without parents having to rifle through their phones in the middle of the night, that is two-faced at best. These apps can just as easily be used to monitor and track a spouse, and I am absolutely convinced they are used that way. Bark, for example, detects explicit images and content – exactly the kind of feature a jealous, paranoid spouse would die for. Some apps, such as mSpy, which still hasn't been banned from the App Store (Bark has been reinstated), don't even try to hide behind a veneer of privacy: they outright advertise the ability to read texts. Others, like SafeLagoon, project the image of a fun, friendly, safety-oriented service, but are already advertising the return of full-time chat and SMS monitoring to their iOS app as "coming soon," presumably in the wake of Apple's policy change.

Apps like SafeLagoon present themselves first as safety and screen time tools – but in reality, they are powerful spyware platforms.

I want to be clear: I'm not telling anyone how to raise their children. If you want to spy on your kids, that's your business. My problem is that these apps also empower abusers and those seeking to control and monitor others in harmful ways. It's not hard to see how an app like mSpy or SafeLagoon could be useful to a human trafficker, but I won't go into the dark and terrible ways these services could be exploited in this post – your imagination is probably sufficient.

Given this, it's impossible to square the notion that Apple truly cares about the privacy of its users' data and does everything it can to ensure that data stays private. So why did Apple cave? What drove such a radical change in its ideology regarding these parental spyware apps? After all, they clearly pose no financial threat to Apple. The New York Times believes antitrust concerns motivated the change, but it's genuinely impossible to know – Apple isn't talking.

It's possible that competitive disadvantage played a role. Families often choose ecosystems together: everyone is on iOS, or everyone is on Android (in the United States, usually Samsung). That magnifies the effect of parents' smartphone decisions, especially since children are likely to stick with the ecosystem they grew up in – a phenomenon plainly visible in the world we live in today. Momentum is an extremely powerful force for keeping users inside product ecosystems, and platform lock-in is a powerful tool for sustaining it – one Apple has proven particularly effective at wielding. Being able to say that Samsung phones (or any other Android device) offer far better "parental supervision" tools really could prove a deciding factor in some households. And while that number may be small in the context of Apple's overall business, at a time when the company is desperately pivoting to services as iPhone sales stagnate, any loss of momentum could hurt.

Of course, all of this is just a theory – Apple isn't saying why it reversed the policy or, curiously, why it didn't simply implement a screen time API, as many of the affected companies requested. But Apple has now made clear that its privacy-above-all-else model has at least one huge exception, one that puts a potentially enormous amount of data in the hands of companies we know little about. Bark claims to have scanned billions of SMS messages across millions of phones. How does that square with Apple telling us how secure and private iMessage is? The company even ran a video ad about it recently. The dissonance is incredible.

A final note: Google lets these apps flourish on the Play Store without issue, and I would certainly like to see that company take a stand as well. Wholesale device monitoring should not be as easy as secretly installing an app on someone's unlocked phone, whatever your parenting philosophy. In my mind, the legitimate parenting use cases for these apps do not outweigh the potential for abuse or mishandling of the data, and I think Apple and Google alike should think long and hard about what it means to "protect" children's privacy.
