Deconstructing Google's excuses on tracking protection




By Jonathan Mayer and Arvind Narayanan.

Blocking cookies is bad for privacy. That is the new disingenuous argument from Google, which is trying to justify why Chrome is so far behind Safari and Firefox in privacy protections. As researchers who have been studying web tracking and online advertising for over a decade, we want to set the record straight.

Our high-level points are:

1) Blocking cookies does not undermine privacy on the web. Google's claim to the contrary is a distortion of what privacy protection means.

2) There is little trustworthy evidence on the comparative value of tracking-based advertising.

3) Google has not come up with an innovative way to balance privacy and advertising. It is latching onto earlier approaches that it had previously dismissed as impractical.

4) Google is attempting to punt to the web standardization process, which will at best result in years of delay.

What follows reproduces excerpts from yesterday's announcement, interleaved with our commentary.

Technology that publishers and advertisers use to make advertising more relevant to people is now being used far beyond its original design intent – to a point where some data practices don't match up to user expectations for privacy.

Google is trying to thread a needle here, implying that some level of tracking is consistent with both the original design intent of web technology and users' privacy expectations. Neither is true.

If the benchmark is original design intent, let's be clear: cookies were not supposed to enable third-party tracking, and browsers were supposed to block third-party cookies. We know this because the authors of the original cookie technical specification said so (RFC 2109, Section 4.3.5).

Similarly, if the benchmark is user privacy expectations, let's be clear: study after study has demonstrated that users neither understand nor want the pervasive web tracking that occurs today.

Recently, some other browsers have attempted to address this problem, but without an agreed-upon set of standards, attempts to improve user privacy are having unintended consequences.

This is clearly a reference to Safari's Intelligent Tracking Prevention and Firefox's Enhanced Tracking Protection, which we believe are laudable privacy features. We'll get to the claim of unintended consequences in a moment.

First, large-scale blocking of cookies undermines people's privacy by encouraging opaque techniques such as fingerprinting. With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed, to generate a unique identifier which can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.
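The mechanism Google describes can be sketched in a few lines. This is an illustrative toy, not any real tracker's code: the attribute names are ours, standing in for the browser properties (user agent, screen size, fonts, and so on) that a fingerprinting script would read.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Derive a stable identifier by hashing device/browser attributes.

    No cookie is ever stored: the same attributes always reproduce the
    same identifier, which is why users cannot "clear" a fingerprint.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical attributes a script might collect on a visitor's machine.
visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080x24",
    "timezone": "Europe/Paris",
    "fonts": "Arial,Helvetica,Noto Sans",
}

site_a = fingerprint(visitor)  # computed by a script on one website
site_b = fingerprint(visitor)  # computed independently on another
assert site_a == site_b        # the two sites can now link the visits
```

The point of the sketch is the last line: two unrelated sites that run the same script derive the same identifier without storing anything on the device, so cookie controls never come into play.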

To appreciate the absurdity of this argument, imagine the local police saying: "We see that our city has a pickpocketing problem. But if we crack down on pickpocketing, the pickpockets will just switch to muggings. That would be even worse. Surely you don't want that, do you?"

Concretely, Google's argument has several flaws. First, while fingerprinting is indeed a privacy invasion, that is an argument for taking additional steps to protect users from it, rather than throwing up our hands. Apple and Mozilla have already taken steps to mitigate fingerprinting, and they continue to develop anti-fingerprinting protections.

Second, protecting consumer privacy is not like protecting security – just because a clever circumvention is technically possible does not mean it will be widely deployed. Businesses face immense legal and reputational pressure against circumventing cookie blocking. Google's own privacy stumble in 2012 is a perfect illustration: Google deployed a workaround for Safari's cookie blocking; it was spotted (in part by one of us), and Google had to settle enforcement actions with the Federal Trade Commission and state attorneys general. Afterward, Google did not double down – it backed away entirely from tracking cookies for Safari users. Based on peer-reviewed research, including our own, we believe that fingerprinting continues to represent a small proportion of overall web tracking. And there is no evidence that fingerprinting is increasingly being used in response to cookie blocking by other browsers.

Third, even if a large-scale shift to fingerprinting were inevitable (it isn't), cookie blocking would still provide meaningful protection against third parties that stick with conventional tracking cookies. That's better than the defeatist approach Google proposes.

This isn't the first time Google has used disingenuous arguments to suggest that privacy protection will backfire. We call this move "privacy gaslighting": an attempt to persuade users and policymakers that an obvious privacy protection – one already adopted by Google's competitors – isn't really a privacy protection at all.

Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers' primary means of funding, which jeopardizes the future of the vibrant web. Many publishers have been able to continue to invest in freely accessible content because they can be confident that their advertising will fund their costs. If this funding is cut, we are concerned that we will see much less accessible content for everyone. Recent studies have shown that when advertising is made less relevant by removing cookies, funding for publishers falls by 52% on average.

The paternalism on display here is disappointing. Google is substituting its own judgment for users': if users had the privacy they want, they couldn't have the free content they supposedly want more. So, no privacy for users.

As for the "recent studies" Google cites, that turns out to be a single paragraph in a blog post describing an internal Google measurement. The measurement details needed to have any confidence in the claim are conspicuously missing. And as long as we're comparing anecdotes: the international edition of the New York Times recently switched from tracking-based behavioral ads to contextual and geographic ads – and saw no decline in ad revenue.

Independent research doesn't support Google's claim either: the most recent academic study suggests that tracking adds only about 4% to publisher revenue. This is a topic that deserves much more research, and it is misleading for Google to cherry-pick its own internal measurement. It is also important to distinguish the economic question of whether tracking benefits advertising platforms like Google (it unambiguously does) from the economic question of whether tracking benefits publishers (that is unclear).

Starting with today's announcements, we will work with the web community to develop new standards that advance privacy, while continuing to support free access to content. Over the last couple of weeks, we've started sharing our preliminary ideas for a Privacy Sandbox – a secure environment for personalization that also protects user privacy. Some ideas include new approaches to ensure that ads continue to be relevant for users, but user data shared with websites and advertisers would be minimized by anonymously aggregating user information, and keeping much more user information on-device only. Our goal is to create a set of standards that is more consistent with users' expectations of privacy.

There is nothing new in these ideas. Privacy-preserving ad targeting has been an active research area for more than a decade. One of us (Mayer) repeatedly urged Google to adopt these methods during the Do Not Track negotiations (circa 2011-2013). Google's response was to consistently insist that these approaches are not technically feasible. For example: "To put it simply, client-side frequency capping does not scale." We are glad that Google is now taking this direction more seriously, but a few belated think pieces are not much progress.
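Client-side frequency capping – the approach Google once dismissed as unable to scale – is conceptually simple: the browser, not the ad server, counts how many times a user has seen a campaign and locally suppresses it past a cap, so no per-user view history ever leaves the device. A minimal sketch of the idea, with class and method names of our own invention (not any real browser API):

```python
from collections import defaultdict

class ClientSideFrequencyCap:
    """Keeps impression counts on the user's device only.

    The ad server never learns any per-user view history; the
    device decides locally whether the cap has been reached.
    """

    def __init__(self, cap: int):
        self.cap = cap
        self.seen = defaultdict(int)  # campaign id -> local impression count

    def should_show(self, campaign_id: str) -> bool:
        # Suppress the ad once the local count reaches the cap.
        if self.seen[campaign_id] >= self.cap:
            return False
        self.seen[campaign_id] += 1
        return True

capper = ClientSideFrequencyCap(cap=3)
shown = [capper.should_show("acme-shoes") for _ in range(5)]
# The first three requests show the ad; the last two are suppressed locally.
```

The design choice worth noting is that all state lives in `self.seen` on the client; the only thing the server observes is whether an ad request arrives, which is exactly the privacy property the researchers were advocating.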

We are also disappointed that this announcement implicitly defines privacy as confidentiality. It ignores that, for some users, the privacy problem is behavioral ad targeting itself – not just the web tracking that enables it. If an ad uses deeply personal information to appeal to emotional vulnerabilities, or exploits psychological tendencies to generate a purchase, then that is a form of privacy violation, regardless of the technical details.

We are following the web standards process and seeking industry feedback on our initial ideas for the Privacy Sandbox. While Chrome can take action quickly in some areas (for instance, restrictions on fingerprinting), developing web standards is a complex process, and we know from experience that ecosystem changes of this scope take time. They require significant thought, debate, and input from many stakeholders, and generally span multiple years.

Apple and Mozilla have tracking protection enabled, by default, today. And Apple is already testing privacy-preserving ad measurement. Meanwhile, Google is talking about a multi-year process for a watered-down form of privacy protection. And even that is uncertain – advertising platforms dragged out the Do Not Track standardization process for more than six years, without any meaningful result. If history is any indication, launching a standards process is an effective way for Google to appear to be doing something about web privacy, without actually delivering.

In closing, we want to emphasize that the Chrome team is full of smart engineers who are passionate about protecting their users, and it has done incredible work on web security. But it is unlikely that Google can provide meaningful web privacy while protecting its business interests, and Chrome continues to fall far behind Safari and Firefox. We find this passage from Shoshana Zuboff's The Age of Surveillance Capitalism to be apt:

"Asking the privacy protection of the capitalists for surveillance or pushing for the end of commercial surveillance on the Internet is like asking old Henry Ford to make each T model by hand. It's like asking a giraffe to shorten one's neck or a cow to stop chewing. These demands are existential threats that violate the fundamental mechanisms of the survival of the entity. "

It is disappointing – but regrettably unsurprising – that the Chrome team is cloaking Google's business priorities in disingenuous technical arguments.

Thanks to Ryan Amos, Kevin Borgolte, and Elena Lucherini for their comments on a draft.
