Google offers arguments to justify user tracking; the company stands accused of insincerity and manipulation

On August 22, 2019, Justin Schuh, director of Chrome engineering, published a feature post, Building a More Private Web. In it:



  1. An initiative was announced to develop a set of open standards for “fundamentally improving privacy on the web”: the Privacy Sandbox.

  2. The post declares Google’s principles on user privacy. Some of them differ from the approaches taken by the developers of Firefox and Safari: for example, Google puts forward the unexpected argument that blocking cookies is harmful to privacy.


According to some commentators, Google’s arguments are “ridiculous and unfounded,” and the initiative itself is in some respects insincere and even dangerous. Below, security researchers Jonathan Mayer and Arvind Narayanan take apart what they call “Google’s excuses for tracking users.”



Analysis



“Confidentiality is paramount to us in everything we do,” Google says. “Therefore, we are announcing a new initiative to develop a set of open standards to fundamentally improve privacy on the web. We call it the Privacy Sandbox.”



“The technology that publishers and advertisers use to make advertising more relevant to people is now used far beyond its original intent, to the extent that some data processing methods do not meet user expectations for privacy” (this and subsequent quotations are from Google’s post).


Google is trying to plant the idea that some level of tracking is consistent with the original intent of the technology and with users’ expectations about privacy. Neither is true, Mayer and Narayanan write.



First, cookies were never intended for tracking by third-party sites; the original specification (RFC 2109, section 4.3.5) explicitly states that browsers should block third-party cookies.
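The “third-party” distinction at issue here can be made concrete with a toy sketch: a cookie is third-party when the resource setting it belongs to a different site than the page the user is actually visiting. The helper below is a deliberately naive illustration (the function names are ours; real browsers determine the “site” via the Public Suffix List, not by taking the last two host labels).

```python
from urllib.parse import urlsplit

def site_of(url: str) -> str:
    """Naive 'site' extraction: the last two host labels.
    Real browsers consult the Public Suffix List instead."""
    host = urlsplit(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def is_third_party(request_url: str, top_level_url: str) -> bool:
    """A cookie set by this request is 'third-party' if the request's
    site differs from the site of the page the user is on."""
    return site_of(request_url) != site_of(top_level_url)

# An ad pixel embedded in a news page is third-party:
print(is_third_party("https://ads.tracker.example/pixel.gif",
                     "https://news.example.org/story"))   # True
# The site's own subdomain is first-party:
print(is_third_party("https://static.news.example.org/app.js",
                     "https://news.example.org/story"))   # False
```

It is exactly this class of cross-site cookie that RFC 2109 flagged as an “unverifiable transaction” that browsers should refuse by default.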



Second, regarding user expectations of privacy: study after study shows that users neither understand nor want the pervasive web tracking that occurs today.



"Recently, some other browsers have tried to solve this problem, but the lack of an agreed set of standards for improving privacy has undesirable consequences."


This is clearly a reference to the tracking-blocking systems Intelligent Tracking Prevention in Safari and Enhanced Tracking Protection in Firefox, which experts consider useful privacy features (more on the “undesirable consequences” below).



“Large-scale blocking of cookies undermines people’s privacy by encouraging opaque techniques such as fingerprinting to uniquely identify users. Unlike cookies, users cannot erase their fingerprint, and therefore cannot control how their information is collected. We believe this undermines user choice and is wrong.”


Mayer and Narayanan suggest gauging the absurdity of this argument with an analogy. Imagine the local police saying: “We see that our city has a pickpocketing problem. But if we crack down on pickpocketing, pickpockets will simply switch to robbery, which is even worse. You don’t want that, do you?”



More specifically, Google’s thesis contains several flawed claims. First, the threat of fingerprinting is an argument for additional countermeasures against it, not a reason to give up. Apple and Mozilla have in fact already taken steps against fingerprinting and continue to develop protections against this method of tracking users.
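To see why a fingerprint is harder for a user to control than a cookie, consider this toy sketch: a tracker hashes a handful of browser traits into a stable identifier. The attribute names and values below are illustrative assumptions, not any real tracker's code; the point is that clearing cookies changes nothing, while the traits themselves rarely change.

```python
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Hash a set of observable browser traits into a stable identifier.
    Unlike a cookie, the user cannot simply 'delete' these traits."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

browser = {
    "userAgent": "Mozilla/5.0 ... Chrome/76.0",
    "screen": "1920x1080x24",
    "timezone": "Europe/Berlin",
    "fonts": ["Arial", "Calibri", "Consolas"],
}

# The same browser yields the same id on every visit, no cookie needed:
print(fingerprint(browser))
# Only changing an actual trait changes the id:
browser["timezone"] = "UTC"
print(fingerprint(browser))
```

This is also why anti-fingerprinting work in Safari and Firefox focuses on reducing or standardizing the traits a site can observe, rather than on anything the user must erase.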



Second, protecting privacy is not like protecting security. Just because a clever way to bypass cookie blocking is technically possible does not mean it will be widely used: firms face serious reputational and legal pressure over such practices. Google learned this firsthand in 2012, when it deployed technology to bypass cookie blocking in Safari. The practice was discovered, and Google had to settle enforcement actions with the Federal Trade Commission and state attorneys general. After that, Google abandoned tracking cookies for Safari users entirely. Studies show that fingerprinting is rarely used today, and there is no evidence that its use has grown in response to browsers blocking cookies.



Third, even if a large-scale shift to fingerprinting were inevitable (it is not), blocking cookies still provides good protection against third-party tracking via the standard technology. That is better than the defeatist approach Google proposes, the researchers write.



They point out that this is not the first time Google has made the insincere argument that privacy protections will backfire: “We call this privacy gaslighting: an attempt to convince users that the obvious privacy protections adopted by Google’s competitors are not really protections” [note: gaslighting is a form of psychological manipulation aimed at making people doubt their own perception of reality].



“Blocking cookies without an alternative way of delivering relevant ads significantly undermines publishers’ funding sources, which jeopardizes the future of a vibrant web. Many publishers have kept investing in freely available content because they could count on advertising to cover their costs. If this funding is cut, we are concerned that the amount of available content will shrink. Recent studies have shown that when ads are made less relevant, publisher funding falls by an average of 52%.”


The researchers see “overt paternalism” in these words. Google believes it knows better than users what kind of privacy they need; supposedly, people will be better off without it.



As for the “recent studies” Google cites, this is a single paragraph in a single blog post reporting Google’s internal measurements, with the measurement details needed to have any confidence in the results conspicuously omitted. “And speaking of anecdotes, the international edition of The New York Times recently switched from behavioral (tracking-based) advertising to contextual ads and geo-targeting, and saw no decrease in advertising revenue,” the researchers write.



“Starting today, we will work with the web community to develop new standards that advance privacy while continuing to support free access to content ... Some ideas include new approaches that keep advertisements relevant to users while minimizing the user data shared with websites and advertisers, by anonymously aggregating user information and keeping much more of it only on the device. Our goal is a set of standards that better meets users’ expectations of privacy.”


These ideas are nothing new. Privacy-preserving ad targeting has been an active research area for more than a decade. One of the authors (Jonathan Mayer) repeatedly proposed that Google adopt such methods during the negotiations over the Do Not Track standard (2011–2013). Google consistently insisted these approaches were not technically feasible.



The researchers emphasize: “If an advertisement draws on deeply personal information to target emotional vulnerabilities, or exploits psychological tendencies to prompt a purchase, that is a form of privacy violation, regardless of the technical details.”



“We are following the web standards process and seeking industry feedback on our initial ideas for the Privacy Sandbox. While Chrome can take action quickly in some areas (such as restrictions on fingerprinting), developing web standards is a complex process, and we know from experience that ecosystem changes of this scope take time. They require significant thought, debate, and input from many stakeholders, and typically take multiple years.”


Apple and Mozilla ship tracking protection enabled by default, right now. Meanwhile, Google talks about a “multi-year process” for some watered-down privacy implementation, and even that is vague: advertising platforms dragged out the tracking standardization process for more than six years without any meaningful result. If history teaches anything, launching a standardization process is an effective way for Google to show activity on privacy without taking any real action, the authors believe.



The researchers are convinced that among Chrome’s developers are many smart engineers who genuinely care about protecting users, and they have done incredible work on web security. But Google can hardly deliver privacy on the web while protecting its business interests, and Chrome continues to lag behind Safari and Firefox.



Here they quote an excerpt from Shoshana Zuboff’s book The Age of Surveillance Capitalism:



“Demanding privacy from surveillance capitalists or lobbying for an end to commercial surveillance on the Internet is like asking old Henry Ford to make each Model T by hand. It’s like asking a giraffe to shorten its neck, or a cow to give up chewing. These demands are existential threats that violate the basic mechanisms of the entity’s survival.”

The authors are disappointed only that Chrome’s developers offer insincere technical arguments to obscure Google’s business priorities.






