Forget, Erase & Delist – But Don’t Forget The Broader Issue

Current State of Affairs

A new year, a new CPDP Conference (Computers, Privacy & Data Protection, 21-23 January 2015). In the past 12 months we have seen privacy and data protection issues take a much more prominent role in many different internet policy discussions. One of the key examples in this regard is the so-called Google Spain ruling of the Court of Justice of the EU (CJEU). By acknowledging the right of individuals to ask search engines to delist certain name-based search results, the ruling sent shockwaves through internet policy circles. If anything, the case has made us all re-think the balance between different fundamental rights and interests, the allegedly ‘neutral’ role of search engines and the extra-territorial reach of local regulations. Besides the unprecedented public debate and media coverage, the CJEU’s decision also resulted in the Article 29 Working Party publishing interpretation guidelines and Google setting up its very own ‘Advisory Council’, which held public hearings across the EU.

Unsurprisingly, the Google Spain ruling is usually talked about against the backdrop of the so-called ‘Right to be Forgotten’. This ‘right’ has been criticised fiercely by freedom of expression advocates and is emblematic of the fissure between the US and EU regarding online privacy policy-making. Nonetheless, there is at least one point all sides seem to agree on: the terminology is very problematic. Hence, it is great to see that the latest draft of the EU’s ‘Proposal for a Data Protection Regulation’ simply refers to the ‘right to erasure’ (Article 17). Still, the provision has been attacked for being unclear on both its scope and how it is to be implemented. But can we really be that upset about this? Isn’t the exact goal of legal norms to put forward an abstract – and especially future-proof – principle that should be interpreted differently, depending on the relevant facts and context of each case? Shouldn’t we rather look at the judicial (courts) and executive (e.g. Data Protection Authorities) branches to help make sense of the rules put forward by the legislator?

Some Thoughts on Balancing

Requesting the removal of certain information (on whatever legal ground) will always generate a conflict of interests and rights. In the context of the ‘right to be forgotten/erasure’ debate, the most recurring conflicts relate to either the right to freedom of expression (Article 11, CFREU) or economic freedoms (Article 16, CFREU). The ‘conflict’ between privacy and freedom of expression interests, however, is immensely inflated (to the great benefit of the big data industry).

As most readers will know, Europe has a rich legal tradition of balancing the two rights (most notably in the case law of the European Court of Human Rights), with clear criteria and safeguards for restricting either right. As a side note, one can only applaud the CJEU in Google Spain for clearly distinguishing the responsibilities and protections of actual speakers (i.e., newspapers) from third parties (i.e., search engines), each subject to a different balancing test.

Additionally, if you look at Google’s transparency report, it becomes clear that the majority of delisting requests do not relate to legitimate news reporting websites in the first place. Instead, most requestors seem to be average Joes concerned about how websites like 123people.com or facebook.com gratuitously show information about them on the basis of a mere name search. In short, the ‘right to be delisted’ is about online obscurity, not about eradicating information from the internet altogether.

More importantly, we should not be swayed by ‘right to be forgotten’ rhetoric professing that it constitutes a fundamental threat to freedom of expression online. The right to erasure has a much more important – and largely understated – goal: empowering data subjects with regard to their data being harvested and exploited ‘behind the scenes’ (e.g. for commercial/political profiling, digital market manipulation, etc.). No conflict with the right to freedom of expression exists in these contexts. Instead, these situations usually require a balancing exercise between individuals’ privacy and data protection interests on the one hand and the data controller’s economic freedoms (Article 16, CFREU) on the other. With regard to the latter, the Google Spain case made clear that such freedoms weigh considerably less when compared to freedom of expression interests. Still, neither fundamental right/interest can be discarded without giving it due regard in the balancing exercise first.

Conclusion

The right to erasure undoubtedly results in a conflict of rights/interests that needs to be solved. Whereas many cases will be very straightforward, a considerable portion will require a more thorough balancing exercise. We should be wary, however, not to be blinded by the rhetoric of freedom of expression absolutists, libertarians and corporate lobbyists defending their own agendas. We should not want to reinvent the wheel either. Balancing exercises and the proportionality principle are deeply embedded in the European legal framework. Applying them to the issue(s) at hand might not always be straightforward. But is it really asking too much from entities that, ultimately, benefit from using personal data?

Finally, the whole debate on the ‘Right to be Forgotten’, the ‘Right to Erasure’ and the Google Spain case seems to be far from finished. I hope you join me in congratulating Computers, Privacy & Data Protection (CPDP) for having provided a fertile platform (books and panels) for discussing these issues.

*This blog post originally appeared as an op-ed on the Internet Policy Review*

Google v Spain at the Court of Justice of the EU

One week ago, the so-called ‘Google v Spain’ or ‘Right to be Forgotten’ case was heard before the Court of Justice of the EU (C-131/12).

Put briefly, Google was ordered by a Spanish court to remove certain search results – relating to Mr. Mario Costeja González – from its index. The contentious search results linked back to Mr. Costeja González’s insolvency proceedings, published in a newspaper in 1998. Google appealed, and the Audiencia Nacional referred the case to the ECJ, requesting a preliminary ruling on the three main questions in this case: (1) Territoriality: do the contentious facts fall within the territorial scope of application (art.4) of the Data Protection Directive (DPD)? (2) Can Google – as a Search Engine (SE) – be considered a (personal) data controller? (3) What is the extent of the ‘Right to be Forgotten’ aka ‘right to erasure and blocking of data’? Currently there are over 200 similar cases pending before Spanish courts (and likely in other EU jurisdictions as well).

SUMMARY OF MAIN ARGUMENTS:

Google

(1) Territoriality

▪       The contentious activity at stake here is not ‘carried out in the context of’ the activities of Google Spain – being an EU establishment of Google Inc. (Article 4(1)(a) DPD). Google Spain is not involved in the SE activities itself (Google Inc. is). Its only relevant activity with regard to the SE is to provide advertising space, and its behavioural advertising model is not based on the indexed content.

▪       Art.4(1)(c) is not applicable either, as the mere provision of a service in an EU Member State (even if a Spanish domain name is used) cannot be considered ‘use of equipment’ within the meaning of the DPD. The use of web spiders to index content should not be considered ‘use of equipment’ either. The use of cookies would constitute ‘use of equipment’, but is not relevant in this case.

(2) Google as Controller

▪       Google collects, processes and indexes data indiscriminately. It is unaware of whether or not the content of the webpages being indexed contains personal data. There is an obvious lack of intent that distinguishes this case from the Lindqvist and Satamedia cases.

▪       The decision to take content down should be taken by the party that is best placed to do so. Given that Google does not control any of the data held on the websites it indexes, nor has any intent to do so, the publisher of the original webpage is best placed to decide.

▪       Even if one were to consider Google to be processing personal data, the company still argues it is not a data controller because: (a) there is no intent to process personal data; (b) Google does not verify whether indexed data is personal or not; (c) the publisher has final and continuing control; (d) if the publisher removes the (personal) data, Google does so as well; (e) Google cannot verify the legitimacy of the personal data processing; (f) Google only plays a secondary/accessory role in disseminating the information; (g) articles 12-15 of the eCommerce Directive; (h) the Article 29 Working Party’s Opinion 1/2008 (p. 13-14) endorses SEs’ role as intermediaries in situations such as the one at hand; (i) Google’s role can be compared to that of a telecom operator, which stricto sensu also processes personal data but is not liable under the DPD either – a mere intermediary transferring data.

(3) Right to be Forgotten/Erasure vis-à-vis Search Engines

▪       On the question whether a SE can be asked to remove links directly, without the data subject first having to go to the original publisher, Google raises the Freedom of Expression (FoE) and Freedom of Information (FoI) flag: (1) the original publishers will be deprived of an important channel of communication and (2) Internet users in general will have access to less information. The responsibility of publishers should not be shifted onto the shoulders of intermediaries such as search engines. This would also violate the proportionality principle, Google argues: (a) obliging Google to erase search results does not prevent the information from appearing elsewhere; (b) Google cannot assess the legality of the content; (c) Google can only remove the link to the webpage entirely (it cannot just anonymise certain bits in the webpage), which would constitute overkill, as the webpages will usually contain much more information than just the contentious personal data.

Plaintiff (+ allies: Spanish & Austrian Government, European Commission)

(1) Territoriality

▪       Whether the DPD applies to the issue at stake depends on whether or not Google Spain’s activities can be sufficiently distinguished from those of Google Inc. According to the plaintiff (and its allies), they clearly cannot: Google Spain’s activity is not merely ancillary, but constitutes an integral part of Google Inc.’s activities, carried out for a particular jurisdiction.

(2) Google as Controller

▪       The DPD was written before the widespread use of the Internet, and of SEs in particular. The DPD should, therefore, be applied creatively to Google. According to the plaintiff, Google is a data controller as it actively collects and processes personal data (referring to art.2(b) DPD ‘dissemination or otherwise making available’ and Lindqvist: any processing of data – even if the data is already published – constitutes personal data processing within the scope of the DPD). Its activity constitutes a separate ‘publication’ from the original one.

▪       Google can even be considered a data controller vis-à-vis the content of the webpages it indexes, because: (a) it determines the means (algorithms, web spiders, etc.) and purpose (to include information in search results) of processing; (b) Google actively directs and controls the flow of information, and its actions cannot be compared to the bidirectional traffic management of a telecom operator. In other words, it is not a ‘neutral’ intermediary. Google provides an added-value service, which it cannot provide without acting autonomously.

▪       It was also argued that the criteria of art.2(a) and (b) are ‘objective’ in nature. The intent of the ‘controller’ is not relevant. Hence SEs are data controllers, as they are de facto processing personal data.

▪       Google – allegedly – also has a separate responsibility because it makes information much more widely available, can provide a complete picture of an individual and has its own specific (commercial) purposes. Google does not “control” the initial uploading of content, but it does control its aggregation and subsequent dissemination. The responsibility of SEs is distinct and separate.

(3) Right to be Forgotten/Erasure vis-à-vis Search Engines

▪       It is stressed that Google is not asked to conduct a priori monitoring of all the content it indexes. The plaintiff (and its allies) rather advocate a specific notice-and-takedown regime, similar to that for copyright claims. Only when a specific complaint is made regarding a specific piece of content should Google remove the search results.

▪       In order to invoke such a right, the data subject should at least demonstrate a violation of the legitimate processing requirement (art.7 DPD) or the principle of data quality (art.6 DPD).

▪       On the risk to FoI and FoE: when there is a conflict between different rights, a balance must be struck. Neither one should automatically prevail.

After this first round of arguments, the court asked several questions to all parties involved. Most of them related to the practical implications of a potential obligation on Google to remove search results.

The Advocate General will finish his Opinion by June 25th, and a final judgement should follow soon thereafter.

Together with some colleagues at ICRI, I will soon publish a working paper making a more thorough legal analysis of all the issues at stake in this particular case. To be Continued…