The Right to be Forgotten – It’s about time, or is it?

[Brief summary of my presentation at the CPDP 2014 panel on “Timing the Right to be Forgotten”. Slides: See Below]

The panel took a really refreshing perspective on the Right to be Forgotten debate. So I was glad to take this opportunity to look more closely at what role ‘time’ actually plays in the legal framework relevant to the so-called ‘Right to be Forgotten’.

In short, the presentation aimed to identify some of the relevant legislation and case-law, with a particular focus on the general right to privacy and the data protection framework.

Terminological Issue – Over the past few years, the so-called ‘Right to be Forgotten’ seems to have been used as some sort of umbrella term to refer to different situations and different legal regimes (general right to privacy, right to personal portrayal, data protection, defamation, etc.).

General Right to Privacy – When looked at in the context of the general right to privacy (8 ECHR), it is usually applied to shield individuals from being confronted with certain aspects of their past in a disproportionate, unfair or unreasonable way (classic example: an ex-convict who is confronted with his/her past in the media, years after the facts). Because it is primarily invoked in situations where an individual’s personal life is publicly exposed, usually by the media, a careful balancing exercise with other fundamental rights will be imperative. One of the key criteria in making this balance will often be how much time has passed. In the Österreichischer Rundfunk v Austria Case, for example, the ECtHR specified that the lapse of time since a conviction and release constitutes an important element in weighing an individual’s privacy interests against the public’s interest in publication. But, in another case, concerning the publication of a book by the private doctor of former French President Mitterrand, the Court held that the lapse of time was an argument in favour of the public’s interest over the privacy and medical confidentiality protections of the ex-President.

Data Protection Law – When based on the data protection framework, the right to be forgotten – or rather the right to erasure – seems to be more mechanical and straightforward. At least in theory. Under the current Directive, the right can be invoked when the data processing “does not comply with the provisions of the Directive, in particular because of the incomplete or inaccurate nature of the data” (art.12). In other words, it looks like the data subject could invoke his/her right to erasure when the controller fails to fulfil its obligations or ignores data subjects’ rights. Keeping in mind the concept of ‘Time’, the three most relevant elements are probably (1) the purpose specification and use limitation principle, (2) the need for a legitimate ground and (3) the data subject’s right to object.

The purpose specification principle actually constitutes some sort of benchmark against which the processing of personal data will be assessed over time. Besides having to be specific and explicit, the purpose also has to be legitimate. It goes without saying that the legitimacy of the purpose of processing can evolve over time, depending on a variety of factors. On top of that, over time the personal data might become unnecessary, irrelevant or inadequate to achieve the original (or a compatible) purpose (for more information, check the Article 29 WP Opinion 03/2013 on Purpose Limitation).

Secondly, the processing activities will continuously have to be tested against the legitimacy grounds in article 7 of the Directive. This is particularly relevant when the processing is based on the last legitimacy ground, which requires a careful balance to be made between all rights and interests at stake. These might, of course, evolve over time as well.

Thirdly, in principle the right to erasure can also be invoked when the data subject has successfully exercised his/her right to object. In order to exercise one’s right to object, it is necessary to put forward compelling and legitimate grounds (relating to one’s particular situation). It goes without saying that these grounds can include a variety of factors, time being one of them.

In the currently still pending Google Spain Case before the Court of Justice of the EU, for example, one of the primary arguments of the original plaintiff was the passing of time. The National Court explained that today it is possible to create very detailed personal profiles in just a couple of clicks, with information that used to be difficult to find. The lack of territorial and temporal limitations on the dissemination of information constitutes a danger to the protection of personal data. The Court further specified that originally lawful and accurate personal data may become outdated over time in the face of new events. Some of this information might actually generate social/professional/personal harm to the individual.

Finally, a few words about the draft Data Protection Regulation. Article 17 on the Right to be Forgotten and to Erasure – already rebranded as the pre-existing ‘right to erasure’ – specifically aims to give (back) some control to data subjects over their data. Without wanting to go into detail on this provision (which does not add that much to the existing regime, but rather emphasises existing rights and obligations), it is worth highlighting that the article does refer to the concept of ‘Time’ in paragraph 7. This provision stipulates that the controller should “implement mechanisms to ensure that the time limits established for the erasure of personal data […] are observed”. The Regulation also requires these time limits to be specified in the information provided to data subjects (art.14(1)(c)).

Concluding. First of all, technology makes it ever easier to store and find old information. Just think of the digitisation of old archives, facial recognition, geo-tagging, etc. This trend evidently upsets an increasing number of individuals. Depending on the relevant facts in each case, a number of legal frameworks might be used to request certain information to be removed. The general right to privacy seems to be used particularly in situations where private information is made public (again) by the media. From ECtHR (and national) case-law it can be deduced that the time-factor can either play in favour of removing the information (when deemed irrelevant, see the Österreichischer Rundfunk v Austria Case) or in favour of keeping the information available (when it has entered the public domain or when the information is of particular relevance in light of current events, see Aleksey Ovchinnikov v. Russia and Editions Plon v. France). In any case, it seems that of all the legal frameworks that might be applicable, data protection law in particular constitutes an increasingly attractive route to take. Not only does it have a broad scope of application, but unlike most other regimes, it does not require falsehood, malicious intent or even widespread publicity.

Regardless of what legal regime is used, it seems that in virtually all of these cases a balance of interests and rights will have to be made. And in quite a few situations time will be a relevant factor to take into account. To give yet another recent example, it is worth referring to the Advocate General’s opinion in the DRI & Seitlinger Case before the Court of Justice (C‑293/12; C‑594/12), released just last month. In this Opinion, the AG explicitly claimed that the Data Retention Directive is incompatible with the Charter of Fundamental Rights. One of the reasons he put forward was that the Directive does not respect the principle of proportionality, in allowing data retention for up to two years. Although the Directive’s ultimate objective is perfectly legitimate, the AG argued, there is no justification for extending the data retention period to anything beyond one year.

So, in short, it seems that the passing of time can be used to argue both ways – for or against removal. The importance of ‘time’ in determining the merits of removing information will be different in each individual case, but should not be overestimated either. Eventually, time will just be another factor in assessing the balance of rights and interests.

According to the Advocate General, Mr Cruz Villalón, the Data Retention Directive is incompatible with the Charter of Fundamental Rights

“In his Opinion delivered today, Advocate General Pedro Cruz Villalón takes the view that the Data Retention Directive is as a whole incompatible with the requirement, laid down by the Charter of Fundamental Rights of the European Union, that any limitation on the exercise of a fundamental right must be provided for by law.”

Press Release

Intermediary Liability – Automation = Neutrality = Exempted?

Tomorrow, Advocate General Jääskinen is to release his opinion in the much debated Google v Spain (aka Right to be Forgotten) case. According to Google’s adversaries, Search Engines are to be considered data controllers under the Data Protection Directive for the personal data (on the websites) they refer to and are therefore (under certain circumstances) liable to remove links (for more info, see my previous blogpost on this case).

An often-invoked counter-argument to liability assertions by Internet intermediaries relates to the automated nature of their processing activities. In other words, intermediaries often argue that they are merely providing a neutral service, content-agnostic and fully automated. After all, it is claimed, decisions are made by algorithms and no human eyes actually ‘see’ the information. In 2010 the CJEU seems to have acknowledged such an interpretation, stating that services that only perform activities ‘of a mere technical, automatic and passive nature … having neither knowledge of nor control over the information which is transmitted or stored’ should be exempted (Louis Vuitton v Google, C-236/08). In the UK, Justice Eady has also ruled that automation precludes intentionality (in that case, Google was not held liable for defamatory snippets it displayed in its search results).

The argument that automation equals neutrality, however, seems to be falling apart. Being a mere (automated) ‘organizing agent’ does not necessarily entail neutrality, nor does it necessarily validate the exemption from liability. After all, as U. Kohl aptly describes: “both organization and automation require human judgment and thus have assumptions, values and goals embedded into them.” Put differently, the use of algorithms does not imply neutrality. Instead of looking at the automated nature of an intermediary service provider, one should look at how it is designed. This interpretation was also followed by AG Jääskinen in L’Oreal v eBay. In his Opinion, he clearly stated that neutrality does not even constitute the right test to decide on the exemption from liability. Instead, one should look at the type of activities of the relevant service provider. The liability of a search engine, for example, may depend on whether it simply refers to information, displays snippets or autosuggests the information.

Particularly with regard to defamatory content, a considerable number of widely diverging cases have emerged over the last few years. A number of French courts have ruled search engines to be liable for auto-suggesting information. Very recently, the German Federal Court of Justice also overturned two decisions, requiring Google to block defamatory auto-complete search suggestions. In a 2011 Italian judgment, the court followed the plaintiff’s claim that Google should be considered to have ‘created’ the auto-complete terms. Even if not considered to create the actual terms, search engines still make a deliberate decision to adopt autocomplete functionality and to design the underlying algorithms. Moreover, it is particularly hard to rely on the neutrality argument (based on automation and alleged content-agnosticism) after the intermediary has received a notification (infra).

Put briefly, we see a crumbling of the ‘automation=neutrality=exemption-from-liability’ argument. Particularly taking into account AG Jääskinen’s previous statements in L’Oreal v eBay (specifically expressing concern about the argument that “in order to establish whether the liability of a paid internet referencing service provider may be limited under Article 14 of Directive 2000/31, it is necessary to examine whether the role played by that service provider is neutral in the sense that its conduct is merely technical, automatic and passive, pointing to a lack of knowledge or control of the data which it stores”), it is most likely that he will come to a similar conclusion in his Opinion in the Google v Spain Case, which will – normally – be released tomorrow.

Google v Spain at the Court of Justice of the EU

One week ago, the so-called ‘Google v Spain’ or ‘Right to be Forgotten’ case was heard before the Court of Justice of the EU (C-131/12).

Put briefly, Google was ordered by a Spanish Court to remove certain search results – relating to Mr Costeja González – from its index. The contentious search results linked back to Mr Costeja González’s insolvency proceedings, published in a newspaper in 1998. Google appealed and the Audiencia Nacional referred the case to the ECJ, requesting a preliminary ruling on the three main questions in this case: (1) Territoriality: do the contentious facts fall within the territorial scope of application (art.4) of the Data Protection Directive (DPD)? (2) Can Google – as a Search Engine (SE) – be considered a (personal) data controller? (3) What is the extent of the ‘Right to be Forgotten’ aka the ‘right to erasure and blocking of data’? Currently there are over 200 similar cases pending before Spanish Courts (and quite possibly in other EU jurisdictions as well).

SUMMARY OF MAIN ARGUMENTS:

Google

(1) Territoriality

▪       The contentious activity at stake here is not ‘carried out in the context of’ the activities of Google Spain – being an EU establishment of Google Inc. (Article 4(1)(a) DPD). Google Spain is not involved in the SE activities itself (Google Inc. is). Its only relevant activity with regard to the SE is to provide advertising space, and the behavioural advertising model is not based on the indexed content.

▪       Art.4(1)(c) is not applicable either, as the mere provision of a service in an EU Member State (even if a Spanish domain name is used) cannot be considered ‘use of equipment’ within the meaning of the DPD. The use of web spiders to index content should not be considered ‘use of equipment’ either. The use of cookies would constitute ‘use of equipment’, but is not relevant in this case.

(2) Google as Controller

▪       Google collects, processes and indexes data indiscriminately, and is ignorant of whether or not the content of the webpages being indexed contains personal data. There is an obvious lack of intent, which distinguishes this case from the Lindqvist and Satamedia cases.

▪       The decision to take content down should be taken by the party that is best placed to do so. Given that Google does not control any of the data held on the websites it indexes, nor has any intent to do so, the publisher of the original webpage is best placed to decide.

▪       Even if one were to consider Google to be processing personal data, the company still argues it is not a data controller because: (a) there is no intent to process personal data; (b) Google does not verify whether indexed data is personal or not; (c) the publisher has final and continuing control; (d) if the publisher removes the (personal) data, Google does so as well; (e) Google cannot verify the legitimacy of the personal data processing; (f) Google only plays a secondary/accessory role in disseminating the information; (g) articles 12-15 of the eCommerce Directive; (h) the Article 29 Working Party’s Opinion 1/2008 (p13-14) endorses SEs’ role as intermediaries in situations such as the one at hand; (i) Google’s role can be compared to that of a telecom operator, which is stricto sensu also processing personal data but is not liable under the DPD either – both are mere intermediaries transferring data.

(3) Right to be Forgotten/Erasure v-a-v Search Engines

▪       On the question whether a SE can be asked to remove links directly, without the data subject first having to go to the original publisher, Google raises the Freedom of Expression (FoE) and Freedom of Information (FoI) flag: (1) the original publishers will be deprived of an important channel of communication and (2) Internet users in general will have access to less information. The responsibility of publishers should not be shifted onto the shoulders of intermediaries such as search engines. This would also violate the proportionality principle, Google argues: (a) obliging Google to erase search results does not prevent the information from appearing elsewhere; (b) Google cannot assess the legality of the content; (c) Google can only remove the link to the webpage entirely (it cannot just anonymise certain bits of the webpage), which would constitute overkill as the webpages will usually contain much more information than just the contentious personal data.

Plaintiff (+ allies: Spanish & Austrian Government, European Commission)

(1) Territoriality

▪       For the DPD not to apply to the issue at stake, Google Spain’s activities would have to be sufficiently distinguishable from those of Google Inc. This is clearly not the case, according to the plaintiff (and its allies). Google Spain’s activity is not merely ancillary, but constitutes an integral part of Google Inc.’s activities – simply carried out for a particular jurisdiction.

(2) Google as Controller

▪       The DPD was written before the widespread use of the Internet, and of SEs in particular. The DPD should, therefore, be applied creatively to Google. According to the plaintiff, Google is a data controller as it actively collects and processes personal data (referring to art.2(b) DPD ‘dissemination or otherwise making available’ and Lindqvist: any processing of data – even if the data is already published – constitutes personal data processing within the scope of the DPD). Its activity constitutes a separate ‘publication’ from the original one.

▪       Google can even be considered a data controller v-a-v the content of the webpages it indexes, because: (a) it determines the means (algorithms, web spiders, etc.) and purpose (the inclusion of information in search results) of processing; (b) Google actively directs and controls the flow of information, and its actions cannot be compared to the bidirectional traffic management of a telecom operator. In other words, it is not a ‘neutral’ intermediary. Google provides an added-value service, which it cannot provide without acting autonomously.

▪       It was also argued that the criteria of art.2(a) and (b) are ‘objective’ in nature. The intent of the ‘controller’ is not relevant. Hence SEs are data controllers as they are de facto processing personal data.

▪       Google – allegedly – also has a separate responsibility because it makes information much more widely available, it can provide a complete picture of an individual, and it has its own specific (commercial) purposes. Google does not “control” the initial uploading of content, but it does control its aggregation and subsequent dissemination. The responsibility of SEs is thus distinct and separate.

(3) Right to be Forgotten/Erasure v-a-v Search Engines

▪       It is stressed that Google is not asked to conduct a priori monitoring of all the content it indexes. The plaintiff (and its allies) rather advocate a specific notice-and-takedown regime, similar to that for copyright claims. Only when a specific complaint is made regarding a specific piece of content should Google remove the search results.

▪       In order to invoke such a right, the data subject should at least demonstrate a violation of the legitimate processing requirement (art.7 DPD) or the principle of data quality (art.6 DPD).

▪       On the risk to FoI and FoE: When there is a conflict between different rights, a balance should be made. Neither one should automatically prevail.

After this first round of arguments, the court put several questions to all the parties involved. Most of them related to the practical implications of a potential obligation on Google to remove search results.

The Advocate General will finish his Opinion by June 25th, and a final judgment should follow soon thereafter.

Together with some colleagues at ICRI, I will soon publish a working paper making a more thorough legal analysis of all the issues at stake in this particular case. To be Continued…

EU Court Of Justice Says Software Functionality Is Not Subject To Copyright

In a relatively short ruling, the court points out that, while software itself may be covered by copyright, “the ideas and principles which underlie any element of a computer program, including those which underlie its interfaces, are not protected by copyright.” Basically, the EUCJ properly recognized the difference between protecting the idea (not copyrightable) and the expression (copyrightable). The court points out that actual code can still be covered, but the features generated out of that code are a different story:

via Techdirt.

ECJ rules in favor of Net Freedom and against censorship

Judgment in Case C-360/10 Belgische Vereniging van Auteurs, Componisten en Uitgevers (SABAM) v Netlog NV: 

The owner of an online social network cannot be obliged to install a general filtering system, covering all its users, in order to prevent the unlawful use of musical and audio-visual work

Such an obligation would not respect the prohibition on imposing a general monitoring obligation on that provider, nor the requirement that a fair balance be struck between the protection of copyright, on the one hand, and the freedom to conduct business, the right to protection of personal data and the freedom to receive or impart information, on the other.