CJEU is asked to rule on the ‘Right to be Forgotten’ again

The Italian Supreme Court recently asked the CJEU for a preliminary ruling on two questions regarding the ‘right to be forgotten’.

[Disclaimer: this information is loosely translated from official documents published by the Dutch Ministry of Foreign Affairs. The original request can be found here [it]. The CJEU’s documents folder (still empty at the time of writing) can be found here.]

Facts & Procedure (Case C-398/15 – Manni)

The original plaintiff's (Salvatore Manni's) business had gone bankrupt in 1992. This fact was recorded in a public Company Register, managed by the defendant (Camera di commercio di Lecce). The plaintiff argued that he (in particular his business of selling houses) suffered damages as a result, and requested that the defendant anonymise his name or restrict access to the register. The defendant countered that the Companies Register is a public database whose primary function is to provide (on request) relevant information about companies. The case escalated all the way to the Italian Supreme Court (Corte Suprema di Cassazione), which referred two questions to the CJEU.

Questions referred

The Italian Court essentially wonders whether information legally consigned to (and made public by) the defendant can be erased, anonymised or access-restricted after a certain time. The Court does point out the importance of the public Register (for legal certainty). Referring to the Google Spain Case (C-131/12), the Court asks not whether the information should be erased from the Register, but whether limits should be placed on the (further) use of this public information by third parties.

  1. Does Article 6(1)(e) of the Data Protection Directive prevail over the disclosure through the company register mandated by Directive 68/151/EEC and corresponding national legislation, to the extent that the latter requires that anyone should have access to the personal data in the register without restrictions?
  2. Does Article 3 of Directive 68/151/EEC allow – in derogation from the rule that information in the Company Register is kept public for an indeterminate time and can be consulted by anyone – the information to be made no longer ‘public’, though still available to a specific group, with this to be decided on a case-by-case basis by the Register’s manager?


The underlying facts in this ‘Manni’ case are strikingly similar to those in the Google Spain case. Instead of focusing on the third party, however, the CJEU is now asked to evaluate the obligations of the original publisher. In Google Spain, the national DPA had already decided – before the case even reached the CJEU – that the original publication could not be touched. In the Manni case, the original source also has a legal obligation to publish. Yet it is not asked to remove the personal data from the source altogether; the question is only whether the source can be asked to make the data less accessible. This raises very interesting questions – left unanswered in Google Spain – as to the obligations resting on the shoulders of original publishers and the different degrees of publicity.

To be continued…!

R2E – R2O – RtbF: What’s in a Name?

– Excerpt from Draft Article –

The Google Spain case has definitely added fuel to the fire of the so-called ‘Right to be Forgotten’ debate. Much of the discourse, however, has mixed up several related concepts. Hence, it is important (once again) to distinguish and clarify three distinct notions: the ‘right to be forgotten’, the ‘right to object’ and the ‘right to erasure’. First of all, the so-called ‘right to be forgotten’ is not mentioned in the Directive (and it has been removed from the latest DPR draft as well), but is rather used as a catchphrase in the rhetoric of the different sides in the debate (pro and contra). In fact, the ‘right to be forgotten’ can be described as an umbrella term, aiming to encapsulate several distinct rights. Its very name already causes confusion by suggesting an obligation on third parties to ‘forget’. Instead, it is a rather clumsy translation into law of a broader policy goal. In this regard, it can be traced back to the French droit à l’oubli. This ‘right’ has traditionally been applied in situations where an individual was confronted with publicity of his personal life in the media in a disproportionate or unreasonable way (e.g. an ex-convict who sees new articles appearing decades after the facts). Nevertheless, it has no dedicated legal ground and is usually invoked on the basis of a variety of legal frameworks (e.g. the right of personal portrayal, defamation, the general right to privacy, etc.). Given the potential conflict with the right to freedom of expression, the right has only been applied sporadically by courts. In any situation, this right implies the presence of a (potential) imminent harm that can only be prevented by removing the information or at least preventing its (further) publicity.

The right to object and the right to erasure can be found in the data protection framework. The rationale of these rights is not so much to prevent or withdraw the publication of one’s personal data, but rather to empower data subjects. Instead of rights to ‘stop me from speaking about you’, they are intended as a check on how personal data is used, allowing individuals to control the use of their personal data over time. The right to object (article 14) can be invoked on the basis of compelling and legitimate grounds relating to one’s particular situation. It should be emphasized, however, that this right only relates to a specific processing activity. When successfully exercised, the controller will no longer be allowed to process the personal data for the purposes objected to. The same personal data might still be processed for different purposes, for as long as these other activities comply with the data protection framework. A social network, for example, will not be able to process my personal data for direct marketing anymore, but can still use it for other purposes (e.g. statistics, personalisation, etc.). The right to erasure (art.12(b)), on the other hand, addresses the personal data itself. It can be invoked whenever the controller does not comply with the Directive, in particular because of the incomplete or inaccurate nature of the data. In other words, when the data subject can demonstrate that the controller has violated any of its legal obligations under Directive 95/46, he/she can – depending on the facts – obtain the removal of the personal data. If successful, the data cannot be used for any other purpose.

To summarise, whereas the right to object relates to a specific processing activity, the right to erasure relates to the data itself. The ‘right to be forgotten’ constitutes a catchy – though deceptive – policy goal.


Court of Justice Finally issues Judgment in Google Spain Case (C-131/12)

*This BlogPost is based on a piece written for the LSE Media Policy Blog and Internet Policy Review*


The Court of Justice of the European Union (CJEU) finally released its long-awaited judgment in the Google Spain (C-131/12) case. In short, the Court decided that individuals do have a right to request search engines to remove links to webpages when the individual’s name is used as a search query. This ruling cannot be appealed, and the case is now referred back to the national court. Theoretically, it is still possible for Google to take the case to the European Court of Human Rights (based on article 10 ECHR) once the national court makes a final decision.

Although the case is often referred to as the Right to be Forgotten case, it does not hinge upon the similarly named provision in the proposed Data Protection Regulation. Instead, the main legal basis for this decision was the Data Protection Directive 95/46 (hereafter: ‘the Directive’), including the rights to object (art.14) and to erasure (art.12(b)). The case is particularly interesting because it lies at the intersection of data protection law and freedom of expression (a detailed discussion of this interaction is available here).


The facts of the case concerned a Spanish citizen who was subject to bankruptcy proceedings in the nineties. Spanish law required that announcements of the public auction following this bankruptcy be published in a local newspaper (La Vanguardia). In the late 2000s, the citizen discovered that links to this newspaper article appeared as the top results when his name was entered into Google’s search engine. All requests directed to the newspaper to take down – or at least anonymise – the respective article were unsuccessful. After all, it had had the legal obligation to publish the information in the first place. The Spanish data protection authority did rule, however, that Google should take down links to the article appearing upon a search of the individual’s name. The search engine appealed this decision and the case was brought before the Audiencia Nacional, which in turn referred three questions to the Court of Justice of the EU for a preliminary ruling.

Last June, Advocate General Jääskinen issued his Opinion in this case, which in turn sparked considerable academic debate. In this Opinion, the AG concluded that data subjects do not have a right to erasure vis-à-vis search engines with regard to information published legally on third parties’ web pages (§.138 of the Opinion).



Given the complexity of the case and the nuanced wording of the decision, it will take many readings to form a more definitive opinion about this ruling. However, here are my first thoughts.

The CJEU was asked to answer three main questions, relating to (1) the territorial scope of the Directive; (2) the material and personal scope of the Directive; and (3) whether or not data subjects have a right to object/erasure when it comes to search engines directly.

Scope of Application

With regard to the first two questions, the Court was rather straightforward. To the extent that ‘the operator of a search engine sets up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State’, the processing falls within the territorial scope of application of the Directive (art.4(1)a) (§.60).

Given the fact that search engines ‘collect’, ‘retrieve’, ‘record’, ‘organize’, ‘store’ and ‘make available’, they do process personal data, and thus fall within the material scope of application of the Directive (art.2(b)) (§.28-29).

The Court also specified that search engines’ activities can be distinguished from (and are additional to) those carried out by the original publisher(s). Hence, they should be considered controllers (art.2(d)) (§.41).


Right to be Forgotten?

The third category of questions presented to the CJEU related to the so-called right to be forgotten and constitutes the most controversial aspect of this case. Some of the key issues are:

  • Limited scope of the judgment

First of all it is important not to overemphasise the impact of this judgment on the right to freedom of expression (art. 11 Charter; art.10 ECHR). In this particular case, the request related specifically to the link between using an individual’s name as a search query and the search result referring to a particular webpage. In other words, even if the request is granted, the same webpage can still be reached through other – maybe more relevant – search terms.

  • No obligation to delete, but an obligation to balance

One should not conclude that any individual can now request search engines to delete links to webpages when his or her name is used as a search term. Instead, such requests will still have to comply with the requirements of article 12(b) (right to erasure) and/or article 14 (right to object). Put briefly, these provisions require a balance to be struck between opposing rights and interests (§.74; 76). Hence, the plaintiff will have to substantiate his/her request and, upon receiving such a request, the search engine will have to strike the necessary balance. If the search engine does not grant the request, the CJEU specified that ‘the data subject can bring the matter before the supervisory or judicial authority so that it carries out the necessary checks’ (§.77). In other words, search engines are not obliged to comply with takedown requests unless a supervisory or judicial authority orders them to.

  • Independent responsibility of Search Engines

This observation ties back to the personal scope of the Directive. It was emphasised throughout the judgement that Google’s activities can clearly be distinguished from those of the original publishers. The potential harm or negative consequences vis-à-vis the data subject will in many cases not result from an obscure publication in a local online newspaper, but rather from the widespread (and often decontextualised) availability of the information through search engines. A logical consequence is that even where the original content is published lawfully, data subjects will still be able to request removal from search engines directly. It is important to distinguish this from potential requests directed to the original publisher (e.g. to remove or blur out his/her personal data) (§.39).

  • Over-responsibilisation?

Upon first reading, one could claim the judgment puts too big a burden on search engines. After all, paragraph 38 specifically states that the operator of a search engine must comply with all the requirements in the Directive. It goes without saying that subjecting search engines to the full application of the data protection Directive gives rise to considerable concerns. On the other hand, the judgment does specify that search engines only need to comply with the Directive ‘within the framework of their responsibilities, powers and capabilities’ (§.38; 83). It is still too early, however, to predict how this will play out in practice.

  • Presumption that data subject’s rights trump all others

One of my most important concerns at this stage relates to the Court’s presumption that ‘data subject’s rights […] override, as a general rule, the interest of internet users…’ as well as the economic interests of the search engine operator itself (§.81). In other words, the Court seems to suggest that an imbalance of interests should be presumed, favouring privacy interests over all others. The Court does seem to nuance this, however, by stating that the balance might depend on the nature of the information, its sensitivity, the interest of the public, the role of the relevant individual in public life, etc. Needless to say, this wording is not conducive to legal certainty.


Today’s ruling by the Court of Justice in Google Spain undoubtedly raised many eyebrows. Surprisingly, it goes almost entirely against the Opinion of the Advocate General issued in June 2013. Nevertheless, it is still too early to draw general conclusions from the judgement. Even though at first glance it seems to considerably threaten freedom of expression/information interests, much of the wording turns out to be nuanced and limited in scope when looked at more closely. Additionally, the decision is entirely based on the existing legal framework (Directive 95/46). It is hard to predict how the judgment will interact with the future data protection Regulation, which is currently being drafted.

Hosting Platforms after the Italian GoogleVideo Case – Data Controllers or not?

In its long-awaited judgement, the Italian Supreme Court ruled that Google Video could not be deemed a data controller with regard to the videos it hosts on its platform. As a result, it cannot be held responsible for the dissemination of these videos. The Court specified that the rules ‘presuppose actual decision-making power over (a) the purposes and means of the relevant processing (dissemination to the public); and (b) the balancing between the different rights and interests at stake’. It can be deduced from the existing framework that this decision-making power depends on the existence of actual knowledge. In other words, Google Video only becomes responsible (a data controller) from the moment it is made aware of the content. This interpretation, the Court explained, is in line with the eCommerce Directive (exemption of hosting providers and no general obligation to monitor, arts. 14-15).

It is worth noting, however, that many processing activities are relevant in this context. The dissemination of the video (containing personal data) is one processing activity, for which the uploader should be considered controller. But besides this, the video (and hence the personal data contained within it) is potentially subject to many other processing activities as well (analysis for behavioural marketing purposes, facial recognition, etc.). With regard to this second strand of uses of the data, a strong argument can be made that the hosting platform is the data controller. After all, it determines the purpose and means of these specific activities.

Because the case at hand mainly concerned the activity of dissemination, the original controller bears primary responsibility. But this should not overshadow the responsibilities of hosting platforms (and the like) for the plethora of other processing activities to which the data is subject.


Intermediary Liability – Automation = Neutrality = Exempted?

Tomorrow, Advocate General Jääskinen is to release his Opinion in the much-debated Google v Spain (aka Right to be Forgotten) case. According to Google’s adversaries, search engines are to be considered data controllers under the Data Protection Directive for the personal data (on the websites) they refer to, and are therefore (under certain circumstances) obliged to remove links (for more info, see my previous blogpost on this case).

An often-invoked counter-argument to liability assertions by Internet intermediaries relates to the automated nature of their processing activities. In other words, intermediaries often argue that they merely provide a neutral service: content-agnostic and fully automated. After all, it is claimed, decisions are made by algorithms and no human eyes actually ‘see’ the information. In 2010 the CJEU seemed to acknowledge such an interpretation, stating that services that only perform activities ‘of a mere technical, automatic and passive nature … having neither knowledge of nor control over the information which is transmitted or stored’ should be exempted (Louis Vuitton v Google, C-236/08). In the UK, Justice Eady has also ruled that automation precludes intentionality (in that case, Google was not held liable for defamatory snippets it displayed in its search results).

The argument that automation equals neutrality, however, seems to be falling apart. Being a mere (automated) ‘organizing agent’ does not necessarily entail neutrality, nor does it necessarily validate an exemption from liability. After all, as U. Kohl aptly describes: “both organization and automation require human judgment and thus have assumptions, values and goals embedded into them.” Put differently, the use of algorithms does not imply neutrality. Instead of looking at the automated nature of an intermediary service, one should look at how it is designed. Such an interpretation was also followed by AG Jääskinen in L’Oreal v eBay. In his Opinion, he clearly stated that neutrality does not even constitute the right test for deciding on the exemption from liability. Instead, one should look at the type of activities of the relevant service provider. The liability of a search engine, for example, may depend on whether it simply refers to information, displays snippets or autosuggests the information.
Particularly with regard to defamatory content, a considerable number of – widely diverging – cases has emerged over the last few years. A number of French Courts have held search engines liable for auto-suggesting information. Very recently, the German Federal Court of Justice also overturned two decisions, requiring Google to block defamatory auto-complete search suggestions. In a 2011 Italian judgment, the Court followed the plaintiff’s claim that Google should be considered to have ‘created’ the auto-complete terms. Even if a search engine is not considered to have created the terms themselves, it still makes a deliberate decision to adopt autocomplete functionality and to design the underlying algorithms. Moreover, it is particularly hard to rely on the neutrality argument (based on automation and alleged content-agnosticism) after the intermediary has received a notification (infra).

Put briefly, we see a crumbling of the ‘automation = neutrality = exemption from liability’ argument. Particularly taking into account AG Jääskinen’s previous statements in L’Oreal v eBay (where he specifically expressed concern about the argument that “in order to establish whether the liability of a paid internet referencing service provider may be limited under Article 14 of Directive 2000/31, it is necessary to examine whether the role played by that service provider is neutral in the sense that its conduct is merely technical, automatic and passive, pointing to a lack of knowledge or control of the data which it stores”), it is most likely that he will come to a similar conclusion in his Opinion in the Google v Spain case, which should – normally – be released tomorrow.

Google v Spain at the Court of Justice of the EU

One week ago, the so-called ‘Google v Spain’ or ‘Right to be Forgotten’ case was heard before the Court of Justice of the EU (C-131/12).

Put briefly, Google was ordered by a Spanish Court to remove certain search results – relating to Mr. Carlos José – from its index. The contentious search results linked back to Mr. José’s insolvency proceedings, published in a newspaper in 1998. Google appealed and the Audiencia Nacional referred the matter to the ECJ, requesting a preliminary ruling on the three main questions in this case: (1) Territoriality: do the contentious facts fall within the territorial scope of application (art.4) of the Data Protection Directive (DPD)? (2) Can Google – as a Search Engine (SE) – be considered a controller of (personal) data? (3) What is the extent of the ‘Right to be Forgotten’, aka the ‘right to erasure and blocking of data’? Currently there are over 200 similar cases pending before Spanish Courts (and quite possibly in other EU jurisdictions as well).



Defendant (Google)
(1) Territoriality

▪       The contentious activity at stake here is not ‘carried out in the context of’ the activities of Google Spain – being an EU establishment of Google Inc. (Article 4(1)(a) DPD). Google Spain is not involved in the SE activities itself (Google Inc is). Its only relevant activity with regard to the SE is providing advertising space, and the behavioural advertising model is not based on the indexed content.

▪       Art.4(1)(c) is not applicable either, as the mere provision of a service in an EU Member State (even if a Spanish domain name is used) cannot be considered ‘use of equipment’ within the meaning of the DPD. The use of web spiders to index content should not be considered ‘use of equipment’ either. The use of cookies would constitute ‘use of equipment’, but is not relevant in this case.

(2) Google as Controller

▪       Google collects, processes and indexes data indiscriminately. It is ignorant of whether or not the content of the webpages being indexed contains personal data. There is an obvious lack of intent, which distinguishes this case from the Lindqvist and Satamedia cases.

▪       The decision to take content down should be taken by the party best placed to do so. Given that Google does not control any of the data held on the websites it indexes, nor has any intent to do so, the publisher of the original webpage is best placed to decide.

▪       Even if one were to consider Google to be processing personal data, the company still argues it is not a data controller because: (a) there is no intent to process personal data; (b) Google does not verify whether indexed data is personal or not; (c) the publisher has final and continuing control; (d) if the publisher removes the (personal) data, Google does so as well; (e) Google cannot verify the legitimacy of the personal data processing; (f) Google only plays a secondary/accessory role in disseminating the information; (g) articles 12-15 of the eCommerce Directive; (h) the Article 29 Working Party’s Opinion 1/2008 (p.13-14) endorses SEs’ role as intermediaries in situations such as the one at hand; (i) Google’s role can be compared to that of a telecom operator, which stricto sensu also processes personal data but is not liable under the DPD either, being a mere intermediary transferring data.

(3) Right to be Forgotten/Erasure v-a-v Search Engines

▪       On the question whether a SE can be asked to remove links directly, without the data subject first having to go to the original publisher, Google raises the Freedom of Expression (FoE) and Freedom of Information (FoI) flag: (1) the original publishers will be deprived of an important channel of communication and (2) Internet users in general will have access to less information. The responsibility of publishers should not be shifted onto the shoulders of intermediaries such as search engines. This would also violate the proportionality principle, Google argues: (a) obliging Google to erase search results does not prevent the information from appearing elsewhere; (b) Google cannot assess the legality of the processing; (c) Google can only remove the link to the webpage entirely (it cannot anonymise specific bits within the webpage), which would constitute overkill, as webpages will usually contain much more information than just the contentious personal data.

Plaintiff (+ allies: Spanish & Austrian Government, European Commission)

(1) Territoriality

▪       Whether the DPD applies to the issue at stake depends on whether or not Google Spain’s activities can be sufficiently distinguished from those of Google Inc. This is clearly not the case, according to the plaintiff (and its allies): Google Spain’s activity is not merely ancillary, but constitutes an integral part of Google Inc’s activities, simply carried out for a particular jurisdiction.

 (2) Google as Controller

▪       The DPD was written before the widespread use of the Internet, and of SEs in particular. The DPD should therefore be applied creatively to Google. According to the plaintiff, Google is a data controller as it actively collects and processes personal data (referring to art.2(b) DPD, ‘dissemination or otherwise making available’, and Lindqvist: any processing of data – even if the data has already been published – constitutes personal data processing within the scope of the DPD). Its activity constitutes a separate ‘publication’ from the original one.

▪       Google can even be considered a data controller vis-à-vis the content of the webpages it indexes, because: (a) it determines the means (algorithms, web spiders, etc.) and purpose (including information in search results) of the processing; (b) Google actively directs and controls the flow of information, and its actions cannot be compared to the bidirectional traffic management of a telecom operator. In other words, it is not a ‘neutral’ intermediary. Google provides an added-value service, which it cannot provide without acting autonomously.

▪       It was also argued that the criteria of art.2(a) and (b) are ‘objective’ in nature. The intent of the ‘controller’ is not relevant. Hence, SEs are data controllers as they are de facto processing personal data.

▪       Google – allegedly – also has a separate responsibility because it makes information much more widely available, can provide a complete picture of an individual and has its own specific (commercial) purposes. Google does not “control” the initial uploading of content, but it does control its aggregation and subsequent dissemination. The responsibility of SEs is thus distinct and separate.

(3) Right to be Forgotten/Erasure v-a-v Search Engines

▪       It is stressed that Google is not asked to conduct a priori monitoring of all the content it indexes. The plaintiff (and its allies) rather advocate a specific notice-and-takedown regime, similar to that for copyright claims. Only when a specific complaint is made regarding a specific piece of content should Google remove the relevant search results.

▪       In order to invoke such a right, the data subject should at least demonstrate a violation of the legitimate processing requirement (art.7 DPD) or the principle of data quality (art.6 DPD).

▪       On the risk to FoI and FoE: when there is a conflict between different rights, a balance should be struck. Neither should automatically prevail.

After this first round of arguments, the court asked several questions to all parties involved. Most of them related to the practical implications of a potential obligation on Google to remove search results.

The Advocate General will finish his Opinion by June 25th, and a final judgement should follow soon thereafter.

Together with some colleagues at ICRI, I will soon publish a working paper making a more thorough legal analysis of all the issues at stake in this particular case. To be Continued…