From the Maw-Law Blog: “On December 12, 2015, Brahim Zaibat, a dancer and choreographer, posted on social media a selfie he had taken two years ago, showing him in an airplane, just a seat behind the one where Jean-Marie Le Pen, the honorary president of the French National Front, had fallen asleep. …” Read more at: http://www.maw-law.com/uncategorized/selfie-privacy-freedom-speech-collide-france/
The Court of Justice of the European Union (CJEU) has finally released its long-awaited judgment in the Google Spain (C-131/12) case. In short, the Court decided that individuals do have a right to request that search engines remove links to webpages when the individual’s name is used as a search query. This ruling cannot be appealed, and the case is now referred back to the national court. Theoretically, it is still possible for Google to take the case to the European Court of Human Rights (based on article 10 ECHR) once the national court makes a final decision.
Although the case is often referred to as the Right to be Forgotten case, it does not hinge upon the similarly named provision in the proposed Data Protection Regulation. Instead, the main legal basis for this decision was Data Protection Directive 95/46 (hereafter: ‘the Directive’), including the rights to object (art.14) and to erasure (art.12(b)). The case is particularly interesting because it lies at the intersection of data protection law and freedom of expression (a detailed discussion of this interaction is available here).
The facts of the case concerned a Spanish citizen who was subject to bankruptcy proceedings in the nineties. Spanish law required that the public auction following this bankruptcy be announced in a local newspaper (La Vanguardia). In the late 2000s, the citizen discovered that links to this newspaper article appeared as the top results when his name was entered into Google’s search engine. All requests directed to the newspaper to take down – or at least anonymise – the article in question were unsuccessful. After all, the newspaper had a legal obligation to publish the information in the first place. The Spanish data protection authority did rule, however, that Google should take down links to the article appearing when the individual’s name is entered. The search engine appealed this decision and the case was brought before the Audiencia Nacional, which in turn referred three questions to the Court of Justice of the EU for a preliminary ruling.
Last June, Advocate General Jääskinen issued an Opinion in this case, which sparked considerable academic debate. In that Opinion, the AG concluded that data subjects do not have a right to erasure vis-à-vis search engines with regard to information published legally on third parties’ web pages (§.138 of the Opinion).
Given the complexity of the case and the nuanced wording of the decision, it will take many readings to form a more definitive opinion about this ruling. However, here are my first thoughts.
The CJEU was asked to answer three main questions, relating to (1) the territorial scope of the Directive; (2) the material and personal scope of the Directive; and (3) whether or not data subjects have a right to object/erasure when it comes to search engines directly.
Scope of Application
With regard to the first two questions, the Court was rather straightforward. To the extent that ‘the operator of a search engine sets up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State’, the processing falls within the territorial scope of application of the Directive (art.4(1)(a)) (§.60).
Given the fact that search engines ‘collect’, ‘retrieve’, ‘record’, ‘organize’, ‘store’ and ‘make available’, they do process personal data, and thus fall within the material scope of application of the Directive (art.2(b)) (§.28-29).
The Court also specified that search engines’ activities can be distinguished from (and are additional to) those carried out by the original publisher(s). Hence, they should be considered controllers (art.2(d)) (§.41).
Right to be Forgotten?
The third category of questions presented to the CJEU relates to the so-called right to be forgotten and constitutes the most controversial aspect of this case. Some of the key issues are:
- Limited scope of the judgment
First of all, it is important not to overemphasise the impact of this judgment on the right to freedom of expression (art.11 Charter; art.10 ECHR). In this particular case, the request related specifically to the link between using an individual’s name as a search query and the search result referring to a particular webpage. In other words, even if the request is granted, the same webpage can still be reached through other – maybe more relevant – search terms.
- No obligation to delete, but an obligation to balance
One should not conclude that any individual can now request search engines to delete links to webpages when their name is used as a search term. Instead, such requests will still have to comply with the requirements under article 12(b) (right to erasure) and/or article 14 (right to object). Put briefly, these provisions require a balance to be struck between opposing rights and interests (§.74; 76). Hence, the plaintiff will have to substantiate his/her request and, upon receiving such a request, the search engine will have to strike the necessary balance. If the search engine does not grant the request, the CJEU specified that ‘the data subject can bring the matter before the supervisory or judicial authority so that it carries out the necessary checks’ (§.77). In other words, search engines are not obliged to comply with takedown requests unless a supervisory or judicial authority orders them to do so.
- Independent responsibility of Search Engines
This observation ties back to the personal scope of the Directive. It was emphasised throughout the judgment that Google’s activities can clearly be distinguished from those of the original publishers. The potential harm or negative consequences vis-à-vis the data subject will in many cases not result from an obscure publication in a local online newspaper, but rather from the widespread (and often decontextualised) availability of the information through search engines. A logical consequence is that even though the original content is published lawfully, data subjects will still be able to request removal from search engines directly. It is important to distinguish this from potential requests directed to the original publisher (e.g. to remove or blur out his/her personal data) (§.39).
Upon first reading, one could claim the judgment places too big a burden on search engines. After all, paragraph 38 specifically states that the operator of a search engine must comply with all the requirements of the Directive. It goes without saying that subjecting search engines to the full application of the data protection Directive gives rise to considerable concerns. On the other hand, the judgment does specify that search engines only need to comply with the Directive ‘within the framework of their responsibilities, powers and capabilities’ (§.38; 83). It is still too early, however, to predict how this will play out in practice.
- Presumption that data subject’s rights trump all others
One of my most important concerns at this stage relates to the Court’s presumption that ‘data subject’s rights […] override, as a general rule, the interest of internet users…’, as well as the economic interests of the search engine operator itself (§.81). In other words, the Court seems to suggest that an imbalance of interests should be presumed, favouring privacy interests over all others. However, the Court does nuance this by stating that the balance might depend on the nature of the information, its sensitivity, the interest of the public, the role of the relevant individual in public life, etc. Needless to say, this wording is not conducive to legal certainty.
Today’s ruling by the Court of Justice in Google Spain undoubtedly raised many eyebrows. Surprisingly, it goes almost entirely against the Opinion of the Advocate General issued in June 2013. Nevertheless, it is still too early to draw general conclusions from the judgment. Even though at first glance it seems to considerably threaten freedom of expression/information interests, much of the wording appears nuanced and limited in scope when looked at more closely. Additionally, the decision is based entirely on the existing legal framework (Directive 95/46). It is hard to predict how the judgment will interact with the future data protection Regulation, which is currently being drafted.
Interesting thesis by a recent MIT graduate.
“[…] This thesis investigates user-generated censorship: an emergent mode of intervention by which users strategically manipulate social media to suppress speech. It shows that the tools designed to help make information more available have been repurposed and reversed to make it less available. Case studies reveal that these platforms, far from being neutral pipes through which information merely travels, are in fact contingent sociotechnical systems upon and through which users effect their politics through the power of algorithms. […]”
Tomorrow, Advocate General Jääskinen is to release his Opinion in the much-debated Google v Spain (aka Right to be Forgotten) case. According to Google’s adversaries, search engines are to be considered data controllers under the Data Protection Directive for the personal data (on the websites) they refer to, and are therefore (under certain circumstances) liable to remove links (for more info, see my previous blogpost on this case).
An often invoked counter-argument to liability assertions by Internet intermediaries relates to the automated nature of their processing activities. In other words, intermediaries often argue that they are merely providing a neutral service: content-agnostic and fully automated. After all, it is claimed, decisions are made by algorithms and no human eyes actually ‘see’ the information. In 2010 the CJEU seemed to acknowledge such an interpretation, stating that services that only perform activities ‘of a mere technical, automatic and passive nature … having neither knowledge of nor control over the information which is transmitted or stored’ should be exempted (Louis Vuitton v Google, C-236/08). In the UK, Justice Eady has also ruled that automation precludes intentionality (in that case, Google was not held liable for defamatory snippets it displayed in its search results).

The argument that automation equals neutrality, however, seems to be falling apart. Being a mere (automated) ‘organizing agent’ does not necessarily entail neutrality, nor does it necessarily validate the exemption from liability. After all, as U. Kohl aptly describes: “both organization and automation require human judgment and thus have assumptions, values and goals embedded into them.” Put differently, the use of algorithms does not imply neutrality. Instead of looking at the automated nature of an intermediary service provider, one should look at how it is designed. Such an interpretation was also followed by AG Jääskinen in L’Oreal v eBay. In his Opinion, he clearly stated that neutrality does not even constitute the right test to decide on the exemption from liability. Instead, one should look at the type of activities of the relevant service provider. The liability of a search engine, for example, may depend on whether it simply refers to information, displays snippets or autosuggests the information.
Particularly with regard to defamatory content, a considerable number of – widely diverging – cases has emerged over the last few years. A number of French courts have ruled search engines liable for auto-suggesting information. Very recently, the German Federal Court of Justice also overturned two decisions, requiring Google to block defamatory auto-complete search suggestions. In a 2011 Italian judgment, the court followed the plaintiff’s claim that Google should be considered to have ‘created’ the auto-complete terms. Even if search engines are not considered to create the actual terms, they still make a deliberate decision to adopt autocomplete functionality and to design the underlying algorithms. Moreover, it is particularly hard to rely on the neutrality argument (based on automation and alleged content-agnosticism) after the intermediary has received a notification (infra).
Put briefly, we see a crumbling of the ‘automation = neutrality = exemption-from-liability’ argument. Particularly taking into account AG Jääskinen’s previous statements in L’Oreal v eBay (specifically his concern about the argument that “in order to establish whether the liability of a paid internet referencing service provider may be limited under Article 14 of Directive 2000/31, it is necessary to examine whether the role played by that service provider is neutral in the sense that its conduct is merely technical, automatic and passive, pointing to a lack of knowledge or control of the data which it stores”), it is most likely that he will come to a similar conclusion in his Opinion in the Google v Spain case, which should – normally – be released tomorrow.
“SB 568 prohibits websites from marketing a product or service to a minor, if the minor cannot legally purchase the product or participate in the service in the State of California. This prohibition applies to all sites and apps “directed to minors” or if the operators “has actual knowledge that a minor is using its” service. […]
California SB 501, meanwhile, would require websites to remove personally identifiable information about minors upon the request of the minor OR the parent within 96 hours of the request.”
One week ago, the so-called ‘Google v Spain’ or ‘Right to be Forgotten’ case was heard before the Court of Justice of the EU (C-131/12).
Put briefly, Google was ordered by a Spanish court to remove certain search results – relating to Mr. Carlos José – from its index. The contentious search results linked back to Mr. José’s insolvency proceedings, published in a newspaper in 1998. Google appealed and the Audiencia Nacional referred the matter to the ECJ, requesting a preliminary ruling on the three main questions in this case: (1) Territoriality: do the contentious facts fall within the territorial scope of application (art.4) of the Data Protection Directive (DPD)? (2) Can Google – as a search engine (SE) – be considered a (personal) data controller? (3) What is the extent of the ‘Right to be Forgotten’, aka the ‘right to erasure and blocking of data’? Currently there are over 200 similar cases pending before Spanish courts (and likely in other EU jurisdictions as well).
SUMMARY OF MAIN ARGUMENTS:
Google
(1) Territorial Scope
▪ The contentious activity at stake here is not ‘carried out in the context of’ the activities of Google Spain – an EU establishment of Google Inc. (Article 4(1)(a) DPD). Google Spain is not involved in the SE activities itself (Google Inc is). Its only relevant activity with regard to the SE is providing advertising space, and the behavioural advertising model is not based on the indexed content.
(2) Google as Controller
▪ Google collects, processes and indexes data indiscriminately. It is ignorant of whether or not the content of the webpages being indexed contains personal data. There is an obvious lack of intent, which distinguishes this case from the Lindqvist and Satamedia cases.
▪ The decision to take content down should be taken by the party that is best placed to do so. Given that Google does not control any of the data held on the websites it indexes, nor has any intent to do so, the publisher of the original webpage is best placed to decide.
▪ Even if one were to consider Google to be processing personal data, the company still argues it is not a data controller because: (a) there is no intent to process personal data; (b) Google does not verify whether indexed data is personal or not; (c) the publisher has final and continuing control; (d) if the publisher removes the (personal) data, Google does so as well; (e) Google cannot verify the legitimacy of the personal data processing; (f) Google only plays a secondary/accessory role in disseminating the information; (g) articles 12-15 eCommerce Directive; (h) the Article 29 Working Party’s Opinion 1/2008 (p13-14) endorses SEs’ role as intermediaries in situations such as the one at hand; (i) Google’s role can be compared to that of telecom operators, who are stricto sensu also processing personal data but are not liable under the DPD either – they are mere intermediaries transferring data.
(3) Right to be Forgotten/Erasure v-a-v Search Engines
▪ On the question whether a SE can be asked to remove links directly, without the data subject first having to go to the original publisher, Google raises the Freedom of Expression (FoE) and Freedom of Information (FoI) flag: (1) the original publishers will be deprived of an important channel of communication and (2) Internet users in general will have access to less information. The responsibility of publishers should not be shifted onto the shoulders of intermediaries such as search engines. This would also violate the proportionality principle, Google argues: (a) obliging Google to erase search results does not prevent the information from appearing elsewhere; (b) Google cannot assess the legality of the processing; (c) Google can only remove the link to the webpage entirely (it cannot just anonymise certain bits of the webpage), which would constitute overkill as the webpages will usually contain much more information than just the contentious personal data.
Plaintiff (+ allies: Spanish & Austrian Government, European Commission)
(1) Territorial Scope
▪ Whether the DPD applies to the issue at stake turns on whether Google Spain’s activities can be sufficiently distinguished from those of Google Inc. This is clearly not the case, according to the plaintiff (and its allies): Google Spain’s activity is not merely ancillary, but constitutes an integral part of Google Inc’s activities, carried out for a particular jurisdiction.
(2) Google as Controller
▪ The DPD was written before the widespread use of the Internet, and of SEs in particular. It should, therefore, be applied creatively to Google. According to the plaintiff, Google is a data controller as it actively collects and processes personal data (referring to art.2(b) DPD ‘dissemination or otherwise making available’ and to Lindqvist: any processing of data – even if the data is already published – constitutes personal data processing within the scope of the DPD). Its activity constitutes a separate ‘publication’ from the original one.
▪ Google can even be considered a data controller v-a-v the content of the webpages it indexes, because: (a) it determines the means (algorithms, web spiders, etc.) and purpose (include information in search results) of processing; (b) Google actively directs and controls the flow of information and its actions cannot be compared to bidirectional traffic management of a telecom operator. In other words, they are not ‘neutral’ intermediaries. Google provides an added value service, which it cannot provide without acting autonomously.
▪ It was also argued that the criteria of art.2(a) and (b) are ‘objective’ in nature: the intent of the ‘controller’ is not relevant. Hence SEs are data controllers as they are de facto processing personal data.
▪ Google – allegedly – also has a separate responsibility because it makes information much more widely available, can provide a complete picture of an individual, and has its own specific (commercial) purposes. Google does not “control” the initial uploading of content, but it does control its aggregation and subsequent dissemination. The responsibility of SEs is distinct and separate.
(3) Right to be Forgotten/Erasure v-a-v Search Engines
▪ It is stressed that Google is not asked to conduct a priori monitoring of all the content it indexes. The plaintiff (and its allies) rather advocate a specific notice-and-takedown regime, similar to the one for copyright claims. Only when a specific complaint is made regarding a specific piece of content should Google remove the search results.
▪ On the risk to FoI and FoE: when different rights conflict, a balance should be struck. Neither one should automatically prevail.
After this first round of arguments, the court asked several questions to all parties involved. Most of them related to the practical implications of a potential obligation on Google to remove search results.
The Advocate General will finish his Opinion by June 25th, and a final judgment should follow soon thereafter.
Together with some colleagues at ICRI, I will soon publish a working paper making a more thorough legal analysis of all the issues at stake in this particular case. To be Continued…
“The Iranian government, which presides over one of the most educated and connected populations in the Middle East, is building an Internet all its own. Observers expect it will be fully operational as soon as next year. Iran’s so-called national or halal Internet will be a kind of anti-Internet—a self-contained loop within Iran’s borders featuring only regime-approved Iranian sites, and cut off from the World Wide Web.”