Intermediary Liability – Automation = Neutrality = Exempted?

Tomorrow, Advocate General Jääskinen is due to release his Opinion in the much-debated Google v Spain (aka Right to be Forgotten) case. According to Google’s adversaries, search engines are to be considered data controllers under the Data Protection Directive with regard to the personal data (on the websites) they refer to, and may therefore (under certain circumstances) be obliged to remove links (for more info, see my previous blogpost on this case).

An often invoked counter-argument to liability assertions against Internet intermediaries relates to the automated nature of their processing activities. In other words, intermediaries often argue that they merely provide a neutral service: content-agnostic and fully automated. After all, it is claimed, decisions are made by algorithms and no human eyes actually ‘see’ the information. In 2010 the CJEU seemed to acknowledge such an interpretation, stating that services that only perform activities ‘of a mere technical, automatic and passive nature … having neither knowledge of nor control over the information which is transmitted or stored’ should be exempted (Louis Vuitton v Google, C-236/08). In the UK, Justice Eady has also ruled that automation precludes intentionality (in that case, Google was not held liable for defamatory snippets it displayed in its search results).

The argument that automation equals neutrality, however, seems to be falling apart. Being a mere (automated) ‘organizing agent’ does not necessarily entail neutrality, nor does it necessarily validate an exemption from liability. After all, as U. Kohl aptly describes: “both organization and automation require human judgment and thus have assumptions, values and goals embedded into them.” Put differently, the use of algorithms does not imply neutrality. Instead of looking at the automated nature of an intermediary’s service, one should look at how it is designed. This interpretation was also followed by AG Jääskinen in L’Oreal v eBay. In his Opinion, he clearly stated that neutrality does not even constitute the right test for deciding on the exemption from liability. Instead, one should look at the type of activities of the relevant service provider. The liability of a search engine, for example, may depend on whether it simply refers to information, displays snippets or auto-suggests the information.
Particularly with regard to defamatory content, a considerable number of widely diverging cases has emerged over the last few years. A number of French courts have ruled search engines liable for auto-suggesting information. Very recently, the German Federal Court of Justice also overturned two decisions, requiring Google to block defamatory auto-complete search suggestions. In a 2011 Italian judgment, the court followed the plaintiff’s claim that Google should be considered to have ‘created’ the auto-complete terms. Even if search engines are not considered to create the terms themselves, they still make a deliberate decision to adopt auto-complete functionality and to design the underlying algorithms. Moreover, it is particularly hard to rely on the neutrality argument (based on automation and alleged content-agnosticism) after the intermediary has received a notification (infra).

Put briefly, we see a crumbling of the ‘automation = neutrality = exemption from liability’ argument. Particularly taking into account AG Jääskinen’s earlier statements in L’Oreal v eBay (in which he specifically expressed concern about the argument that “in order to establish whether the liability of a paid internet referencing service provider may be limited under Article 14 of Directive 2000/31, it is necessary to examine whether the role played by that service provider is neutral in the sense that its conduct is merely technical, automatic and passive, pointing to a lack of knowledge or control of the data which it stores”), it is most likely that he will come to a similar conclusion in his Opinion in the Google v Spain case, which should normally be released tomorrow.


BEREC’s findings on net neutrality

BEREC’s conclusions are that, in order to safeguard net neutrality, competition between operators should rely on effective transparency and the possibility for end-users to easily switch between service providers. National Regulatory Authorities (NRAs), as well as end-users, should also be able to monitor the performance of the Internet access service and of the applications used via that Internet access service.

via EDRi.

EU Court Of Justice Says Software Functionality Is Not Subject To Copyright

In a relatively short ruling, the court points out that, while software itself may be covered by copyright, “the ideas and principles which underlie any element of a computer program, including those which underlie its interfaces, are not protected by copyright.” Basically, the EUCJ properly recognized the difference between protecting the idea (not copyrightable) and the expression (copyrightable). The court points out that the actual code can still be covered, but the features generated out of that code are a different story.

via Techdirt.

ECJ rules in favor of Net Freedom and against censorship

Judgment in Case C-360/10 Belgische Vereniging van Auteurs, Componisten en Uitgevers (SABAM) v Netlog NV: 

The owner of an online social network cannot be obliged to install a general filtering system, covering all its users, in order to prevent the unlawful use of musical and audio-visual work

Such an obligation would not respect the prohibition on imposing a general monitoring obligation on that provider, nor the requirement that a fair balance be struck between the protection of copyright, on the one hand, and the freedom to conduct business, the right to protection of personal data and the freedom to receive or impart information, on the other.

Judgement day for German privacy and media freedom cases

Tomorrow, human rights judges will deliver judgments in two complaints which raise important media freedom and privacy issues in Germany. The European Court of Human Rights will deliver two Grand Chamber judgments, in the cases of Axel Springer AG v. Germany (application no. 39954/08) and Von Hannover v. Germany (no. 2) (application nos. 40660/08 and 60641/08), at a public hearing in Strasbourg on Tuesday 7 February at 10h CET. Both cases concern the publication in the media of articles and, in the second case, of photos depicting the private life of well-known people.

Read: Human Rights Europe.

Irish Data Protection Authority on Facebook

  • Facebook’s real-name policy was seen as having “substantial benefits in protecting the people who use Facebook,” and the DPC said the social network had “valid and justified” reasons for prohibiting pseudonyms;
  • Information collected through the use of social plug-ins is not associated with individuals (whether Facebook users or not);
  • Facebook’s tag suggestion tool does not go against DP regulation (though could be more transparent);
  • Facebook’s advertisement-based business model is legitimate;
  • It is not possible for third-party developers to repeatedly access personal data;

See: Facebook Gets Passing Grade From Irish Agency Audit.