User-Generated Censorship: Manipulating the Maps Of Social Media – MIT Comparative Media Studies/Writing

An interesting thesis by a recent MIT graduate.


“[…] This thesis investigates user-generated censorship: an emergent mode of intervention by which users strategically manipulate social media to suppress speech. It shows that the tools designed to help make information more available have been repurposed and reversed to make it less available. Case studies reveal that these platforms, far from being neutral pipes through which information merely travels, are in fact contingent sociotechnical systems upon and through which users effect their politics through the power of algorithms. […]”

Source: User-Generated Censorship: Manipulating the Maps Of Social Media.


Intermediary Liability – Automation = Neutrality = Exempted?

Tomorrow, Advocate General Jääskinen is to release his Opinion in the much-debated Google v Spain (aka Right to be Forgotten) case. According to Google’s adversaries, search engines are to be considered data controllers under the Data Protection Directive for the personal data (on the websites) they refer to and are therefore, under certain circumstances, liable to remove links (for more info, see my previous blog post on this case).

An often-invoked counter-argument to liability assertions by internet intermediaries relates to the automated nature of their processing activities. In other words, intermediaries often argue that they are merely providing a neutral service: content-agnostic and fully automated. After all, it is claimed, decisions are made by algorithms and no human eyes actually ‘see’ the information. In 2010 the CJEU seemed to acknowledge such an interpretation, stating that services that only perform activities ‘of a mere technical, automatic and passive nature … having neither knowledge of nor control over the information which is transmitted or stored’ should be exempted (Louis Vuitton v Google, C-236/08). In the UK, Justice Eady has likewise ruled that automation precludes intentionality (in that case, Google was not held liable for defamatory snippets displayed in its search results).

The argument that automation equals neutrality, however, seems to be falling apart. Being a mere (automated) ‘organizing agent’ does not necessarily entail neutrality, nor does it necessarily justify an exemption from liability. After all, as U. Kohl aptly puts it: “both organization and automation require human judgment and thus have assumptions, values and goals embedded into them.” Put differently, the use of algorithms does not imply neutrality. Instead of looking at the automated nature of an intermediary’s service, one should look at how it is designed. This interpretation was also followed by AG Jääskinen in L’Oréal v eBay. In his Opinion, he clearly stated that neutrality is not even the right test for deciding on the exemption from liability; instead, one should look at the type of activities of the relevant service provider. The liability of a search engine, for example, may depend on whether it simply refers to information, displays snippets or auto-suggests search terms.
With regard to defamatory content in particular, a considerable number of widely diverging cases has emerged over the last few years. Several French courts have ruled search engines liable for auto-suggesting information. Very recently, the German Federal Court of Justice overturned two decisions, requiring Google to block defamatory auto-complete search suggestions. In a 2011 Italian judgment, the court followed the plaintiff’s claim that Google should be considered to have ‘created’ the auto-complete terms. Even if a search engine is not considered to create the terms themselves, it still makes a deliberate decision to adopt auto-complete functionality and designs the underlying algorithms. Moreover, it is particularly hard to rely on the neutrality argument (based on automation and alleged content-agnosticism) once the intermediary has received a notification (infra).

Put briefly, we see a crumbling of the ‘automation = neutrality = exemption from liability’ argument. Particularly in light of AG Jääskinen’s previous statements in L’Oréal v eBay (where he specifically expressed concern about the argument that “in order to establish whether the liability of a paid internet referencing service provider may be limited under Article 14 of Directive 2000/31, it is necessary to examine whether the role played by that service provider is neutral in the sense that its conduct is merely technical, automatic and passive, pointing to a lack of knowledge or control of the data which it stores”), it is most likely that he will come to a similar conclusion in his Opinion in the Google v Spain case, which will normally be released tomorrow.

BEREC’s findings on net neutrality

BEREC concludes that, in order to safeguard net neutrality, competition between operators should rely on effective transparency and on the possibility for end-users to easily switch between service providers. National Regulatory Authorities (NRAs), as well as end-users, should also be able to monitor the performance of the Internet access service and of the applications used via that service.

via EDRi.

The Dutch Adopt Net Neutrality Laws, Lets All Follow Suit

Yesterday, the Netherlands became the first country in Europe to adopt laws that protect net neutrality. The rest of Europe, and indeed most of the world, needs to follow suit before we sleepwalk into letting corporations use their deep pockets to gain an unfair advantage online.

via The Dutch Adopt Net Neutrality Laws, Lets All Follow Suit.

India restricts access to online content more subtly than China

The Internet is being censored more subtly in India than in China or elsewhere. Although more websites may be blocked in the latter, users at least get the message that access is being restricted. In India, however, people just get an error message. This keeps most Indian Internet users in the dark: many don’t even realize to what extent access to information is being restricted, which in turn blocks a fair and open public debate on the issue.

See VIDEO on Al Jazeera.