Balancing in the GDPR: legitimate interests v. right to object

[Note: This post was originally published on the CiTiP Blog]

Balancing exercises permeate data protection law. This post investigates the interaction between two core manifestations of such balancing in the GDPR: the last lawful ground (Art.6(1)f) and the right to object (Art.21(1)).

Balancing in Article 6(1): From lawfulness to guiding principle?

Balancing exercises play a pivotal role in the General Data Protection Regulation (GDPR). They are implied in concepts such as fairness and proportionality that permeate the GDPR. The centrepiece of balancing exercises within the Regulation is to be found in Article 6(1)f. This last lawful ground permits processing of personal data whenever ‘necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject’. In other words, this provision weighs the legitimate interests of the controller and those of third parties against the interests, fundamental rights and freedoms of the data subject.

The balancing exercise put forward in Article 6(1)f arguably transcends merely being one lawful ground among others. Indeed, the four preceding grounds (necessary for (b) the performance of a contract, (c) compliance with a legal obligation, (d) protecting the data subject’s vital interests, (e) tasks carried out in the public interest) could be qualified as variations on a theme. They are simply situations where the interest in processing prevails by default. In light of the fairness principle, one may even claim that the first lawful ground, consent, is subject to some level of balancing in light of Article 6(1)f. The very act of consenting can be qualified as a strong indicator that a fair balance exists. This is particularly true in light of the new Article 7, which inter alia puts forward the unconditional ability to withdraw one’s consent and guards against all-or-nothing clauses.

The practical relevance of this consideration is that the lawfulness of any processing operation, regardless of its lawful ground, will have to be assessed in light of Article 6(1)f’s balancing exercise. The Art. 6(1)f balancing test will inform – not determine – the validity of the other lawful grounds. In sum, the balancing test in Article 6(1)f is the manifestation of the fairness principle (Article 5(1)a) in the form of a lawful ground. As such it can be used as a proxy for evaluating the validity of any of the lawful grounds.

Balancing and the right to object

The balancing exercise in Article 6(1)f can be qualified as an ex ante obligation: it needs to be complied with before the processing operation begins. In practice, this balancing exercise will be defined unilaterally by the controller. Enter data subject rights. The rights to object and to erasure install ex post rights, effectively empowering data subjects to challenge the balance put forward by the controller. The right to object (Art.21) is particularly interesting in this regard, because it defines its very own balancing exercise. When a data subject objects ‘on grounds relating to his or her particular situation, […] the controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject […]’. As such, the right to object may be one of the clearest manifestations of the fairness principle in the shape of a data subject right.

Despite the striking resemblance between the balancing exercises in Articles 6(1)f and 21(1), there are some key differences. The right to object explicitly requires the data subject to put forward grounds relating to his/her particular situation, but at the same time puts the onus on the controller to establish overriding ‘compelling legitimate grounds’. Compared to its predecessor (Art.14 of Directive 95/46), the burden of proof seems to have settled halfway: both controller and data subject carry one, though the controller’s is more onerous. In light of the accountability principle in Article 5(2), the burden of proof for establishing a balance under Art.6(1)f a priori lies with the controller alone. But, to all intents and purposes, a data subject who wishes to effectively challenge this balance will have to substantiate that claim.

Importantly, Art.21(1) does not include a reference to third party interests, which makes it seem narrower than Art.6(1)f’s balance. Reading more closely, however, the provision merely requires the controller to demonstrate overriding ‘compelling legitimate grounds’. The language does not constrain these grounds to be of the controller alone and could therefore encompass much more than only ‘the interests pursued by the controller or by a third party’ in Art.6(1)f. This would render it harder for data subjects to exercise their right to object than to challenge the lawfulness of the processing itself. This puzzling conclusion might partially be countered by the fact that Article 21(1) does not require the rights and freedoms of the data subject to be fundamental (contrary to Art.6(1)f).

Things get even more confusing when reading recital 69, which conflicts with Art.21(1) on three occasions, referring to (a) the controller’s own legitimate interests only; (b) ‘interests’ instead of ‘grounds’, the former being narrower; and (c) data subjects’ fundamental rights and freedoms. The end-result is that it is unclear how the balancing exercise under Art.21(1) should be performed exactly, leaving data subjects less empowered.

In conclusion, one may wonder what the added value of Article 21(1)’s balancing exercise is in the first place. Firstly, data subjects are free – at least in theory – to challenge lawful grounds (including Art.6(1)f) already anyway. Secondly, a closer look at Art.21(1) suggests that its balancing exercise is less likely to benefit data subjects than the one in Article 6(1)f. In light of data protection law’s rationale – routinely confirmed by the CJEU – the right to object should be interpreted from the perspective of ensuring a high and effective level of protection. Such a systematic reading implies the right to object’s balancing exercise should be mirrored against (not equated to) the one in Art.6(1)f. The right to object can still be successful even though the processing is lawful stricto sensu under Article 6(1)f. The main added value of Art.21(1) therefore appears to be (a) allocating the burden of proof; and (b) empowering data subjects to challenge the status quo.

The Personal Data Equaliser

[Note: This post was originally published on the CiTiP Blog]

The concept of personal data – key in determining data protection law’s material scope of application – may seem pretty straightforward in the abstract. In practice, particularly when assessing the applicability of specific data subject rights, things get a lot murkier.

Personal Data – a Disharmonious Concept

It is a truism to say that personal data constitutes data protection law’s central building block. Indeed, personal data is the key factor in determining the framework’s applicability. Directive 95/46 – as well as the upcoming General Data Protection Regulation (GDPR) – is pretty concise in defining the concept: “any information relating to an identified or identifiable natural person.” No a priori distinction is made between different sources, types, formats, and so on. Only the contentious sub-category of ‘sensitive data’ explicitly enjoys special status. Personal data’s incredibly wide definition has been documented and criticised widely. Data protection law’s ‘information-agnosticism’ is also an important element separating it from the general right to privacy, the latter primarily covering more intimate data (or elements/activities) affecting an individual’s private sphere.

In reality, not all (personal) data are equal. Many legal texts specify or differentiate particular kinds of personal data because of the heightened risk to the individuals’ rights, interests or freedoms. Even though the data protection framework explicitly refers to some sub-categories of data – sensitive data, (online) identifiers, pseudonyms, traffic and location data – it appears unfeasible to devise an overall taxonomy for personal data. The many attempts that have been made – in privacy policies, by consultants and academics in different fields – fail to offer a satisfactory and comprehensive overview.

The few data categories explicitly/implicitly appearing throughout the GDPR are useful indicators for assessing the extent of rights and obligations. However, the applicability of data subject rights – and the right to erasure in particular – cannot be reduced to a mere qualification of the underlying data in one of these categories. Google’s and Facebook’s privacy policies differentiate personal data on the basis of its origin, its form and/or its function. But data can also be differentiated on the basis of its nature, sensitivity or visibility/obscurity. To complicate things even more, predefined categories often overlap in practice, further rendering nonsensical any attempts at straightjacketing personal data into predefined categories.

Still, the category of data will often impact the exercise of data subject rights. The rights to erasure and to object illustrate this quite well. Different data-types will have a different impact on the data subject and the balancing exercise generally accompanying a request to erase/object. Sensitive data may be the most obvious example of a data category that will generally tip the balance in favour of the data subject. In short, there is clearly some merit in qualifying the relevant personal data in one way or another. In light of the concept’s incredible heterogeneity however, attempts at developing a comprehensive ‘personal data taxonomy’ are doomed from the start.

Personal Data Equaliser

Instead of trying to come up with a data taxonomy – or even a more modest list of specific data categories – an alternative can be envisaged. From the perspective of exercising one’s data subject rights, it makes more sense to identify relevant variables on a case-by-case basis. These may relate to the data itself (e.g. accuracy, public interest, sensitivity, format), the source (e.g. voluntarily shared, inferred), the data subject (e.g. role in public life, child), time, context, etc. Each of these ‘variables’ – some of which correspond with categories in obsolete data taxonomies – should be seen as non-binary continuums.

By analogy, one could think of an audio equaliser, ubiquitous in eighties’ stereo sound-systems. Every slider represents a variable, impacting – to a greater or lesser extent – what comes out of the speakers. Similarly to its audio counterpart, the ‘personal data equaliser’ comes with certain pre-sets: for certain situations or ‘data types’, there will be pre-defined defaults. Depending on the circumstances, certain sliders will be hardwired (e.g. format of the data, controller), whereas others might still be tweakable (e.g. visibility/obscurity). Crucially, determining the configuration of parameters is only possible a posteriori, when evaluating the applicability of data subjects’ rights in a particular case.
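To make the metaphor concrete, the equaliser described above can be expressed as a small data structure. The following is a purely illustrative sketch: the class names, slider names, and preset values are all hypothetical and carry no legal weight.

```python
from dataclasses import dataclass, field

@dataclass
class Slider:
    """One variable of the 'personal data equaliser': a non-binary continuum."""
    name: str
    value: float = 0.5       # position on the continuum, 0.0 to 1.0
    hardwired: bool = False  # fixed by the circumstances (e.g. format, controller)

    def set(self, value: float) -> None:
        if self.hardwired:
            raise ValueError(f"slider '{self.name}' is fixed in this context")
        self.value = min(1.0, max(0.0, value))

@dataclass
class PersonalDataEqualiser:
    sliders: dict = field(default_factory=dict)

    @classmethod
    def preset(cls, situation: str) -> "PersonalDataEqualiser":
        # Pre-defined defaults for a situation; the numbers are invented.
        if situation == "public_register":
            return cls({
                "sensitivity": Slider("sensitivity", 0.2),
                "public_interest": Slider("public_interest", 0.8, hardwired=True),
                "visibility": Slider("visibility", 0.9),  # tweakable, e.g. via delisting
            })
        raise KeyError(situation)

# A posteriori assessment in a particular case: start from the preset,
# then adjust the sliders that the circumstances leave open.
eq = PersonalDataEqualiser.preset("public_register")
eq.sliders["visibility"].set(0.3)  # e.g. after a successful delisting request
```

The point of the sketch is the design choice it encodes: every variable is a continuum with a default, some positions are fixed by the circumstances, and the final configuration can only be read off case by case.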

The Data Equaliser acknowledges the complexity of today’s information processing landscape. It recognises the impossibility of a priori determining the potential implications on an individual of one type of personal data or another. Today’s vast – and quickly expanding – data processing eco-system transforms seemingly trivial and/or anonymous data into personal data and vice versa. Unsurprisingly, determining the reach of data protection rights (notably, the right to erasure) is a tough exercise in the abstract. Though helpful indicators, the personal data categories defined by the legislator do not offer quick-and-easy answers either. The idea behind the ‘personal data equaliser’ recognises the messiness of data and the importance of looking at the particular circumstances of each individual case. It acknowledges the fluidity of ‘personal data’, depending on time and context.

Looking ahead, attempts at bringing more structure to the concept of personal data should focus on identifying potential variables rather than types of personal data. Such a functional approach will be much more valuable to the interpretation of data subject rights in practice.

French Court ordering Selfie-Takedown

From the Maw-Law Blog: “On December 12, 2015, Brahim Zaibat, a dancer and choreographer, posted on social media a selfie he had taken two years ago, showing him in an airplane, just a seat behind the one where Jean-Marie Le Pen, the honorary president of the French National Front, had fallen asleep. …” Read more at:

Irish Case highlights different tiers of responsibilities regarding RtbF implementation

The Irish Times recently reported that: “Google has refused to remove search results that show the names of newly naturalised Irish citizens in the State’s official gazette, because of the Government’s ‘ongoing choice’ to make the information public.”

Indeed, after the individuals approached the Irish DP Commissioner, the latter stated that the publication of information is mandated by a 1956 law.

Shortly after the Irish Times reported about these issues, the official government website publishing the names made a technical change, in order to prevent the listing by Search Engines (while still maintaining the actual content).

The overlap with both the Google Spain Case and the still pending Manni Case is striking and demonstrates a broader issue. Though search engines clearly have some responsibility in the delisting process, this seems the right time for regulators and official public registers to proactively reassess their – often decades-old – publication/divulgation policies.

The divulgation or removal of information should not be seen as a binary, nor can responsibilities in this regard simply be imposed on one entity exclusively.

CJEU is asked to rule on the ‘Right to be Forgotten’ again

The Italian Supreme Court recently asked the CJEU for a preliminary ruling on two questions regarding the ‘right to be forgotten’.

[Disclaimer: this information is loosely translated from official documents published by the Dutch Ministry of Foreign Affairs. The original request can be found here [it]. The CJEU’s documents folder (still empty at the time of writing) can be found here.]

Facts & Procedure (Case C-398/15 – Manni)

The original plaintiff’s (Salvatore Manni’s) business had gone bankrupt in 1992. This was recorded in a public Company Register, managed by the defendant (Camera di commercio di Lecce). The plaintiff argued that he (in particular his business of selling houses) suffered damages, and requested the defendant to anonymise his name or restrict access to the register. The defendant stated that the Companies Register is a public database with the primary function of informing (on request) about relevant information of companies. The case escalated all the way to the Italian Supreme Court (Corte Suprema di Cassazione), which referred two questions to the CJEU.

Questions referred

The Italian Court essentially wonders whether information legally consigned to (and made public by) the defendant, can be erased, anonymised or access-restricted after a certain time. The Court does point out the importance of the public Register (for legal certainty). Referring to the Google Spain Case (C-131/12), the Court asks not whether the information should be erased from the Register, but whether limits should be put as to the (further) use of this public information by third parties.

  1. Does Article 6(1)(e) of the Data Protection Directive prevail over the system of disclosure through the company register mandated by Directive 68/151/EEC and corresponding national legislation, to the extent that the latter requires that anyone should have unrestricted access to the personal data in the register?
  2. Does Article 3 of Directive 68/151/EEC allow, by way of exception to the rule that the Company Register stores public information for an indeterminate period and can be consulted by anyone, the information to no longer be made ‘public’, though still available to a specific group, as decided on a case-by-case basis by the Register’s manager?


The underlying facts in this ‘Manni’ Case are strikingly similar to the ones in the Google Spain Case. Instead of focusing on the third party, however, the CJEU is now asked to evaluate the obligations of the original publisher. In Google Spain, it was already decided (by the national DPA) that the original publication could not be touched before the case even reached the CJEU. In the Manni Case, the original source also has a legal obligation to publish. Yet, it is not asked to remove personal data from the source altogether, only whether the source can be made to render the data less accessible. This raises very interesting questions – left unanswered in Google Spain – as to the obligations on the shoulders of original publishers and different degrees of publicity.

To be continued…!

Forget, Erase & Delist – But Don’t Forget The Broader Issue

Current State of Affairs

A new year, a new CPDP Conference (Computers, Privacy & Data Protection, 21-23 January 2015). In the past 12 months we have seen privacy and data protection issues taking a much more prominent role in many different internet policy discussions. One of the key examples in this regard is the so-called Google Spain case by the Court of Justice of the EU (CJEU). Acknowledging the right of individuals to ask search engines to delist certain name-based search results, the ruling sent shockwaves through internet policy circles. If anything, the case has made us all re-think the balance between different fundamental rights and interests, the allegedly ‘neutral’ role of search engines and the extra-territorial reach of local regulations. Besides the unprecedented public debate and media coverage, the CJEU’s decision also resulted in the Article 29 Working Party publishing interpretation guidelines and Google setting up its very own ‘Advisory Council’ which held public hearings across the EU.

Unsurprisingly, the Google Spain ruling is usually talked about against the backdrop of the so-called ‘Right to be Forgotten’. This ‘right’ has been criticised fiercely by freedom of expression advocates and is emblematic of the fissure between the US and EU regarding online privacy policy-making. Nonetheless, there is at least one point all sides seem to agree on: the terminology is very problematic. Hence, it is great to see that the latest draft of the EU’s ‘Proposal for a Data Protection Regulation’ simply refers to the ‘right to erasure’ (Article 17). Still, the provision has been attacked for being unclear on both its scope and how it is to be implemented. But can we really be that upset about this? Isn’t the exact goal of legal norms to put forward an abstract – and especially future-proof – principle that should be interpreted differently, depending on the relevant facts and context of each case? Shouldn’t we rather look at the judicial (courts) and executive (e.g. Data Protection Authorities) branches to help make sense of the rules put forward by the legislator?

Some Thoughts on Balancing

Requesting the removal of certain information (on whatever legal ground) will always generate a conflict of interests and rights. In the context of the ‘right to be forgotten/erasure’ debate, the most recurring conflicts relate to either the right to freedom of expression (Article 11, CFREU) or economic freedoms (Article 16, CFREU). The ‘conflict’ between privacy and freedom of expression interests, however, is immensely inflated (to the great benefit of the big data industry).

As most readers will know, Europe has a rich legal tradition of balancing the two rights (most notably in the case law of the European Court of Human Rights), with clear criteria and safeguards for restricting either right. As a side note, one can only applaud the CJEU in Google Spain for clearly distinguishing the responsibilities and protections of actual speakers (i.e., newspapers) from third parties (i.e., search engines), each subject to a different balancing test.

Additionally, if you look at Google’s transparency report, it becomes clear that the majority of delisting requests does not relate to legitimate news reporting websites in the first place. Instead, most requestors seem to be average Joes concerned about how websites gratuitously show information about them on the basis of a mere name search. In short, the ‘right to be delisted’ is about online obscurity, not about eradicating information from the internet altogether.

More importantly, we should not be swayed by ‘right to be forgotten’ rhetoric professing that it constitutes a fundamental threat to freedom of expression online. The right to erasure has a much more important – and largely understated – goal: empowering data subjects with regard to their data being harvested and exploited ‘behind the scenes’ (e.g. for commercial/political profiling, digital market manipulation, etc.). No conflict with the right to freedom of expression exists in these contexts. Instead, these situations usually require a balancing exercise between individuals’ privacy and data protection interests on the one hand and the data controller’s economic freedoms (Article 16, CFREU) on the other. With regard to the latter, the Google Spain case made clear that such freedoms weigh considerably less when compared to freedom of expression interests. Still, neither fundamental right/interest can be discarded without giving it due regard in the balancing exercise first.


The right to erasure undoubtedly results in a conflict of rights/interests that needs to be solved. Whereas many cases will be very straightforward, a considerable portion will require a more thorough balancing exercise. We should be wary, however, not to be blinded by the rhetoric of freedom of expression absolutists, libertarians and corporate lobbyists defending their own agendas. We should not want to reinvent the wheel either. Balancing exercises and the proportionality principle are deeply embedded in the European legal framework. Applying them to the issue(s) at hand might not always be straightforward. But is it really asking too much from entities that, ultimately, benefit from using personal data?

Finally, the whole debate on the ‘Right to be Forgotten’, the ‘Right to Erasure’ and the Google Spain case seems to be far from finished. I hope you join me in congratulating Computers, Privacy & Data Protection (CPDP) for having provided a fertile platform (books and panels) for discussing these issues.

*This BlogPost originally appeared as an Op-Ed on the Internet Policy Review*

R2E – R2O – RtbF: What’s in a Name?

– Excerpt from Draft Article –

The Google Spain case has definitely added fuel to the fire regarding the so-called ‘Right to be Forgotten’ debate. Much of the discourse, however, has mixed up several related concepts. Hence, it is important (once again) to distinguish and clarify three important notions: the ‘right to be forgotten’, the ‘right to object’ and the ‘right to erasure’. First of all, the so-called ‘right to be forgotten’ is not mentioned in the Directive (and it has been removed from the latest DPR draft as well), but is rather used as a catchphrase in the rhetoric of different sides in the debate (pro and contra). In fact, the ‘right to be forgotten’ can be described as an umbrella term, aiming to encapsulate different important rights. Its very name already causes confusion by suggesting an obligation on third parties to ‘forget’. Instead, it is a rather clumsy translation into law of a broader policy goal. In this regard, it can be traced back to the French droit à l’oubli. This ‘right’ has traditionally been applied in situations where an individual was confronted with publicity of his personal life in the media in a disproportionate or unreasonable way (e.g. an ex-convict who sees new articles appearing decades after the facts). Nevertheless, it has no dedicated legal ground and is usually invoked on the basis of a variety of legal frameworks (e.g. right of personal portrayal, defamation, general right to privacy, etc.). Given the potential conflict with the right to freedom of expression, the right has only been applied sporadically by courts. In any event, this right implies the presence of a (potential) imminent harm that can only be prevented by removing the information or at least preventing its (further) publicity.

The right to object and right to erasure can be found in the data protection framework. The rationale of these rights is not so much to prevent/withdraw the publication of one’s personal data, but rather to empower data subjects. Instead of rights to ‘stop me from speaking about you’, they are intended as a check on how personal data is used and allow individuals to control the use of their personal data over time. The right to object (article 14) can be invoked on the basis of compelling and legitimate grounds, relating to one’s particular situation. It should be emphasized, however, that this right only relates to a specific processing activity. When successfully exercised, the controller will not be allowed to process the personal data for the purposes objected to anymore. The same personal data might still be processed for different purposes as long as these other activities comply with the data protection framework. A social network, for example, will not be able to process my personal data for direct marketing anymore, but can still use it for other purposes (e.g. statistics, personalisation, etc.). The right to erasure (art. 12(b)), on the other hand, addresses the personal data itself. It can be invoked whenever the controller does not comply with the Directive, in particular because of the incomplete or inaccurate nature of the data. In other words, when the data subject can demonstrate the controller has violated any of its legal obligations under Directive 95/46, he/she can – depending on the facts – obtain the removal of his/her personal data. If successful, the data cannot be used for any other purpose.
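The structural contrast between the two rights can be sketched in a few lines: objection removes a single processing purpose, while erasure removes the data itself (and with it, every purpose). This is a hypothetical illustration only; the class, field and purpose names are invented.

```python
class ControllerRecord:
    """Toy model of a controller's record: some data plus the purposes it is used for."""

    def __init__(self, data: dict, purposes: set):
        self.data = data
        self.purposes = set(purposes)

    def object(self, purpose: str) -> None:
        # Right to object (art. 14): stop one specific processing activity;
        # the data itself remains available for the other purposes.
        self.purposes.discard(purpose)

    def erase(self) -> None:
        # Right to erasure (art. 12(b)): the data itself goes,
        # and with it every purpose it could serve.
        self.data = {}
        self.purposes.clear()

# The social-network example from the text: after a successful objection to
# direct marketing, the data may still be processed for statistics.
rec = ControllerRecord({"name": "example"}, {"direct_marketing", "statistics"})
rec.object("direct_marketing")
```

The design choice encoded here mirrors the summary below: the right to object operates at the level of a processing activity, the right to erasure at the level of the data.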

To summarise, whereas the right to object relates to a specific processing activity, the right to erasure relates to the data itself. The ‘right to be forgotten’ constitutes a catchy – though deceptive – policy goal.


Court of Justice Finally issues Judgment in Google Spain Case (C-131/12)

*This BlogPost is based on a piece written for the LSE Media Policy Blog and Internet Policy Review*


The Court of Justice of the European Union (CJEU) finally released its long-awaited judgment in the Google Spain (C-131/12) case. In short, the Court decided that individuals do have a right to request search engines to remove links to webpages when the individual’s name is used as a search query. The CJEU’s ruling cannot be appealed; the case is now referred back to the national court. Theoretically, it is still possible for Google to take the case to the European Court of Human Rights (based on article 10 ECHR) once the national court makes a final decision.

Although the Case is often referred to as the Right to be Forgotten Case, it does not hinge upon the similarly named provision in the proposed Data Protection Regulation. Instead, the main legal basis in this decision was the Data Protection Directive 95/46 (hereafter: ‘the Directive’), including the rights to object (art.14) and to erasure (art.12(b)). The case is particularly interesting because it lies at the intersection of data protection law and freedom of expression (a detailed discussion of this interaction is available here).


The facts of the case concerned a Spanish citizen who was subject to bankruptcy proceedings in the nineties. Spanish law dictated that announcements of the public auction following this bankruptcy were published in a local newspaper (La Vanguardia). In the late 2000s, the citizen discovered that links to this newspaper article appeared as the top results when entering his name into Google’s search engine. All requests directed to the newspaper to take down – or at least anonymise – the respective article were unsuccessful. After all, it had a legal obligation to publish the information in the first place. The Spanish data protection authority did rule, however, that Google should take down links to the article appearing upon a search for the individual’s name. The search engine appealed this decision and the case was brought before the Audiencia Nacional, which in turn referred three questions to the Court of Justice of the EU for a preliminary ruling.

Last June, Advocate General Jääskinen already issued an Opinion in this case, which in turn sparked a lot of academic debate. In this Opinion, the AG concluded that data subjects do not have a right to erasure vis-à-vis search engines with regard to information, published legally on third parties’ web pages (§.138 of the Opinion).



Given the complexity of the case and the nuanced wording of the decision, it will take many readings to form a more definitive opinion about this ruling. However, here are my first thoughts.

The CJEU was asked to answer three main questions, relating to (1) the territorial scope of the Directive; (2) the material and personal scope of the Directive; and (3) whether or not data subjects have a right to object/erasure when it comes to search engines directly.

Scope of Application

With regard to the first two questions, the Court was rather straight-forward. To the extent that ‘the operator of a search engine sets up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State’, the processing falls within the territorial scope of application of the Directive (art.4(1)a) (§.60).

Given the fact that search engines ‘collect’, ‘retrieve’, ‘record’, ‘organize’, ‘store’ and ‘make available’, they do process personal data, and thus fall within the material scope of application of the Directive (art.2(b)) (§.28-29).

The Court also specified that search engines’ activities can be distinguished from (and are additional to) those carried out by the original publisher(s). Hence, they should be considered controllers (art.2(d)) (§.41).


Right to be Forgotten?

The third category of questions that was presented to the CJEU, related to the so-called right to be forgotten and constitutes the most controversial aspect in this case. Some of the key issues are:

  • Limited scope of the judgment

First of all, it is important not to overemphasise the impact of this judgment on the right to freedom of expression (art. 11 Charter; art.10 ECHR). In this particular case, the request related specifically to the link between using an individual’s name as a search query and the search result referring to a particular webpage. In other words, even if the request is granted, the same webpage can still be reached through other – maybe more relevant – search terms.

  • No obligation to delete, but an obligation to balance

One should not conclude that any individual can now request search engines to delete links to webpages when their name is used as a search term. Instead, such requests will still have to comply with the requirements under article 12(b) (right to erasure) and/or article 14 (right to object). Put briefly, these provisions require a balance to be struck between opposing rights and interests (§.74; 76). Hence, the plaintiff will have to substantiate his/her request and, upon receiving such a request, the search engine will have to strike the necessary balance. If the search engine does not grant the request, the CJEU specified that ‘the data subject can bring the matter before the supervisory or judicial authority so that it carries out the necessary checks’ (§.77). In other words, search engines are not obliged to comply with takedown requests unless ordered to do so by a supervisory or judicial authority.

  • Independent responsibility of Search Engines

This observation ties back to the personal scope of the Directive. It was emphasised throughout the judgment that Google’s activities can clearly be distinguished from those of the original publishers. The potential harm or negative consequences vis-à-vis the data subject will in many cases not result from an obscure publication in a local online newspaper, but rather from the widespread (and often decontextualised) availability of the information through search engines. A logical consequence is that, even where the original content was published lawfully, data subjects will still be able to request the removal of links from search engines directly. It is important to distinguish this from potential requests directed to the original publisher (e.g. to remove or blur out his/her personal data) (§.39).

  • Over-responsibilisation?

Upon first reading, one could claim the judgment places too great a burden on search engines. After all, paragraph 38 specifically states that the operator of a search engine must comply with all the requirements of the Directive. It goes without saying that subjecting search engines to the full application of the Data Protection Directive gives rise to considerable concerns. On the other hand, the judgment does specify that search engines only need to comply with the Directive ‘within the framework of their responsibilities, powers and capabilities’ (§.38; 83). It is still too early, however, to predict how this will play out in practice.

  • Presumption that data subject’s rights trump all others

One of my most important concerns at this stage relates to the Court’s presumption that ‘data subject’s rights […] override, as a general rule, the interest of internet users…’ as well as the economic interests of the search engine operator itself (§.81). In other words, the Court seems to suggest that an imbalance of interests should be presumed, favouring privacy interests over all others. However, the Court does nuance this by stating that the balance might depend on the nature of the information, its sensitivity, the interest of the public, the role of the relevant individual in public life, etc. Needless to say, this wording is not conducive to legal certainty.


Today’s ruling by the Court of Justice in Google Spain undoubtedly raised many eyebrows. Surprisingly, it goes almost entirely against the Opinion of the Advocate General issued in June 2013. Nevertheless, it is still too early to draw general conclusions from the judgment. Even though at first glance it seems to considerably threaten freedom of expression/information interests, much of the wording turns out to be quite nuanced and limited in scope when looked at more closely. Additionally, the decision is based entirely on the existing legal framework (Directive 95/46). It is hard to predict how the judgment will interact with the future Data Protection Regulation, which is already being drafted.

Hosting Platforms after the Italian GoogleVideo Case – Data Controllers or not?

In its long-awaited judgment, the Italian Supreme Court ruled that Google Video could not be deemed a data controller with regard to the videos it hosts on its platform. As a result, it cannot be held responsible for the dissemination of these videos. The Court specified that the rules ‘presuppose actual decision-making power over (a) the purposes and means of the relevant processing (dissemination to the public); and (b) the balancing between different rights and interests at stake’. It can be deduced from the existing framework that this decision-making power depends on the existence of actual knowledge. In other words, Google Video only becomes responsible (a data controller) from the moment it is made aware. This interpretation, the Court explained, is in line with the eCommerce Directive (exemption of hosting providers and no general obligation to monitor, artt.14-15).

It is worth saying, however, that many processing activities are relevant in this context. The dissemination of the video (containing personal data) is one processing activity, for which the uploader should be considered controller. But, besides this, the video (and hence the personal data contained within) is potentially subject to many other processing activities as well (analysis for behavioural marketing purposes, facial recognition, etc.). With regard to this second strand of uses of the data, a strong argument can be made for the hosting platform to be the data controller. After all, they are determining the purpose and means of these specific activities.

Because, in the case at hand, it was mainly the activity of dissemination that was objected to, the original controller (the uploader) bears primary responsibility. But this should not overshadow the responsibilities of hosting platforms (and the like) for the plethora of other processing activities the data is subject to.


Hacking your Smart TV…

Last summer, at Black Hat 2013, a Korean researcher presented the vulnerabilities of increasingly popular ‘Smart TVs’ (>80 million units sold in 2012). His slides can be found here.

The following aspects are particularly worth mentioning from a privacy law perspective:

  • Attractive target for hackers:
    • Low security
    • Always powered/connected
    • Camera and microphone are a much more attractive target than a smartphone’s (which is often put away on a desk and moves around a lot)
    • Centrally located in household
    • Often many other antennas inside (Bluetooth, WiFi, etc.)
    • Presence of an App Store makes it easy to disseminate malware
    • Many different points for attack: physical, USB/other ports, remote control, broadcast signals, etc.
  • Issues:
    • TV is a Black Box
    • OS is very big
    • Very vulnerable to attacks via apps such as social networks
  • What can be done:
    • Hijacking TV programs
    • Key-Logging
    • Capturing TV screenshots
    • Sniffing network traffic
    • Stealing (financial) information
    • Capturing camera/mic feeds
    • All of this can be done in good quality and 24/7 (the researcher demonstrated that monitoring was still possible after the user ‘turns off’ the TV)

Yet another clear example of the need for creators of technology to take the principles of Privacy by Design, Data Security and Data Minimisation seriously.