The Personal Data Equaliser

[Note: This post was originally published on the CiTiP Blog]

The concept of personal data – key in determining data protection law’s material scope of application – may seem pretty straightforward in the abstract. In practice, particularly when assessing the applicability of specific data subject rights, things get a lot murkier.

Personal Data – a Disharmonious Concept

It is a truism to say that personal data constitutes data protection law’s central building block. Indeed, personal data is the key factor in determining the framework’s applicability. Both Directive 95/46 and the upcoming General Data Protection Regulation (GDPR) are pretty concise in defining the concept: “any information relating to an identified or identifiable natural person.” No a priori distinction is made between different sources, types, formats, and so on. Only the contentious sub-category of ‘sensitive data’ explicitly enjoys special status. Personal data’s incredibly wide definition has been widely documented and criticised. Data protection law’s ‘information-agnosticism’ is also an important element separating it from the general right to privacy, the latter primarily covering more intimate data (or elements/activities) affecting an individual’s private sphere.

In reality, not all (personal) data are equal. Many legal texts specify or differentiate particular kinds of personal data because of the heightened risk they pose to individuals’ rights, interests or freedoms. Even though the data protection framework explicitly refers to some sub-categories of data – sensitive data, (online) identifiers, pseudonyms, traffic and location data – it appears unfeasible to devise an overall taxonomy for personal data. The many attempts that have been made – in privacy policies, by consultants and academics in different fields – fail to offer a satisfactory and comprehensive overview.

The few data categories explicitly or implicitly appearing throughout the GDPR are useful indicators for assessing the extent of rights and obligations. However, the applicability of data subject rights – and the right to erasure in particular – cannot be reduced to a mere qualification of the underlying data in one of these categories. Google’s and Facebook’s privacy policies differentiate personal data on the basis of its origin, its form and/or its function. But data can also be differentiated on the basis of its nature, sensitivity or visibility/obscurity. To complicate things even more, predefined categories often overlap in practice, further rendering nonsensical any attempt at straitjacketing personal data into predefined categories.

Still, the category of data will often impact the exercise of data subject rights. The rights to erasure and to object illustrate this quite well. Different data-types will have a different impact on the data subject and on the balancing exercise generally accompanying a request to erase/object. Sensitive data may be the most obvious example of a data category that will generally tip the balance in favour of the data subject. In short, there is clearly some merit in qualifying the relevant personal data in one way or another. In light of the concept’s incredible heterogeneity, however, attempts at developing a comprehensive ‘personal data taxonomy’ are doomed from the start.

Personal Data Equaliser

Instead of trying to come up with a data taxonomy – or even a more modest list of specific data categories – an alternative can be envisaged. From the perspective of exercising one’s data subject rights, it makes more sense to identify relevant variables on a case-by-case basis. These may relate to the data itself (e.g. accuracy, public interest, sensitivity, format), the source (e.g. voluntarily shared, inferred), the data subject (e.g. role in public life, child), time, context, etc. Each of these ‘variables’ – some of which correspond with categories in obsolete data taxonomies – should be seen as non-binary continuums.

By analogy, one could think of an audio equaliser, ubiquitous in eighties’ stereo sound-systems. Every slider represents a variable, impacting – to a greater or lesser extent – what comes out of the speakers. Like its audio counterpart, the ‘personal data equaliser’ comes with certain pre-sets. For certain situations or ‘data types’, there will be pre-defined defaults. Depending on the circumstances, certain sliders will be hardwired (e.g. format of the data, controller), whereas others might still be tweakable (e.g. visibility/obscurity). Crucially, determining the configuration of parameters is only possible a posteriori, when evaluating the applicability of data subjects’ rights in a particular case.
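The equaliser metaphor can be sketched as a small data structure. Everything below is a purely hypothetical illustration, not a formal model: the slider names, their positions and which sliders are fixed are assumptions invented for this sketch, roughly mirroring the variables mentioned above.

```python
# Illustrative sketch only: the 'personal data equaliser' is a metaphor.
# All slider names, values and pre-sets below are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Slider:
    name: str
    value: float             # position on a 0.0-1.0 continuum, not a binary flag
    hardwired: bool = False  # some sliders are fixed by the circumstances


@dataclass
class PersonalDataEqualiser:
    sliders: dict = field(default_factory=dict)

    def set(self, name: str, value: float) -> None:
        """Move a slider, unless the circumstances have hardwired it."""
        s = self.sliders[name]
        if s.hardwired:
            raise ValueError(f"slider '{name}' is fixed in this context")
        s.value = value


# A hypothetical pre-set for assessing an erasure request:
eq = PersonalDataEqualiser({
    "sensitivity":     Slider("sensitivity", 0.8),
    "public_interest": Slider("public_interest", 0.3),
    "format":          Slider("format", 1.0, hardwired=True),  # fixed by the facts
    "visibility":      Slider("visibility", 0.6),              # still tweakable
})
eq.set("visibility", 0.2)   # e.g. after delisting lowers the data's visibility
# eq.set("format", 0.5)     # would raise ValueError: this slider is hardwired
```

The point of the sketch is the a posteriori nature of the exercise: the object is only configured once the circumstances of a concrete case are known, and some parameters are then no longer adjustable.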

The Data Equaliser acknowledges the complexity of today’s information processing landscape. It recognises the impossibility of a priori determining the potential implications on an individual of one type of personal data or another. Today’s vast – and quickly expanding – data processing eco-system transforms seemingly trivial and/or anonymous data into personal data and vice versa. Unsurprisingly, determining the reach of data protection rights (notably, the right to erasure) is a tough exercise in the abstract. Though helpful indicators, the personal data categories defined by the legislator do not offer quick-and-easy answers either. The idea behind the ‘personal data equaliser’ recognises the messiness of data and the importance of looking at the particular circumstances of each individual case. It acknowledges the fluidity of ‘personal data’, depending on time and context.

Looking ahead, attempts at bringing more structure to the concept of personal data should focus on identifying potential variables rather than types of personal data. Such a functional approach will be much more valuable to the interpretation of data subject rights in practice.

Irish Case highlights different tiers of responsibilities regarding RtbF implementation

The Irish Times recently reported that: “Google has refused to remove search results that show the names of newly naturalised Irish citizens in the State’s official gazette, because of the Government’s “ongoing choice” to make the information public.”

Indeed, after the individuals approached the Irish DP Commissioner, the latter stated that the publication of information is mandated by a 1956 law.

Shortly after the Irish Times reported on these issues, the official government website publishing the names made a technical change in order to prevent indexing by search engines (while still maintaining the actual content).
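The article does not specify which technical change was made. As an illustration only, the two mechanisms commonly used to keep a page out of search results while leaving its content online are a `robots` meta tag in the page itself and an `X-Robots-Tag` HTTP response header; the helper function below is a hypothetical sketch of the latter.

```python
# Hypothetical sketch: common 'noindex' mechanisms that de-list a page from
# search engines without removing its content. We do not know which one the
# government site actually used.

# Option 1: a meta tag placed in the page's <head>.
NOINDEX_META = '<meta name="robots" content="noindex">'


# Option 2: an HTTP response header, set server-side.
def with_noindex_header(headers: dict) -> dict:
    """Return a copy of the response headers with an X-Robots-Tag directive added."""
    out = dict(headers)
    out["X-Robots-Tag"] = "noindex"
    return out


resp = with_noindex_header({"Content-Type": "text/html"})
```

Note that either approach only asks crawlers not to list the page; the underlying content remains publicly reachable, which is exactly the intermediate position the government site appears to have taken.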

The overlap with both the Google Spain Case and the still pending Manni Case is striking and demonstrates a broader issue. Though search engines clearly have some responsibility in the delisting process, this seems the right time for regulators and official public registers to proactively reassess their – often decades-old – publication/divulgation policies.

The divulgation or removal of information should not be seen as a binary, nor can responsibilities in this regard simply be imposed on one entity exclusively.

R2E – R2O – RtbF: What’s in a Name?

– Excerpt from Draft Article –

The Google Spain case has definitely added fuel to the fire of the so-called ‘Right to be Forgotten’ debate. Much of the discourse, however, has mixed up several related concepts. Hence, it is important (once again) to distinguish and clarify three important notions: the ‘right to be forgotten’, the ‘right to object’ and the ‘right to erasure’. First of all, the so-called ‘right to be forgotten’ is not mentioned in the Directive (and it has been removed from the latest DPR draft as well), but is rather used as a catchphrase in the rhetoric of different sides in the debate (pro and contra). In fact, the ‘right to be forgotten’ can be described as an umbrella term, aiming to encapsulate different important rights. Its very name already causes confusion by suggesting an obligation on third parties to ‘forget’. Instead, it is a rather clumsy translation into law of a broader policy goal. In this regard, it can be traced back to the French droit à l’oubli. This ‘right’ has traditionally been applied in situations where an individual was confronted with publicity of his personal life in the media in a disproportionate or unreasonable way (e.g. an ex-convict who sees new articles appearing decades after the facts). Nevertheless, it has no dedicated legal ground and is usually invoked on the basis of a variety of legal frameworks (e.g. right of personal portrayal, defamation, general right to privacy, etc.). Given the potential conflict with the right to freedom of expression, the right has only been applied sporadically by courts. In any event, this right implies the presence of (potential) imminent harm that can only be prevented by removing the information or at least preventing its (further) publicity.

The right to object and the right to erasure can be found in the data protection framework. The rationale of these rights is not so much to prevent/withdraw the publication of one’s personal data, but rather to empower data subjects. Instead of rights to ‘stop me from speaking about you’, they are intended as a check on how personal data is used and allow individuals to control the use of their personal data over time. The right to object (article 14) can be invoked on the basis of compelling and legitimate grounds relating to one’s particular situation. It should be emphasised, however, that this right only relates to a specific processing activity. When successfully exercised, the controller will no longer be allowed to process the personal data for the purposes objected to. The same personal data might still be processed for different purposes for as long as these other activities comply with the data protection framework. A social network, for example, will no longer be able to process my personal data for direct marketing, but can still use it for other purposes (e.g. statistics, personalisation, etc.). The right to erasure (art. 12(b)), on the other hand, addresses the personal data itself. It can be invoked whenever the controller does not comply with the Directive, in particular because of the incomplete or inaccurate nature of the data. In other words, when the data subject can demonstrate that the controller has violated any of its legal obligations under Directive 95/46, he/she can – depending on the facts – obtain the removal of his/her personal data. If the request is successful, the data cannot be used for any other purpose.

To summarise, whereas the right to object relates to a specific processing activity, the right to erasure relates to the data itself. The ‘right to be forgotten’ constitutes a catchy – though deceptive – policy goal.


Court of Justice Finally issues Judgment in Google Spain Case (C-131/12)

*This BlogPost is based on a piece written for the LSE Media Policy Blog and Internet Policy Review*


The Court of Justice of the European Union (CJEU) finally released its long-awaited judgment in the Google Spain (C-131/12) case. In short, the Court decided that individuals do have a right to request search engines to remove links to webpages when the individual’s name is used as a search query. This ruling cannot be appealed, and the case is now referred back to the national court. Theoretically, it is still possible for Google to take this case to the European Court of Human Rights (based on article 10 ECHR) once the national court makes a final decision.

Although the Case is often referred to as the Right to be Forgotten Case, it does not hinge upon the similarly named provision in the proposed Data Protection Regulation. Instead, the main legal basis in this decision was the Data Protection Directive 95/46 (hereafter: ‘the Directive’), including the rights to object (art.14) and to erasure (art.12(b)). The case is particularly interesting because it lies at the intersection of data protection law and freedom of expression (a detailed discussion on this interaction is available here).


The facts of the case concerned a Spanish citizen who was subject to bankruptcy proceedings in the nineties. Spanish law required that the public auction following this bankruptcy be announced in a local newspaper (La Vanguardia). In the late 2000s, the citizen discovered that links to this newspaper article appeared as the top results when entering his name into Google’s search engine. All requests directed to the newspaper to take down – or at least anonymise – the respective article were unsuccessful. After all, it had the legal obligation to publish the information in the first place. The Spanish data protection authority did rule, however, that Google should take down links to the article when the individual’s name is entered. The search engine appealed this decision and the case was brought before the Audiencia Nacional, which in turn referred three questions to the Court of Justice of the EU for a preliminary ruling.

Last June, Advocate General Jääskinen already issued an Opinion in this case, which in turn sparked a lot of academic debate. In this Opinion, the AG concluded that data subjects do not have a right to erasure vis-à-vis search engines with regard to information, published legally on third parties’ web pages (§.138 of the Opinion).



Given the complexity of the case and the nuanced wording of the decision, it will take many readings to form a more definitive opinion about this ruling. However, here are my first thoughts.

The CJEU was asked to answer three main questions, relating to (1) the territorial scope of the Directive; (2) the material and personal scope of the Directive; and (3) whether or not data subjects have a right to object/erasure when it comes to search engines directly.

Scope of Application

With regard to the first two questions, the Court was rather straightforward. To the extent that ‘the operator of a search engine sets up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State’, the processing falls within the territorial scope of application of the Directive (art.4(1)a) (§.60).

Given the fact that search engines ‘collect’, ‘retrieve’, ‘record’, ‘organize’, ‘store’ and ‘make available’, they do process personal data, and thus fall within the material scope of application of the Directive (art.2(b)) (§.28-29).

The Court also specified that search engines’ activities can be distinguished from (and are additional to) those carried out by the original publisher(s). Hence, they should be considered controllers (art.2(d)) (§.41).


Right to be Forgotten?

The third category of questions that was presented to the CJEU, related to the so-called right to be forgotten and constitutes the most controversial aspect in this case. Some of the key issues are:

  • Limited scope of the judgment

First of all, it is important not to overemphasise the impact of this judgment on the right to freedom of expression (art. 11 Charter; art.10 ECHR). In this particular case, the request related specifically to the link between using an individual’s name as a search query and the search result referring to a particular webpage. In other words, even if the request is granted, the same webpage can still be reached through other – maybe more relevant – search terms.

  • No obligation to delete, but an obligation to balance

One should not conclude that any individual can now request search engines to delete links to webpages when their name is used as a search term. Instead, such requests will still have to comply with the requirements under article 12(b) (right to erasure) and/or article 14 (right to object). Put briefly, these provisions require a balance to be struck between opposing rights and interests (§.74; 76). Hence, the plaintiff will have to substantiate his/her request and, upon receiving such a request, the search engine will have to strike the necessary balance. If the search engine does not grant the request, the CJEU specified that ‘the data subject can bring the matter before the supervisory or judicial authority so that it carries out the necessary checks’ (§.77). In other words, search engines are not obliged to comply with takedown requests, unless a supervisory or judicial authority orders them to do so.

  • Independent responsibility of Search Engines

This observation ties back to the personal scope of the Directive. It was emphasised throughout the judgement that Google’s activities can clearly be distinguished from those of the original publishers. The potential harm or negative consequences vis-à-vis the data subject will in many cases not result from an obscure publication in a local online newspaper, but rather from the widespread (and often decontextualised) availability of the information through search engines. A logical consequence is that even though the original content is published lawfully, data subjects will still be able to request the removal from search engines directly. It is important to distinguish this from potential requests directed to the original publisher (e.g. to remove or blur out his/her personal data) (§. 39).

  • Over-responsibilisation?

Upon first reading, one could claim the judgment puts too big a burden on search engines. After all, paragraph 38 specifically states that the operator of a search engine must comply with all the requirements in the Directive. It goes without saying that subjecting search engines to the full application of the data protection Directive gives rise to considerable concerns. On the other hand, the judgment does specify that search engines only need to comply with the Directive ‘within the framework of their responsibilities, powers and capabilities’ (§.38; 83). It is still too early, however, to predict how this will play out in practice.

  • Presumption that data subject’s rights trump all others

One of my main concerns at this stage relates to the Court’s presumption that ‘data subject’s rights […] override, as a general rule, the interest of internet users…’, as well as the economic interests of the search engine operator itself (§.81). In other words, the Court seems to suggest that an imbalance of interests should be presumed, favouring privacy interests over all others. However, the Court does seem to nuance this by stating that the balance might depend on the nature of the information, its sensitivity, the interest of the public, the role of the relevant individual in public life, etc. Needless to say, this wording is not conducive to legal certainty.


Today’s ruling by the Court of Justice in Google Spain undoubtedly raised many eyebrows. Surprisingly, it almost entirely goes against the Opinion of the Advocate General issued in June 2013. Nevertheless, it is still too early to draw general conclusions from the judgement. Even though at first glance it seems to considerably threaten freedom of expression/information interests, much of the wording is very nuanced and limited in scope when looked at more closely. Additionally, the decision is entirely based on the existing legal framework (Directive 95/46). It is hard to predict how the judgment will interact with the future data protection Regulation, which is currently being drafted.

The Right to be Forgotten – It’s about time, or is it?

[Brief summary of my presentation at the CPDP 2014 panel on “Timing the Right to be Forgotten”. Slides: See Below]

The panel took a really refreshing perspective on the Right to be Forgotten debate. So I was glad to take this opportunity to look more closely at what role ‘time’ actually plays in the legal framework relevant to the so-called ‘Right to be Forgotten’.

In short, the presentation aimed to identify some of the relevant legislation and case-law, with a particular focus on the general right to privacy and the data protection framework.

Terminological Issue – Over the past few years, the so-called ‘Right to be Forgotten’ seems to have been used as some sort of umbrella term to refer to different situations and different legal regimes (general right to privacy, right to personal portrayal, data protection, defamation, etc).

General Right to Privacy – When looked at in the context of the general right to privacy (art. 8 ECHR), it is usually applied to shield individuals from being confronted with certain aspects of their past in a disproportionate, unfair or unreasonable way (classic example: an ex-convict who is confronted with his/her past in the media, years after the facts). Because it is primarily invoked in situations where an individual’s personal life is publicly exposed, usually by the media, a careful balancing exercise with other fundamental rights will be imperative. One of the key criteria in making this balance will often be how much time has passed. In the Österreichischer Rundfunk v Austria Case, for example, the ECtHR specified that the lapse of time since a conviction and release constitutes an important element in weighing an individual’s privacy interests against the public’s interest in publication. But, in another case, concerning the publication of a book by the private doctor of former French President Mitterrand, the Court held that the lapse of time was an argument in favour of the public’s interests over the privacy and medical confidentiality protections of the ex-President.

Data Protection Law – When based on the data protection framework, the right to be forgotten – or rather the right to erasure – seems to be more mechanical and straightforward. At least in theory. Under the current Directive, the right can be invoked when the data processing “does not comply with the provisions of the Directive, in particular because of the incomplete or inaccurate nature of the data” (art.12). In other words, it looks like the data subject could invoke his/her right to erasure when the controller fails to fulfil its obligations or ignores data subjects’ rights. Keeping in mind the concept of ‘time’, three of the most relevant elements probably are (1) the purpose specification and use limitation principle, (2) the need for a legitimate ground and (3) the data subject’s right to object.

The purpose specification principle actually constitutes some sort of benchmark against which the processing of personal data will be assessed over time. Besides having to be specific and explicit, the purpose also has to be legitimate. It goes without saying that the legitimacy of the purpose of processing can evolve over time, depending on a variety of factors. On top of that, over time the personal data might become unnecessary, irrelevant or inadequate to achieve the original (or a compatible) purpose (for more information, check the Article 29WP Opinion 2/2013 on Purpose Limitation).

Secondly, the processing activities will continuously have to be tested against the legitimacy grounds in article 7 of the Directive. This is particularly relevant when the processing is based on the last legitimacy ground, which requires a careful balance to be struck between all rights and interests at stake. These might, of course, evolve over time as well.

Thirdly, in principle the right to erasure can also be invoked when the data subject has successfully exercised his/her right to object. In order to exercise one’s right to object, it is necessary to put forward compelling and legitimate grounds (relating to one’s particular situation). It goes without saying that these grounds can include a variety of factors, among which time is one.

In the still pending Google Spain Case before the Court of Justice of the EU, for example, one of the primary arguments of the original plaintiff was the passing of time. The National Court explained that today it is possible to create very detailed personal profiles in just a couple of clicks, with information that used to be difficult to find. The lack of territorial and temporal limitations on the dissemination of information constitutes a danger to the protection of personal data. The Court further specified that originally lawful and accurate personal data may become outdated over time in the face of new events. Some of this information might actually cause social, professional or personal harm to the individual.

Finally, a few words about the draft Data Protection Regulation. Article 17 on the Right to be Forgotten and to Erasure – already rebranded to the pre-existing right to erasure – specifically aims to give (back) some control to data subjects over their data. Without going into detail on this provision (which does not add that much to the existing regime, but rather emphasises existing rights and obligations), it is worth highlighting that the article does refer to the concept of ‘time’ in paragraph 7. This provision stipulates that the controller should “implement mechanisms to ensure that the time limits established for the erasure of personal data […] are observed”. The Regulation also requires these time limits to be specified in the information provided to data subjects (art. 14(1)(c)).

Concluding. First of all, technology makes it ever easier to store and find old information. Just think of the digitisation of old archives, facial recognition, geo-tagging, etc. This trend evidently upsets an increasing number of individuals. Depending on the relevant facts in each case, a number of legal frameworks might be used to request that certain information be removed. The general right to privacy seems to be used particularly in situations where private information is made public (again) by the media. From ECtHR (and national) case-law it can be deduced that the time factor can either play in favour of removing the information (when deemed irrelevant, see the Österreichischer Rundfunk v Austria Case) or in favour of keeping the information available (when it has entered the public domain or when the information is of particular relevance in light of current events, see Aleksey Ovchinnikov v. Russia and Editions Plon v. France). In any case, it seems that of all the legal frameworks that might be applicable, data protection law in particular constitutes an increasingly attractive route to take. Not only does it have a broad scope of application, but unlike most other regimes, it does not require falsehood, malicious intent or even widespread publicity.

Regardless of what legal regime is used, it seems that in virtually all of these cases, a balance of interests and rights will have to be struck. And in quite a few situations time will be a relevant factor to take into account. To give yet another recent example, it is worth referring to the Advocate General’s opinion in the DRI & Seitlinger Case before the Court of Justice (C‑293/12; C‑594/12), released just last month. In this Opinion, the AG explicitly claimed that the Data Retention Directive is incompatible with the Charter of Fundamental Rights. One of the reasons he put forward was that the Directive does not respect the principle of proportionality, in requiring data retention for up to two years. Although the Directive’s ultimate objective is perfectly legitimate, the AG argued, there is no justification for extending the data retention period to anything beyond one year.

So, in short, it seems that the passing of time can be used to argue both ways – for or against removal. The importance of ‘time’ in determining the merits of removing information will be different in each individual case, but should not be overestimated either. Eventually, time will just be another factor in assessing the balance of rights and interests.