Author: Iurie Cojocaru
As we all know, “personal data” means any information relating to an identified or identifiable natural person.
So how do you identify someone? By name or through an image, one would answer. But what happens when we do not know the person’s name or image, yet we are able to track, analyze and influence that person’s behavior? Would we not limit the scope of data protection legislation too much if we denied protection to those whose names and faces are unknown, but whose behavior is being monitored? Would we not exclude situations in which data protection legislation must legitimately apply?
On the other hand, an overly broad notion of “personal data” may place an excessive burden on controllers and processors, who would need to take additional security measures and comply with additional requirements for information that has not been treated as “personal data” before.
In September 2023, the Court of Justice of the European Union (CJEU) will begin hearings in the IAB Europe case (C-604/22), which may shed light on the interpretation of the notion of “personal data”.
In this case, a Belgian court asked the CJEU, among other things, whether “a character string that captures the preferences of an Internet user in connection with the processing of his or her personal data in a structured and machine-readable manner constitutes personal data”, by itself or in combination with an IP address.
The character string mentioned above is called the “TC String” (from “Transparency and Consent String”) by IAB Europe (a European-level association for the digital marketing and advertising ecosystem) and consists of a combination of letters, numbers, and other characters.
The TC String is meant to capture, in a structured and automated way, the preferences of a user when he or she visits the website or app of a publisher (i.e., a company that owns a website or an app with advertising space) that has integrated a consent management platform. In particular, it records the consent or refusal to consent to the processing of personal data for marketing and other purposes, the consent to sharing personal data with third parties (adtech vendors), and the exercise of the right to object. Vendors decode the TC String to determine whether they have the right to process the user’s personal data for the specified purposes.
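To make the mechanism more tangible, here is a minimal, purely illustrative sketch of how a set of yes/no choices might be packed into a compact, machine-readable string and decoded again. The purpose names, the bit layout and the encoding below are invented for illustration only; the real TC String follows IAB Europe’s Transparency and Consent Framework specification, which is considerably more detailed.

```python
import base64

# Purely illustrative: a simplified stand-in for a consent string, NOT the
# actual IAB TCF "TC String" format. Each (hypothetical) purpose gets one bit;
# the bits are packed into bytes and base64url-encoded into a compact string.
PURPOSES = ["store_access_info", "basic_ads", "personalised_ads_profile",
            "personalised_ads", "measure_ad_performance"]

def encode_choices(choices: dict) -> str:
    bits = "".join("1" if choices.get(p, False) else "0" for p in PURPOSES)
    padded = bits.ljust((len(bits) + 7) // 8 * 8, "0")            # pad to full bytes
    raw = bytes(int(padded[i:i + 8], 2) for i in range(0, len(padded), 8))
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")     # compact string

def decode_choices(token: str) -> dict:
    raw = base64.urlsafe_b64decode(token + "=" * (-len(token) % 4))
    bits = "".join(f"{byte:08b}" for byte in raw)
    return {p: bits[i] == "1" for i, p in enumerate(PURPOSES)}

token = encode_choices({"basic_ads": True, "measure_ad_performance": True})
print(token)                  # e.g. 'SA' -- opaque to a human, readable to a vendor
print(decode_choices(token))  # what a vendor would check before processing data
```

The point of the illustration is that the string itself contains no name and no IP address, yet it is built precisely to steer how an individual user’s data may be processed.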
Debates on the interpretation of the notion of “personal data” are not new. Back in 2016, even before the GDPR became applicable, the CJEU ruled in the Breyer case (C‑582/14) that a dynamic IP address registered by an online media services provider that makes a website accessible to the public (in that case, the German State) constitutes personal data in relation to that provider, if the provider has legal means to identify the individual with the help of additional data held by the internet service provider about that individual.
In that case, the German public authorities collected, through logfiles, information about access operations on their websites. Such information included the name of the web page or file accessed, the terms entered in search fields, the time of access, the amount of data transferred, confirmation that the access was successful, and the IP address of the computer from which access was attempted.
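A hypothetical sketch may help illustrate the Breyer reasoning: the website operator’s log alone contains only a dynamic IP address, and identification becomes possible only by combining it with the subscriber records held by the internet service provider. All names, values and field layouts below are invented.

```python
from datetime import datetime

# Hypothetical illustration: the website operator's access log has a dynamic
# IP address but no name; only the internet service provider (ISP) holds the
# mapping from (IP address, time window) to a subscriber.
website_log = [
    {"ip": "203.0.113.7", "time": datetime(2014, 6, 1, 10, 15),
     "page": "/press-release.pdf", "search_terms": "data protection"},
]

isp_assignments = [  # held by the ISP, not by the website operator
    {"ip": "203.0.113.7", "from": datetime(2014, 6, 1, 9, 0),
     "to": datetime(2014, 6, 1, 12, 0), "subscriber": "P. Breyer"},
]

def identify(entry, assignments):
    """Linking is only possible for whoever can lawfully obtain `assignments`."""
    for a in assignments:
        if a["ip"] == entry["ip"] and a["from"] <= entry["time"] <= a["to"]:
            return a["subscriber"]
    return None

print(identify(website_log[0], isp_assignments))  # 'P. Breyer' -- only with ISP data
```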
We may deduce that, in the Court’s view, a dynamic IP address does not, by itself, constitute personal data. It must be correlated with other information (e.g., a name) held by the internet service provider; thus, an online behavioral profile lacking such additional information would not, on its own, qualify as personal data.
The former Article 29 Working Party (WP), since replaced by the European Data Protection Board (EDPB), a European body bringing together representatives of the EU data protection authorities, seemed to take a different approach. In its Opinion 2/2010 on online behavioural advertising, the WP held, in the context of tracking cookies, that “behavioural advertising is based on the use of identifiers that enable the creation of very detailed user profiles which, in most cases, will be deemed personal data.”
So, if, according to the WP, the information collected through tracking cookies constitutes personal data, why would information tracking online behavior not be personal data unless correlated with the additional details held by the internet service provider (as the CJEU ruled in Breyer)?
The discussion is further complicated by the notion of “singling out” used in recital (26) of the GDPR.
The EU Regulation states: “To determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly.”
The WP, in its Opinion 4/2007 on the concept of personal data, is very categorical about this: “At this point, it should be noted that, while identification through the name is the most common occurrence in practice, a name may itself not be necessary in all cases to identify an individual. This may happen when other “identifiers” are used to single someone out. Indeed, computerised files registering personal data usually assign a unique identifier to the persons registered, in order to avoid confusion between two persons in the file. Also on the Web, web traffic surveillance tools make it easy to identify the behaviour of a machine and, behind the machine, that of its user. Thus, the individual’s personality is pieced together in order to attribute certain decisions to him or her. Without even enquiring about the name and address of the individual it is possible to categorise this person on the basis of socio-economic, psychological, philosophical or other criteria and attribute certain decisions to him or her since the individual’s contact point (a computer) no longer necessarily requires the disclosure of his or her identity in the narrow sense. In other words, the possibility of identifying an individual no longer necessarily means the ability to find out his or her name. The definition of personal data reflects this fact.”
The Irish data protection authority, in its 2019 Guidance on Anonymisation and Pseudonymisation, gives further details. According to the authority, singling out “occurs where it is possible to distinguish the data relating to one individual from all other information in a dataset. This may be because information relating to one individual has a unique value; such as in a data set which records the height of individuals, where only one person is 190cm tall, that individual is singled out. It might also occur if different data related to the same individuals is connected in the data set and one individual has a unique combination of values. For example, there might be only one individual in a dataset who is 160cm tall and was born in 1990, even though there are many others who share either the height or year of birth.”
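The mechanics of this second form of singling out can be shown with a short, hypothetical sketch: even with no names in the dataset, any record whose combination of values is unique stands apart from all the others. The dataset below is invented for illustration.

```python
from collections import Counter

# Hypothetical dataset with no names: height (cm) and year of birth only.
records = [
    {"height": 160, "birth_year": 1990},
    {"height": 160, "birth_year": 1985},
    {"height": 175, "birth_year": 1990},
    {"height": 175, "birth_year": 1990},
    {"height": 190, "birth_year": 1985},
]

# A record is "singled out" when its combination of values is unique.
combos = Counter((r["height"], r["birth_year"]) for r in records)
singled_out = [r for r in records if combos[(r["height"], r["birth_year"])] == 1]

print(singled_out)
# Three records are distinguishable from everyone else -- including the only
# person who is both 160cm tall and born in 1990 -- even though the dataset
# contains no name at all.
```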
So, considering the above, if I have a table with names, dates of birth and heights, and I replace the names with codes and share this table with another company, does that mean the recipient will receive personal data?
Hold on. We have another development in this story.
Several years ago, in the course of a bank resolution procedure, the Single Resolution Board (SRB), the central resolution authority in the EU Banking Union, received a number of comments from the shareholders and creditors of an EU bank. The SRB filtered the comments and replaced the commentators’ names with alphanumeric codes. It then transmitted the comments, bearing only the alphanumeric codes, to an external valuator. Only the SRB could link the comments, via the alphanumeric codes, to the identities of the commentators; the valuator did not have access to the initial database.
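As a rough illustration of this kind of pseudonymization (with invented names and comment text, and a simplified code-generation step), the procedure might look like the following: names are replaced by random alphanumeric codes, and only the sender retains the table linking codes to identities.

```python
import secrets

# Hypothetical sketch of an SRB-style pseudonymization: comments are stripped
# of names and tagged with random alphanumeric codes; only the sender keeps
# the code-to-identity mapping, the recipient receives codes and text alone.
comments = [
    {"name": "Shareholder A", "comment": "The valuation underestimates assets."},
    {"name": "Creditor B",    "comment": "The write-down was not justified."},
]

key_table = {}  # retained by the sender (here: the SRB) and never shared
for c in comments:
    code = secrets.token_hex(4).upper()   # e.g. '9F3A1C02'
    key_table[code] = c["name"]
    c["code"] = code
    del c["name"]

print(comments)    # what the external valuator would receive: codes + comments
print(key_table)   # the additional information that allows re-identification
```

The legal question that follows is precisely whether the first printout, on its own and in the recipient’s hands, is still personal data.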
Several shareholders and creditors filed complaints with the European Data Protection Supervisor (EDPS), the EDPB’s sibling body, which supervises data processing by EU institutions and bodies. The EDPS carried out an investigation and decided that the information transmitted by the SRB to the external valuator constituted personal data and that the SRB had breached its data protection obligations (the EDPS found, among other things, that the SRB had not provided the individuals with an adequate privacy notice in this context).
The case was brought before the European General Court (EGC – the court within the CJEU system that hears actions brought against EU institutions), which delivered its judgment in April 2023.
Before the court, the SRB argued that the external valuator had no possibility of re-identifying the data subjects who submitted the comments. The EDPS, on the other hand, argued that the transmitted data was merely pseudonymized and therefore still personal data. According to the EDPS, it was not necessary to determine whether the persons who made the comments were re-identifiable by the external valuator: pseudonymized data remains personal data even when transmitted to another party that does not hold the additional information.
To reach its decision, the EGC referred to the Breyer case mentioned above, indicating that, in order to decide whether the information transmitted to the external valuator constitutes personal data, it is necessary to put oneself in the position of that valuator and determine whether the transmitted information relates to “identifiable persons”.
Thus, the EGC compared the situation of the valuator in SRB v EDPS with that of the online media services provider in the Breyer case and found that the received comments did not constitute information relating to an “identified natural person”, in so far as the alphanumeric codes appearing in the comments did not allow the identification of the persons who made them. The EGC went on to state that the SRB’s position was comparable to that of the internet service provider in Breyer, since the SRB held the additional information needed to identify the individuals. Considering all this, the EGC decided that it was for the EDPS to determine whether it was possible for the valuator to re-identify the authors of the comments. In other words, the EDPS had to determine whether the valuator could combine the information received with the additional information held by the SRB. Since the EDPS had not investigated this, it could not conclude that the information received by the valuator constituted information on an “identifiable natural person”. Consequently, the EGC annulled the EDPS decision. The EDPS has lodged an appeal against this ruling.
It appears that we have one team at EU and national levels that believes it is sufficient to single out a person from a group to consider that person identifiable, even if we are not reasonably likely to find out the person’s identity. The other team says that the possibility of practical identification is of utmost importance.
The problem is that controllers and processors are caught in the middle of this debate and must work out how to apply data protection rules in a way that meets the expectations of all these courts, authorities and bodies, which, for the moment, do not share a single, unitary approach.
We hope that the CJEU ruling in the IAB Europe case, in which hearings are scheduled to begin in September 2023, will help create a uniform approach to this issue, although, based on past experience, the EU court tends to create more complications rather than offer effective solutions to problems.