We received unwanted data. How do we handle them?

05.04.2021 - Usually, we talk about the accidental loss of data, one of the most common concerns of both controllers and processors when thinking about data breaches.


Iurie Cojocaru and Roxana Ionescu

However, there may be cases where it is not the loss that is accidental, but the collection of data. In fact, it is not always even an accident: sometimes the data subject simply provides more data than you would expect. Some would call this the “unpredictable” or “incidental” collection of data.

The big question is whether the controller must comply with all the formalities for data collected in such an accidental-unpredictable-incidental manner.

Of course, in the good old days, one would try to build an argument by appealing to the four building blocks of Art. 29 Working Party Opinion 4 of 20 June 2007 on the concept of personal data (Personal Data Opinion).

To put it briefly, the Working Party stressed in 2007 that, in order for a piece of information to qualify as “personal data”, it must meet four cumulative criteria, one of which is the “relating to” criterion.

The “relating to” criterion implies that, in order for a piece of information to relate to an individual (let’s call him X), that information must have at least one of the following elements:

  • information must be given about X (the “content” element);
  • information must be used with the purpose of evaluating, treating in a certain way or influencing the status or behavior of X (the “purpose” element);
  • the use of the information must be likely to have an impact on the rights and interests of X (the “result” element).

If none of these three elements is met, the information is not “personal data” with regard to X, because it does not “relate” to X.

This mechanism may have been used in order to justify that certain incidentally collected information does not represent personal data.

Of course, the above explanation of the “relating to” building block is an oversimplified summary of the Personal Data Opinion’s take on the matter. In fact, the wording of the Working Party in its Personal Data Opinion is so elaborate and open to interpretation that, compared to it, the answers of the Oracle of Delphi may seem “transparent, intelligible and easily accessible” while “using clear and plain language”.

Luckily, the European Data Protection Board (the Board), the successor of the Art. 29 Working Party, has come up with a more flexible and practical approach, even if the Personal Data Opinion is still in force.

So, how does the Board see incidentally collected data?

The recent Board Guidelines 2 of 9 March 2021 on Virtual Voice Assistants (VVA Guidelines) offer the following short answer:

“If a data controller becomes aware (e.g. due to quality review processes) of the accidental collection of personal data, they should verify that there is a valid legal basis for each purpose of processing of such data. Otherwise, the accidentally collected data should be deleted.”

At point 30 of the VVA Guidelines, the EDPB develops this idea by referring to the case of a VVA service that is accidentally activated by an individual unwillingly using a valid wake-up expression. A valid consent would be necessary in this case, yet the accidental use of the wake-up expression cannot be interpreted as an “unambiguous indication of the data subject’s wishes” (one of the conditions for a valid consent), so the collection of data takes place without consent. Therefore, the controller must check whether another valid ground for processing applies. If not, the accidentally collected data must be deleted.
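To picture the EDPB reasoning operationally, here is a minimal sketch, assuming a hypothetical record structure and a controller-maintained register of legal bases; the names (AccidentalRecord, has_valid_legal_basis, handle_accidental_collection) are illustrative, not taken from any guideline or product.

    from dataclasses import dataclass, field

    @dataclass
    class AccidentalRecord:
        """Hypothetical record of personal data collected without being asked for."""
        record_id: str
        purposes: list[str] = field(default_factory=list)
        legal_bases: dict[str, str] = field(default_factory=dict)  # purpose -> documented legal basis

    def has_valid_legal_basis(record: AccidentalRecord, purpose: str) -> bool:
        # The substantive legal assessment (consent, contract, legitimate interest, etc.)
        # is made and documented by the controller; the code only checks that an
        # assessment has been recorded for the given purpose.
        return purpose in record.legal_bases

    def handle_accidental_collection(record: AccidentalRecord) -> str:
        # VVA Guidelines logic: keep the data only if every purpose of processing
        # rests on a valid legal basis; otherwise delete the data.
        if record.purposes and all(has_valid_legal_basis(record, p) for p in record.purposes):
            return "keep (legal basis documented for each purpose)"
        return "delete (no valid legal basis for at least one purpose)"

    # A voice snippet captured after an accidental wake-up, with no legal basis documented:
    print(handle_accidental_collection(AccidentalRecord("vva-0001", purposes=["service improvement"])))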

Looked at from a different angle, the EDPB position leads to the conclusion that, at least for a while, you may store certain accidentally collected personal data without undergoing all the needed formalities with regard to such data.

That may sound like a relief for a number of controllers. But for how long may the data be kept in such a way?

The French Data Protection Authority (CNIL) says, in its guidance on chatbots issued on 19 February 2021 (CNIL Chatbot Guidance), that controllers must implement purging systems in order to delete the “unpredictable” (in French: “imprévisible”) data provided by users while interacting with chatbots.

In this context, the purging must take place “immediately or at least periodically”. However, the guidance does not indicate what “periodically” means, leaving the assessment to the controllers.
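In practice, “immediately or at least periodically” usually translates into a scheduled purge job. The sketch below is only an illustration under assumed conditions: a hypothetical store of messages already flagged as containing unpredictable data, and a 30-day interval chosen purely as an example, not as a value prescribed by the CNIL.

    from datetime import datetime, timedelta, timezone

    # Hypothetical in-memory store of chatbot messages flagged as containing
    # "unpredictable" data; a real system would query its own database.
    flagged_messages = [
        {"id": "msg-1", "flagged_at": datetime(2021, 2, 1, tzinfo=timezone.utc)},
        {"id": "msg-2", "flagged_at": datetime(2021, 3, 20, tzinfo=timezone.utc)},
    ]

    PURGE_INTERVAL = timedelta(days=30)  # example interval, to be set and documented by the controller

    def purge_unpredictable_data(now: datetime) -> list[str]:
        """Delete flagged messages older than the chosen interval and return their ids."""
        cutoff = now - PURGE_INTERVAL
        purged = [m["id"] for m in flagged_messages if m["flagged_at"] < cutoff]
        flagged_messages[:] = [m for m in flagged_messages if m["flagged_at"] >= cutoff]
        return purged

    # A scheduler (cron job, task queue, etc.) would call this periodically.
    print(purge_unpredictable_data(datetime(2021, 4, 5, tzinfo=timezone.utc)))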

CNIL, in line with the EDPB standpoint in its VVA Guidelines, sets forth that controllers do not have to obtain the consent of users in advance just to cover the possibility of collecting unpredictable sensitive data.

Another relatively recent text which addresses the collection of unpredictable sensitive data is the EDPB Guidelines 3 of 10 July 2019 on processing of personal data through video devices (EDPB Video Surveillance Guidelines):

“Video footage showing a data subject wearing glasses or using a wheel chair are not per se considered to be special categories of personal data. However, if the video footage is processed to deduce special categories of data Article 9 applies.” (points 61-62 of the EDPB Video Surveillance Guidelines).

In other words, the EDPB Video Surveillance Guidelines indicate that, while video surveillance systems may collect certain data which may reveal information of a highly personal nature and even special categories of data, what matters is the purpose for which such data are used. If the collected data are used to deduce sensitive data under Article 9 GDPR, then the requirements of that Article 9 apply. If not, the controller will be able to skip the Article 9-related requirements, but the rest of the rules applicable to non-sensitive data would still apply.

The data minimization principle must be observed by controllers even if they do not intend to capture sensitive data. According to the EDPB, one application of this principle is to try to minimize the risk of capturing images revealing sensitive data. How would you do this with your CCTV system? Well, probably first of all you will have to make sure that the cameras are not placed in areas where such sensitive data are revealed most often (e.g., dressing rooms, medical examination rooms).

Getting back to the CNIL Chatbot Guidance, the French authority requires controllers, also in application of the data minimization principle, to inform chatbot users in advance that they should refrain from providing sensitive data. This approach is usually adopted in practice in other fields as well (e.g., websites which have a “write to us” section). In any case, the controller cannot get away with this announcement alone. As previously said, CNIL requires controllers to ensure the purging of unpredictable data even if the prior announcement on the scope of data has been provided to users.

While the aforesaid EDPB and CNIL approach may be a helping hand for controllers struggling to meet the GDPR rules, there are still certain aspects left unanswered.

First of all, based on the CNIL provision on periodic purging, how often must the periodic purging of incidental data take place? Every few days? Every few weeks? Every few months? And on which factors should such an interval depend?

And what happens with the data between the moment of collection/identification and the moment of deletion? The controller may not need to obtain consent for the incidentally collected sensitive data, but what about the rest of the formalities? Which of them must be complied with and which not? For example, if someone exercises the right of access during this period, are those data covered by the scope of the request? It is an interesting topic and we look forward to the developments to follow.

And in any case, we hope to see new, flexible and practical EDPB guidance on personal data, which would replace the 2007 Personal Data Opinion and which may offer more clarity to controllers in the qualification and processing of personal data.

For now, we leave you with brief suggestions on how to start dealing with incidental personal data:

  • make sure your internal procedures and processes consider the possibility of receiving unwanted personal data;
  • assess the situations when such incidental personal data collection may occur;
  • identify if there are means to discourage the provision of unwanted personal data;
  • define purging measures for the incidental personal data, including intervals for implementation, as sketched below (even a broader interval is better than none at all, if the authority comes calling).
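As an illustration of that last point, here is a minimal configuration sketch, assuming per-channel intervals; the channel names and figures are arbitrary examples to be replaced by the controller’s own documented choices, not values taken from the EDPB or the CNIL.

    from datetime import timedelta

    # Hypothetical purging intervals for incidentally collected data, per collection channel.
    INCIDENTAL_DATA_PURGE_POLICY = {
        "voice_assistant_recordings": timedelta(days=1),
        "chatbot_free_text": timedelta(days=7),
        "contact_form_free_text": timedelta(days=30),
    }

    def purge_interval_for(channel: str) -> timedelta:
        """Return the documented purge interval for a given collection channel."""
        return INCIDENTAL_DATA_PURGE_POLICY[channel]

    print(purge_interval_for("chatbot_free_text"))  # 7 days, 0:00:00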
