Comments on the Article 29 Working Party’s proposal of “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679”

Nov 28, 2017



Dear Madam or Sir,

 

First of all, we would like to express many thanks for the opportunity to provide comments on the “Guidelines on Automated individual decision-making and Profiling” (hereinafter “the Guidelines”). Spolek pro ochranu osobních údajů (Data Protection Association) is the largest organization bringing together professionals in the area of personal data protection and future data protection officers in the Czech Republic.

We are grateful to have been given the opportunity to review the Article 29 Working Party’s proposed Guidelines. However, we would like to respectfully suggest some additions and refinements to the Guidelines.

We have especially considered the following issues:

  1. Scope of individual rating or classification during profiling,
  2. Definition of solely automated processing and scope of potential human intervention,
  3. Unclear interpretation of the “general prohibition” in Article 22,
  4. Definition of “a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”,
  5. Necessity for entering into, or performance of, a contract.

Scope of individual rating or classification during profiling

Chapter I.A. of the Guidelines states: “The GDPR says that profiling is automated processing of personal data for evaluating personal aspects, in particular to analyse or make predictions about individuals. Therefore simply assessing or classifying individuals based on characteristics such as their age, sex, and height could be considered profiling, regardless of any predictive purpose.”

We believe that merely assessing or classifying an individual based on very simple characteristics such as age, gender and height should not be considered profiling.

The GDPR defines profiling in Art. 4/2 as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.

We assume that the intention of the legislator was not to include the mere assessing or classifying of an individual based on very simple characteristics such as age, gender and height in the definition of profiling, since this information is a much more general category of data than “performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”. The approach outlined above would mean that almost any activity related to the processing of personal data would be considered profiling. The definition of profiling itself would then be meaningless, since it would apply to almost all cases of processing. We take the view that the above-mentioned assessing based on characteristics such as age, gender and height should rather be considered basic segmentation, which should be distinguished from profiling as such.

If the legislator had intended to include basic categories such as “age, gender and height” in the definition of profiling, it would probably have done so by including them in the above-mentioned enumeration, which highlights the categories of data to be given special consideration.

Solely automated processing and scope of potential human intervention

The Guidelines state their interpretation of the term “based solely on automated processing” used in Art. 22: “Article 22(1) refers to decisions ‘based solely’ on automated processing. This means that there is no human involvement in the decision process.” and “The controller cannot avoid the Article 22 provisions by fabricating human involvement. For example, if someone routinely applies automatically generated profiles to individuals without any actual influence on the result, this would still be a decision based solely on automated processing. To qualify as human intervention, the controller must ensure that any oversight of the decision is meaningful, rather than just a token gesture. It should be carried out by someone who has the authority and competence to change the decision. As part of the analysis, they should consider all the available input and output data.”


We believe that the last sentence above (“As part of the analysis, they should consider all the available input and output data.”) is an overly extensive interpretation that deviates unreasonably from the original wording of Article 22 GDPR.

We fully agree with the interpretation of WP29 that the controller cannot avoid the provisions of Article 22 of the GDPR by fabricating human involvement, and that the supervision over an automated decision should not be merely symbolic or simulated. However, if the Guidelines state that all available inputs and outputs must be considered, this further raises the level of quality required of human supervision. Taking into account all available inputs and outputs calls for an expert assessment of all the circumstances of the case. By contrast, routine human decision-making in everyday situations may not always be based on an assessment of all available inputs and outputs, but rather on an assessment of the limited scope of inputs and outputs that is normally used under similar circumstances and that is in practice considered sufficiently relevant. The processing of all available inputs and outputs would, moreover, contradict the principle of data minimization.

The interpretation of the “general prohibition” in Article 22

WP29 considers Article 22/1 a general prohibition of solely automated individual decisions with a significant effect. This means that the controller should not undertake the processing described in Article 22/1 unless one of the exceptions under Article 22/2 applies. However, the text of Article 22/1 and the “prohibition” is unclear, especially when compared to other provisions of the GDPR, such as Article 9/1. We believe that the “general prohibition” under Article 22 is not as strict as interpreted by WP29. Therefore, an interpretation should be considered which takes into account the real position of Article 22. In practice, the outcome of automated processing and the resulting decision will very often be in line with the real situation, and a subsequent review by a human will not alter this. It is therefore unreasonable to insist that human intervention must always be included in the decision-making process, especially where the decision is not definitive and a delay in issuing the opposite decision, after human involvement based on the objection of the data subject, would not have any significant impact on the data subject.

Definition of “a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”

The Guidelines give examples of such processing drawn from the marketing and advertising world, and state that there can be an obvious impact on particular social groups in such cases. We respectfully propose to remove these examples from the Guidelines due to their inappropriateness. We believe that this interpretation interferes with the specific regulation of the advertising industry in both consumer protection law and national law. For example, exposing people in financial difficulties to adverts for online gambling should be addressed through consumer protection law rather than personal data protection law, since such profiling does not deviate from common advertising profiling and there is no reason to apply the restrictions of Art. 22 in such cases.

The Guidelines also contain some other unclear and controversial examples. For example, on page 10 there is an example of automated decision-making in the form of automatic disconnection from mobile phone service for breach of contract because the data subject has forgotten to pay their bill before going on holiday. We would suggest reconsidering whether this is an example of automated decision-making at all, as it is based primarily on the act or omission of the data subject, and the decision made by the controller is strictly determined by the data subject’s behavior. In addition, there could be interference with private law (i.e. whether and under what circumstances performance of the contract may be refused or suspended, which is primarily a question of civil law), and therefore such examples should be carefully chosen and reconsidered in order to respect the principle of coherence of legal rules.

Necessity for entering into, or performance of, a contract

Although the wording of Article 22/2/a of the GDPR is almost identical to Article 6/1/b or f, we believe that the “necessity” under Article 22 should be assessed less strictly and separately, solely from the point of view of automated decision-making under this Article. Consideration should therefore be given to real economic practice, especially in large enterprises with many thousands of (potential) customers and many running contracts, where customer rights in the process of negotiation or performance of a contract would be sufficiently protected by the mere possibility of an objection by the data subject and human intervention under Article 22/3.

We are very grateful for the opportunity to provide the above-mentioned comments on the Guidelines.

Yours sincerely,

JUDr. Vladan Rámiš, Ph.D.                            
Chairman of the Committee                          

Ing. Václav Mach
Vice Chairman of the Committee
