CJEU’s first ruling on Article 22 GDPR: ‘credit scoring’ is an automated decision

Summary

On 7 December 2023, the Court of Justice of the European Union (CJEU) gave judgment in Case C-634/21 SCHUFA, the first ruling to determine the scope of the right not to be subjected to automated decision-making (ADM) in Article 22 of the General Data Protection Regulation (GDPR).

Every data lawyer (and aren’t we all data lawyers now?) should know about it, whatever other field of law they work in, since ADM is now an established part of society, from the financial sector through employment to local government functions.

Article 22 prohibits certain forms of ADM but only where a decision is “based solely on automated processing”. This qualification has long been a concern. Did it mean, for example, that the provision did not operate where there was a “human in the loop” but their involvement was only “light touch”?

The CJEU has taken an expansive approach, finding that Article 22 applies whenever ADM-generated information significantly influences a decision. And although post-Brexit the GDPR has been replaced by the UK GDPR, the material provisions in Article 22 remain the same, so this judgment – though not binding precedent – certainly has very significant persuasive value.

In this blog we highlight the facts before the CJEU, explain its ruling and then discuss the case’s implications.

The Facts

SCHUFA is a German credit reference agency: a private company which provides third-party contractual partners with information on the creditworthiness of individuals. It does so by automatically generating a prediction of a person’s future behaviour (a ‘score’), such as the likelihood of them repaying a loan. Such a score is the result of applying algorithms to data relating to the relevant individual. This is known as “profiling” and, for many sectors of society, it is controversial.

OQ brought the claim. She was an individual who was refused a loan by a third party as a result of a negative score which SCHUFA produced. OQ asked SCHUFA to send her information on the personal data used and to erase some of that data, which was allegedly incorrect.

SCHUFA informed OQ of her score and outlined, in broad terms, the methods for calculating the scores. However, referring to trade secrecy, it refused to disclose the various elements taken into account for the purposes of that calculation and their weighting.

OQ asked the competent German authority to order SCHUFA to grant her request for access to information and erasure. The German authority refused and OQ appealed.

The appeal court referred questions to the CJEU.

It asked whether the automated generation of a probability value concerning the ability of a data subject to service a loan in the future was ADM within the meaning of Article 22(1) GDPR, if that value was transmitted by the controller to a third-party controller and the latter drew strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject.

Indeed, the court noted that “where a loan application is sent by a consumer to a bank, an insufficient probability value leads, in almost all cases, to the refusal of that bank to grant the loan applied for” (paragraph 48).

That last element – the credit score and the weight placed upon it – was particularly significant for the judgment.

The Legal Framework

Let’s turn now to the provisions of Article 22 GDPR and note the focus on the words “a decision based solely on automated processing”.

Under Article 22 of the GDPR, entitled ‘Automated individual decision-making, including profiling’:

‘1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

2. Paragraph 1 shall not apply if the decision:

(a)   is necessary for entering into, or performance of, a contract between the data subject and a data controller;

(b)   is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or

(c)   is based on the data subject’s explicit consent. …’

For completeness, we also highlight Article 4(4) GDPR which defines profiling as:

‘any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.’

The Decision

The CJEU highlighted that to constitute ADM, Article 22(1) required three cumulative conditions to be met, namely:

(1)   there must be a ‘decision’; and

(2)   that decision must be ‘based solely on automated processing, including profiling’; and

(3)   it must produce ‘legal effects concerning [the interested party]’ or ‘similarly significantly [affect] him or her’.

The Court determined that in this instance all three conditions were met.  

As to condition (1), the concept of decision should have “broad scope” (paragraph 45) and was “capable of including a number of acts which may affect the data subject in many ways” (paragraph 46).

As to (2), it was clear that SCHUFA used profiling to produce its score (paragraph 47).

There is nothing surprising about these two conclusions.

In relation to (3), the court, having noted that “where a loan application is sent by a consumer to a bank, an insufficient probability value leads, in almost all cases, to the refusal of that bank to grant the loan applied for”, concluded that condition (3) was also fulfilled (paragraph 48). The CJEU went on to say that:

“… it must be stated that the third condition to which the application of Article 22(1) of the GDPR is subject is also fulfilled, since a probability value such as that at issue in the main proceedings affects, at the very least, the data subject significantly.

 It follows that, in circumstances such as those at issue in the main proceedings, in which the probability value established by a credit information agency and communicated to a bank plays a determining role in the granting of credit, the establishment of that value must be qualified in itself as a decision producing vis-à-vis a data subject ‘legal effects concerning him or her or similarly significantly [affecting] him or her’ within the meaning of Article 22(1) of the GDPR.” (Emphasis added, paragraphs 49 & 50)

This meant that the credit scoring system which SCHUFA deployed was, in principle, prohibited by the GDPR since it produced “a decision based solely on automated processing”.

The decision is consistent with the Opinion of Advocate General Pikamäe. He discussed the relationship between the credit score and the loan decision and recognised how significant this score would be.  His proposed solution was that where “personal data of the data subject, is transmitted by the controller to a third-party controller and the latter, in accordance with consistent practice, draws strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject” Article 22 was engaged. 

Comment

Beware! The CJEU, in refusing a narrow approach to what constitutes ADM, has recognised how much reliance is now placed on ADM information and how little humans really interrogate its outputs. Its approach was to focus on the effect of the automation to determine whether the prohibitions in Article 22 apply. We think that a UK court would – and should – take a similar approach.

The judgment is therefore an important indicator that parties who sit within the value chain and provide ‘advisory scores’ may well fall foul of the UK GDPR (unless an exception applies) even if they do not ultimately determine the use or outcome of their scores.

While the focus of the judgment was on credit scores, the reasoning behind this decision would be applicable across a variety of areas, such as health. For example, it could apply to a score determining whether to offer a person health insurance (or indeed at what cost).

Of course, the current text of the Data Protection and Digital Information Bill (DPDIB) proposes to amend Article 22 (see clause 14) and replace it with three new Articles 22A–C. However, we think that the lesson of this case should not be lost. The prohibition on ADM is an important right in a world where data determines so much about our lives. The proposed Article 22A explains that “a decision is based solely on automated processing if there is no meaningful human involvement in the taking of the decision”. This approach comes from the Article 29 Working Party guidelines on automated decision-making, which suggest that a decision will be “based solely on automated processing” where there is no “meaningful” human involvement in the decision-making process, as noted in the Explanatory Memorandum to the Data Protection Act 2018 (see [113]).

The CJEU obviously agrees – and this judgment explains what that really means!

This blog was written by Robin Allen KC, Dee Masters, and Grace Corby from Cloisters chambers. They specialise in the law at the interplay between artificial intelligence, discrimination, data and human rights. They are currently drafting a Bill for the TUC to regulate artificial intelligence (which may include ADM) in the employment field.
