CJEU rules on what constitutes “automated decision-making” under the GDPR
The questions
Does the production of a credit score constitute “automated decision-making” for the purposes of Article 22 of the EU General Data Protection Regulation (GDPR)? What wider impact will this have on AI technologies?
The key takeaway
In its decision in OQ v Land Hessen, SCHUFA Holding AG (Case C‑634/21), the Court of Justice of the European Union (CJEU) found that, where a credit score is transferred to a third party (eg a bank) and the third party relies heavily on it to establish, implement or terminate a contractual relationship (eg the granting of a loan), the processing used to produce that credit score constitutes “automated decision-making” for the purposes of Article 22 GDPR. The decision has significant implications for algorithmic decision-making, including its use within AI technologies.
The background
This case was initially referred to the CJEU by the Administrative Court of Wiesbaden in Germany following an appeal by a data subject who was refused a loan by her bank based on her negative credit score.
The negative credit score was produced by SCHUFA (a German credit reference agency) and provided to the data subject’s bank as part of the loan application process. The data subject then applied to SCHUFA to access the personal data used to produce her credit score and to request the erasure of certain allegedly incorrect personal data which SCHUFA held about her.
In response, SCHUFA informed the data subject of her credit score and broadly how it was calculated. However, citing trade secrecy, SCHUFA refused to disclose the factors it took into account in producing the data subject’s credit score and how each factor was weighted. Further, SCHUFA stated that it was its contractual partners (including the data subject’s bank) that actually made the decisions based on the credit scores which SCHUFA produced for them.
The development
Under Article 22 GDPR, data subjects have the right not to be subjected to a decision based solely on automated processing of personal data, including profiling, which produces legal or similarly significant effects for them. Data subjects also have the right to receive (under Articles 13 and 14 GDPR), and to request (under Article 15 GDPR), meaningful information about the logic involved in such processing, as well as its significance and envisaged consequences for them.
Given the above, the CJEU found that, in reality, the credit score produced by SCHUFA was a decision in and of itself which produced significant effects for the data subject. This is because the negative credit score produced by SCHUFA ultimately determined whether the data subject’s bank would grant her a loan.
The CJEU was also influenced by the fact that a contrary finding would leave a gap in the legal protection afforded to the data subject under the GDPR. If SCHUFA were found not to be engaged in “automated decision-making”, the data subject could not enforce her rights in respect of the decision against SCHUFA, because SCHUFA would not be the organisation making the decision; equally, the bank could not provide the information about the decision to the data subject, because it would not hold that information (which SCHUFA naturally would).
Why is this important?
The CJEU’s ruling is significant because, beyond its immediate relevance to those involved in producing or obtaining credit scores, the decision may also have a wider impact as organisations increasingly contract with service providers to support algorithmic decision-making, including within AI technologies. It means that organisations which provide automated calculations or processes to their customers, to assist them in making decisions with a legal, or similarly significant, effect for data subjects, may be caught by Article 22 GDPR even if they do not make the final decision themselves.
Any practical tips?
Although the CJEU’s ruling is not binding on the UK, it remains persuasive authority and is likely to inform how the UK courts and the Information Commissioner’s Office deal with claims from data subjects relating to automated decision-making processes in the future.
As such, organisations which provide automated calculations or processes to their customers, to assist them in making decisions with a legal, or similarly significant, effect for data subjects, should ensure that, where required, they are in a position to respond adequately to data subjects’ requests for information, and to review the information which they automatically produce to inform their customers’ decision-making processes. The implications for those planning to use AI technologies should not be underestimated.
Spring 2024