Legal Shift: A New Era for Third-Party Credit Scores?
On December 7, 2023, the Court of Justice of the European Union (CJEU) delivered a groundbreaking judgment against SCHUFA, a German credit information agency. The CJEU determined that automatically calculated credit scores fall under automated decision-making, as defined by the EU General Data Protection Regulation (GDPR). Moreover, the CJEU declared that the GDPR establishes a fundamental prohibition on automated decision-making based on personal data.
This ruling has significant implications for the current practice wherein credit information agencies calculate credit scores for individuals and provide them to banks for use in the credit-granting process.
Given the vital role credit information agencies play in safeguarding banks from credit losses, it is imperative to analyse the repercussions of this judgment. The question arises: What are the potential consequences if credit scores, in their current form, can no longer be made available to banks?
Background – what is automated decision-making?
The concept of automated decision-making was introduced into EU law in 1995 through the Data Protection Directive (the ‘DPD’). EU Member States transposed Article 15 of the DPD into their national legal systems in considerably different ways. While some countries, such as Belgium, framed their national automated decision-making provisions as a qualified prohibition of automated decision-making, other countries, such as Sweden, saw the provision as requiring Member States to grant individuals a right to opt out of automated decision-making.
In May 2018, Article 15 of the DPD was replaced by Article 22 of the General Data Protection Regulation (the ‘GDPR’). For the GDPR’s automated decision-making provision to apply, three cumulative conditions must be met by the processing of personal data underlying the automated decision-making:
- the processing must underpin a decision;
- the decision must be based solely on automated processing, including profiling; and
- the decision must produce legal effects concerning the individual or similarly significantly affect them.
Like all other provisions of EU law, the GDPR is ultimately interpreted by the Court of Justice of the European Union (the ‘CJEU’). The question of the nature of automated decision-making, whether it is a principal prohibition (with exceptions) or a right that individuals must exercise in order for it to be effective, has been highly controversial since its inception in 1995.
What is a credit score?
A credit score is a probability assessment derived from personal data associated with an individual, specifically analysing their capacity to fulfil future payment obligations. Subsequently, this credit score is conveyed to a third party, such as a lender, credit institution, or bank. The third party utilises this probability value, often in conjunction with additional parameters, to make decisions regarding the approval, modification, or termination of a loan arrangement with the individual.
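Purely as a loose illustration of the mechanics described above, and not of how any real agency computes scores (the factors, weights, and thresholds below are invented), a credit score can be thought of as a probability estimate that a third party then combines with additional parameters of its own to reach a decision:

```python
import math


def toy_credit_score(payment_defaults: int, debt_ratio: float) -> float:
    """Hypothetical score: estimated probability (0-1) that the individual
    fulfils future payment obligations. Weights are invented for illustration;
    real scoring logic is proprietary."""
    z = 2.0 - 1.5 * payment_defaults - 3.0 * debt_ratio
    return 1.0 / (1.0 + math.exp(-z))  # logistic function -> probability


def lender_decision(score: float, requested_amount: float, income: float) -> bool:
    """The third party (lender) combines the probability value with its own
    additional parameters, here a crude affordability check."""
    affordable = requested_amount <= 5 * income
    return score >= 0.6 and affordable


score = toy_credit_score(payment_defaults=0, debt_ratio=0.2)
print(score, lender_decision(score, requested_amount=100_000, income=40_000))
```

The split between the two functions mirrors the division of roles at issue in the judgment: the score is produced by one party, while the formal decision on the loan arrangement is taken by another.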
Credit scores and credit information are governed by national provisions enacted by the legislatures of the Member States, such as the Swedish Credit Information Act (1973:1173). Any national provisions on credit information are enacted within the framework of the GDPR, since credit information relating to natural persons constitutes personal data. The national legislatures only have a mandate to regulate the use of personal data, and thus credit information, where the GDPR allows for such provisions.
To summarise, the GDPR allows the individual Member States a certain level of freedom in weighing the different interests involved: the legitimate interest of lenders in protecting themselves against credit losses, as well as the interest of borrowers in being granted the requested credit. At a societal level, the public interest in consumer protection and a stable economy must be taken into account, along with the fundamental rights to privacy and data protection.
Are credit scores that are automatically calculated considered decisions under the GDPR?
The question laid before the CJEU was whether the assignment of automatically calculated credit scores by a credit information agency constitutes automated decision-making. In other words, the question was not whether a lender’s use of the credit score to grant credit constitutes automated decision-making, but whether the creation of the credit score by the credit information agency does.
The credit information agency industry has argued that even a bad credit score, which would certainly prevent a person from concluding a large number of contracts (such as loans, insurance, rent or electricity supply contracts), is not a “negative decision”. According to the industry, the final decision is made by the lender, or other third party, using the credit score. This interpretation has also been adopted by multiple national legislatures when implementing national provisions on credit information. For example, the preparatory works of the Swedish Credit Information Act clearly state that the automated processes of credit information agencies do not constitute automated decision-making.
The CJEU did not share this interpretation. In the opinion of the CJEU, the concept of a decision should be given an extensive interpretation. Recital 71 of the GDPR provided the CJEU with useful examples of such decisions: automatically refusing online credit or job applications.
What the CJEU identified as a key problem with the credit agency industry’s argument was a ‘caught in the middle’ scenario. The credit score is created by an algorithm. The logic of the algorithm constitutes a trade secret of the credit information agency and is not disclosed alongside the credit score. To comply with the GDPR, the data controller, in this context the credit information agency, must provide meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject, i.e. the individual being scored. This information cannot be provided by the lender using the score, since credit information agencies usually do not share it with the users of the credit score. In Sweden, Klarna Bank was issued an administrative fine of €750,000 by the Swedish Data Protection Authority, in part because of its failure to provide such meaningful information.
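Purely as a sketch of what “meaningful information about the logic involved” could look like in practice (the GDPR does not prescribe any format, and real scoring logic is proprietary; the factors and weights below are invented), a controller might return per-factor contributions alongside the raw score so that the individual can see which inputs drove the result:

```python
# Hypothetical illustration: a score accompanied by human-readable
# per-factor contributions. Factors and weights are invented.
WEIGHTS = {"payment_defaults": -1.5, "debt_ratio": -3.0}
BASELINE = 2.0


def score_with_explanation(inputs: dict) -> dict:
    """Return a raw score together with each factor's contribution,
    so the scored individual can see and challenge the drivers."""
    contributions = {k: WEIGHTS[k] * v for k, v in inputs.items()}
    raw = BASELINE + sum(contributions.values())
    return {"raw_score": raw, "contributions": contributions}


result = score_with_explanation({"payment_defaults": 1, "debt_ratio": 0.4})
print(result["contributions"])  # shows which factor lowered the score most
```

The point of the sketch is the ‘caught in the middle’ problem itself: only the party that holds `WEIGHTS`, here standing in for the agency’s proprietary logic, can produce such an explanation; the lender receiving only the final score cannot.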
In conclusion, there was a risk that the individual could receive neither meaningful information about the logic involved nor the means to challenge the credit score. In the view of the CJEU, this would render the individual’s rights under the GDPR ineffective, an interpretation the CJEU could not approve. Thus, in the end, it was not the wording of Article 22 GDPR that formed the closing argument for the conclusion. Rather, the CJEU applied a teleological method of interpretation, taking into account the provision’s context and the objectives and purpose pursued by the GDPR.
It should however be noted that the CJEU based its judgment on the assumption that the lender “draws strongly” on the credit score in its decision. Thus, if the lender does not draw strongly on the credit score, there is a principled argument to be made that the credit score no longer constitutes a “decision” and that Article 22 GDPR does not apply.
The GDPR establishes a principal prohibition of automated decision-making
The first conclusion of the CJEU was that the assignment of automatically calculated credit scores by a credit information agency constitutes automated decision-making.
The second conclusion was that the GDPR establishes a principal prohibition of automated decision-making. As a consequence, for automated decision-making to be lawful, an exception in Article 22(2) of the GDPR must apply.
Although not stated explicitly by the CJEU, EU law is constructed in such a way that exceptions shall be construed restrictively while the general rule (the principle) shall be construed broadly. The establishment of a principal prohibition of automated decision-making therefore means that Article 22(2) shall be construed with caution, narrowing the possibilities of using automated decision-making.
There are three exceptions laid down in Article 22(2) of the GDPR. The prohibition does not apply if the decision:
- is necessary for entering into, or performance of, a contract between the data subject and a data controller;
- is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
- is based on the data subject’s explicit consent.
The CJEU did not rule on which exception applies to the assignment of automatically calculated credit scores. That question is to be assessed by the referring German court.
The key question in the context of automatically calculated credit scores is whether the credit score is necessary for entering into, or performance of, a contract between the data subject and a data controller. Note that the contract must be between the data subject and a data controller, but not necessarily the data controller operating the credit model. How this exception is to be construed will be of great interest moving forward, since the wording differs between the different language versions of the GDPR. The alternative is to rely on the data subject’s explicit consent.
Analysis and conclusions
The SCHUFA I judgment is truly a landmark ruling, creating a principal prohibition of the automated calculation of credit scores by credit information agencies. The judgment will have policy implications and legal implications, as well as potential practical challenges for lenders.
The policy implications are:
- The regulatory risk for credit information agencies has increased greatly overnight, and to differing extents in the different Member States. Regulatory risk translates into increased liability risk. Expect the cost of this regulatory risk to be transferred over time to lenders and other customers, i.e. it is probable that data will become more expensive.
- This judgment is aimed at credit information agencies and not at lenders such as banks. This is not to say that the judgment will not affect lenders. Note that a judgment from the CJEU clarifies how the law shall be construed. Thus, the court’s interpretation applies, for lack of a better word, retroactively to credit scores and data from before 7 December 2023. This interpretation is the law of the land across the entire EU.
- The GDPR allows the Member States to enact laws permitting automated decision-making under certain conditions. For example, the Swedish Parliament could enact a provision in the Swedish Credit Information Act allowing the automated calculation of credit scores. Thus, the ball is in the court of the national legislatures as well.
- Lenders should assess how their products or services integrate with the data from credit information agencies. What tools are available to ensure data legality – are agreements sufficient? Are monitoring or audits required? What level of indemnification is appropriate? What other data may be used for a credit decision?
A case-by-case analysis is required by lenders.
The legal implications are:
- Based on the GDPR, the explicit consent of natural persons to automated decision-making is the most likely option for allowing the creation of credit scores, without prejudice to the national provisions on credit information in each Member State.
- Lenders have the option to assess how the data provided by credit information agencies is processed. If the lender does not “draw strongly” on the credit score, there is an argument to be made that the credit score is not an automated decision.
- Lastly, a principal prohibition of automated decision-making will greatly affect the use of AI systems in the financial services industry. This judgment will serve as the benchmark for all automated models, not only credit scores, on the topic of risks for natural persons, such as the risks of “de-risking” and discrimination. Remember that even with the upcoming AI Act, automated decision-making on the basis of personal data will be governed by the GDPR.
The potential impact on lenders and other third parties relying on credit scores is uncertain; the greatest risk is that credit scores, in their current form, will no longer be provided by credit information agencies. The general advice is to establish a dialogue with the credit information agencies used, to gather their views on the impact of the CJEU ruling.
Until further legal rulings have clarified the GDPR requirements, which may take years, the summary below can form a starting point for discussion and analysis by users of credit scores.
- Recognise the increased regulatory risk for credit information agencies, potentially leading to changes in the process of retrieving credit scores.
- Anticipate a gradual transfer of regulatory risk to lenders, likely resulting in increased data costs.
- Understand the CJEU judgment’s retroactive impact on credit scores and data pre-dating 7 December 2023.
- Stay informed about potential changes to national provisions such as amendments to the Swedish Credit Information Act, influencing automated decision-making.
- Evaluate how products integrate with data from credit information agencies. What are the implications of various changes in the process of retrieving credit scores?
- Ensure agreements, monitoring, and audits are in place for data legality.
- Consider the impact of explicit consent for all credit scores.
- Understand the judgment’s broader impact on AI systems in the financial industry, serving as a benchmark for assessing risks for natural persons.
A historical note: the origins of the legal concept of automated decision-making can be traced back to a provision of French law from 1978, which served as the progenitor of the EU law concept.