Austrian court mandates greater transparency in automated credit decisions

Company must disclose meaningful details about algorithmic logic after Federal Administrative Court rules on a GDPR violation.

Modern courtroom with digital data overlays representing Austrian GDPR automated decision transparency ruling

The Austrian Federal Administrative Court ruled on May 28, 2025, that a credit information agency violated GDPR requirements by failing to provide sufficient transparency about its automated decision-making processes. The case marks one of the first national applications of the European Court of Justice's February guidance on algorithmic transparency rights.

In case W108 2230691-1, the court partially upheld a data protection complaint filed on December 19, 2018. The ruling established that the credit agency, now operating under a different name following a merger completed on September 29, 2021, had breached Article 15 GDPR by providing inadequate information about processing purposes and automated decision-making logic.

The complainant had requested comprehensive information about personal data processing after being subjected to an automated credit assessment. While the credit agency provided a three-page response on November 5, 2018, the court determined this disclosure fell short of GDPR requirements in two critical areas.

First, the agency failed to provide sufficiently specific information about processing purposes under Article 15(1)(a) GDPR. The court found that general statements about "providing credit information" and "providing marketing information" lacked the precision required for compliance verification. According to the decision, the agency processed data from multiple sources, including address publishers and direct marketing companies, but failed to clearly indicate which specific data served which purposes.

The ruling emphasized that when processing serves multiple purposes, controllers must clarify which data categories support each specific objective. This becomes particularly important when certain data types face usage restrictions, as marketing information from address publishers can only be used for marketing purposes under Section 151(6) of the Austrian Trade Act.
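The per-category disclosure the court demanded can be pictured as a simple record-of-processing structure. The sketch below is purely illustrative: the categories, sources, and purpose labels are invented, not taken from the agency's actual register.

```python
# Hypothetical record-of-processing sketch: each data category is mapped to the
# specific purposes it serves, the kind of disclosure the court found missing.
# Categories, sources, and purposes are invented for illustration only.

PROCESSING_REGISTER = [
    {"category": "identity data", "source": "public registers",
     "purposes": ["credit information", "marketing information"]},
    {"category": "marketing data", "source": "address publishers",
     # Restricted under Section 151(6) Austrian Trade Act: marketing only.
     "purposes": ["marketing information"]},
]

def purposes_for(category: str) -> list[str]:
    """Return the disclosed purposes for a given data category."""
    for entry in PROCESSING_REGISTER:
        if entry["category"] == category:
            return entry["purposes"]
    return []

print(purposes_for("marketing data"))
```

A blanket statement like "providing credit information" collapses this mapping; the court's point is that the purpose list must be answerable per data category, especially where national law restricts how a category may be used.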

Second, the court found violations of Article 15(1)(h) GDPR regarding automated decision-making transparency. The agency had denied conducting automated decision-making under Article 22 GDPR, claiming no such processes existed. However, the court determined that automated score calculation does constitute automated decision-making when third parties rely substantially on these scores for contract decisions.

The February 27, 2025 CJEU ruling in case C-203/22 established that automated generation of creditworthiness scores constitutes automated decision-making under Article 22 GDPR when these scores significantly influence whether third parties conclude, perform or terminate contracts. The Austrian court applied this precedent directly to the case at hand.

Under the new transparency requirements, the credit agency must explain the procedures and principles actually applied during automated processing. The court specified that companies must describe how personal data was used in automated decision-making processes in ways that enable data subjects to understand the logic involved.

The CJEU had clarified that meaningful information about automated decision-making logic must enable data subjects to effectively exercise their rights under Article 22(3) GDPR, including challenging automated decisions. This requires explanations that go beyond technical algorithm descriptions to provide practical understanding of decision-making processes.

The Austrian court rejected the credit agency's claims about trade secrets protecting algorithmic details. The agency had argued that revealing scoring methodology would compromise business secrets worthy of protection. However, the court noted that the agency failed to provide specific evidence about which information constituted legitimate trade secrets.

Following the CJEU guidance, the court explained that when controllers claim trade secret protection, they must submit allegedly protected information to competent authorities for evaluation. The supervisory authority or court must then balance competing rights and interests to determine appropriate disclosure levels. The credit agency had not followed this procedure despite repeated requests from the court.

The ruling revealed specific scoring information that had previously been undisclosed. The agency had provided scores of 2.02 and a "traffic light score" of 2 to requesting companies in 2016 and 2018, indicating "very good creditworthiness." These scores emerged only during court proceedings, contradicting earlier claims that no specific scoring data existed for the complainant.

The court ordered the credit agency to provide compliant information within four weeks or face execution proceedings. This must include sufficiently specific details about processing purposes under Article 15(1)(a) GDPR and meaningful information about automated decision-making logic under Article 15(1)(h) GDPR.

The decision comes during heightened scrutiny of automated decision-making across Europe. The UK recently modernized its data protection framework to address automated decision challenges, while studies show significant accuracy issues in AI-powered systems affecting marketing and business decisions.

For marketing professionals, this ruling reinforces the importance of transparency in algorithmic systems that affect consumers. The decision particularly impacts companies using automated scoring for advertising personalization, customer segmentation or conversion optimization. These systems must now provide meaningful explanations about decision-making logic when requested by data subjects.

The court's decisive approach contrasts with recent criticism of European data protection authorities for insufficient GDPR enforcement. Notably, the original Austrian data protection authority had itself failed to decide within the required six-month timeframe, leading to the case's transfer to the Federal Administrative Court.

Industry experts suggest this ruling will influence automated decision-making practices across financial services, advertising technology and customer analytics platforms. Companies relying on algorithmic systems for customer-facing decisions should review their transparency procedures and prepare detailed explanations of processing logic for potential data subject requests.

The court emphasized that meaningful information about automated decision-making must enable data subjects to understand which personal data was used and how this influenced specific outcomes. This standard requires more detailed explanations than companies have typically provided in response to data subject access requests.
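That standard, showing which data was used and how it moved the outcome, is easiest to grasp as a per-factor breakdown accompanying the score. The following is a minimal sketch under invented assumptions: a hypothetical linear model on a 1 (best) to 5 (worst) scale, with made-up feature names and weights; it is not the agency's actual methodology.

```python
# Hypothetical linear credit-scoring model that returns, alongside the score,
# a per-factor contribution breakdown of the kind of "meaningful information"
# the ruling contemplates. All feature names and weights are invented.

WEIGHTS = {
    "missed_payments": 0.4,           # each missed payment worsens (raises) the score
    "open_credit_lines": 0.05,        # more open lines slightly raise the score
    "address_stability_years": -0.05, # longer stability slightly lowers the score
}
BASE_SCORE = 2.5  # hypothetical baseline on a 1 (best) to 5 (worst) scale

def score_with_explanation(subject_data: dict) -> dict:
    """Return a score plus the contribution of each input factor."""
    contributions = {}
    score = BASE_SCORE
    for feature, weight in WEIGHTS.items():
        contribution = weight * subject_data.get(feature, 0)
        contributions[feature] = round(contribution, 2)
        score += contribution
    return {"score": round(score, 2), "contributions": contributions}

result = score_with_explanation({
    "missed_payments": 0,
    "open_credit_lines": 2,
    "address_stability_years": 8,
})
print(result)
```

Disclosing the contributions dictionary rather than only the final number is what lets a data subject see how each piece of personal data influenced the outcome, and therefore what to contest under Article 22(3).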

The ruling also addresses the intersection of GDPR transparency requirements with business confidentiality concerns. While the court acknowledged legitimate trade secret interests, it established that these cannot override fundamental data protection rights without proper judicial evaluation of competing interests.

Legal practitioners note the decision's potential implications for similar cases across the European Union. The direct application of CJEU guidance in C-203/22 demonstrates how national courts will interpret and enforce enhanced transparency requirements for automated decision-making systems.

Companies should expect increased scrutiny of algorithmic transparency practices as data protection authorities implement the CJEU guidance. The Austrian ruling provides concrete examples of inadequate disclosure practices and establishes specific requirements for compliant information provision under GDPR Article 15.

Timeline

  • November 5, 2018: Credit agency provides initial three-page response deemed insufficient
  • December 19, 2018: Original data protection complaint filed against credit agency
  • January 13, 2020: Complainant files administrative delay complaint after authority fails to decide
  • April 29, 2020: Austrian Data Protection Authority transfers case to Federal Administrative Court
  • September 29, 2021: Credit agency merges with another company, affecting case parties
  • February 11, 2022: Vienna Administrative Court submits preliminary questions to CJEU
  • July 1, 2024: Austrian Federal Administrative Court suspends proceedings pending CJEU decision
  • February 27, 2025: CJEU delivers guidance in case C-203/22 on automated decision-making transparency
  • May 28, 2025: Austrian Federal Administrative Court issues final ruling requiring enhanced disclosure

Summary

Who: Austrian Federal Administrative Court ruled against a credit information agency in a GDPR compliance case involving a data subject seeking transparency about automated decision-making.

What: The court found violations of GDPR Article 15 requirements for providing sufficient information about processing purposes and meaningful details about automated decision-making logic used in credit scoring.

When: The ruling was issued on May 28, 2025, following a case that began with a data protection complaint filed on December 19, 2018.

Where: The decision was made by the Austrian Federal Administrative Court (Bundesverwaltungsgericht) in Vienna, Austria, with case number W108 2230691-1.

Why: The court determined that the credit agency failed to provide adequate transparency about its automated scoring processes and processing purposes, violating data subjects' rights to understand how their personal data influences automated decisions affecting their access to financial services.

PPC Land explains

GDPR (General Data Protection Regulation): The European Union's comprehensive data protection law that became effective on May 25, 2018, establishing stringent requirements for how organizations process personal data. In this case, GDPR Article 15 provides data subjects with the right to obtain information about their personal data processing, including details about automated decision-making processes. The regulation requires controllers to provide transparent, accessible explanations that enable individuals to understand and challenge decisions affecting them.

Automated decision-making: Process where decisions are made entirely through technological means without meaningful human intervention, as defined under GDPR Article 22. The Austrian court determined that credit scoring systems constitute automated decision-making when third parties rely substantially on algorithmic outputs for contract decisions. This interpretation follows the February 2025 CJEU guidance establishing that automated generation of creditworthiness probabilities qualifies as automated decision-making under EU law.

Article 15: The GDPR provision granting data subjects comprehensive access rights to information about their personal data processing. Article 15(1)(a) requires controllers to disclose specific processing purposes, while Article 15(1)(h) mandates meaningful information about automated decision-making logic, scope, and intended effects. The Austrian ruling demonstrates how courts interpret these requirements to ensure data subjects can effectively exercise their rights to challenge algorithmic decisions.

Credit agency: An organization that collects, analyzes, and distributes information about individuals' and businesses' creditworthiness to help lenders assess financial risk. In this case, the Austrian credit information agency processed personal data from multiple sources to generate automated credit scores for third-party clients. The court found that such agencies must provide detailed explanations of their scoring methodologies when data subjects exercise their access rights under GDPR.

Trade secrets: Confidential business information that provides competitive advantages, protected under EU Directive 2016/943 on trade secret protection. The credit agency argued that revealing scoring algorithms would compromise protected trade secrets. However, the Austrian court established that trade secret claims cannot automatically override GDPR transparency requirements without proper judicial evaluation of competing interests between business confidentiality and individual data protection rights.

Processing purposes: The specific, legitimate reasons why personal data is collected and used, which must be clearly defined and communicated to data subjects under GDPR Article 5(1)(b). The court found the credit agency's general statements about "providing credit information" and "marketing information" insufficient because they failed to specify which data categories served which purposes. This distinction becomes crucial when different data sources have usage restrictions under national laws.

Federal Administrative Court: Austria's specialized administrative court system (Bundesverwaltungsgericht) responsible for reviewing decisions by administrative authorities, including data protection enforcement actions. The court gained jurisdiction in this case after the Austrian Data Protection Authority failed to decide within the statutory six-month deadline. This transfer mechanism ensures data subjects can obtain judicial resolution even when regulatory authorities experience procedural delays.

CJEU (Court of Justice of the European Union): The EU's highest court responsible for interpreting European law and ensuring uniform application across member states. The February 27, 2025 ruling in case C-203/22 established binding precedent requiring enhanced transparency in automated decision-making systems. National courts like Austria's Federal Administrative Court must apply CJEU interpretations when deciding similar cases, creating consistent enforcement standards across the European Union.

Meaningful information: The GDPR standard requiring controllers to provide explanations about automated decision-making that enable data subjects to understand the logic involved and effectively exercise their rights. The CJEU clarified that meaningful information must describe procedures and principles in ways that allow individuals to comprehend how their personal data influenced specific outcomes. This standard requires more than technical algorithm descriptions, demanding practical explanations accessible to average data subjects.

Data subjects: Individuals whose personal data is processed by organizations, entitled to various rights under GDPR including access, rectification, erasure, and objection. In automated decision-making contexts, data subjects have enhanced rights to understand and challenge algorithmic processes affecting them. The Austrian ruling reinforces that data subjects can demand detailed explanations of automated scoring systems, particularly when these systems influence third-party decisions about contracts, services, or opportunities.