GDPR Article 22: automated decision-making explained

Article 22 of the GDPR gives individuals the right not to be subject to a decision based solely on automated processing that produces legal or similarly significant effects — with three narrow exceptions.

Last updated April 27, 2026 · Every fact traceable to a public source

GDPR Article 22 prohibits automated decisions that produce legal or similarly significant effects on individuals — unless the decision is necessary for a contract, authorized by Union or Member State law, or based on explicit consent. The CJEU’s SCHUFA ruling (C-634/21, 7 December 2023) confirmed that even credit-score generation can count as an Article 22 decision when third parties rely heavily on the score.

What does "solely automated" mean?

A decision is "solely automated" when no meaningful human review takes place — a rubber-stamp human approving every algorithmic output does not count as human involvement. The Article 29 Working Party's guidelines on automated individual decision-making (WP251rev.01, endorsed by the EDPB) clarify that human involvement must be carried out by someone with the authority and competence to change the decision.

What counts as a "legal or similarly significant" effect?

Examples include automatic refusal of a loan or insurance application, automated CV screening or e-recruitment that rejects candidates without human intervention, and credit-score thresholds that gate access to housing or services. The SCHUFA ruling brought credit-score generation itself into scope when the score is decisive for the third party's decision.

What are the three exceptions?

Article 22(2) lists three: (a) the decision is necessary for entering into or performing a contract between the data subject and a controller, (b) the decision is authorized by Union or Member State law that lays down suitable safeguards, or (c) it is based on the data subject’s explicit consent. Even when an exception applies, the controller must implement suitable measures including the right to human intervention, the right to express a point of view, and the right to contest the decision.

How does Article 22 interact with the EU AI Act?

They are complementary, not duplicative. The EU AI Act regulates the AI system; Article 22 regulates the use of automated processing to make a decision about a person. A vendor or deployer must satisfy both — EU AI Act conformity for the system, plus Article 22 safeguards for the decision. Many high-risk AI uses listed in EU AI Act Annex III (employment, creditworthiness, public benefits) overlap squarely with Article 22.

What documentation should I keep?

Keep a record of processing under Article 30; an Article 35 DPIA where the processing is high-risk; the safeguards in place under Article 22(3) (the right to human intervention, to express a view, and to contest); and the explicit-consent records or contract-necessity analysis if you rely on either exception. AI governance platforms typically generate this documentation as part of the impact-assessment workflow.
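The documentation items above can be modelled as a simple checklist structure. This is a minimal sketch under invented names — the field names and `gaps` helper are illustrative, not drawn from the GDPR or any governance tool.

```python
from dataclasses import dataclass


@dataclass
class Article22Dossier:
    """Hypothetical documentation checklist for an Article 22 decision."""
    ropa_entry: bool = False             # Article 30 record of processing
    dpia_on_file: bool = False           # Article 35 DPIA (high-risk processing)
    safeguards_documented: bool = False  # Art. 22(3): intervention, view, contest
    basis_evidence: bool = False         # consent records or contract-necessity analysis

    def gaps(self) -> list[str]:
        """Names of documentation items still missing."""
        labels = {
            "ropa_entry": "Article 30 record of processing",
            "dpia_on_file": "Article 35 DPIA",
            "safeguards_documented": "Article 22(3) safeguards",
            "basis_evidence": "legal-basis evidence",
        }
        return [label for attr, label in labels.items() if not getattr(self, attr)]
```

A dossier with only the record of processing and DPIA on file would report the remaining two items as gaps, which is the kind of completeness check an impact-assessment workflow could automate.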

Editorial independence

This FAQ is editorial. No vendor can pay to be highlighted or ranked in answers, and the written commentary on this page is payment-free. Featured slots in directory listings are always labeled where they appear. Read our methodology for details.