

GDPR Article 22

The GDPR provision restricting solely-automated decisions that produce legal or similarly significant effects on a person, with limited exceptions and rights of intervention.

What it is

Article 22 of the General Data Protection Regulation (GDPR) gives data subjects the right not to be subject to a decision based solely on automated processing, including profiling, that produces legal effects or similarly significantly affects them. There are limited exceptions: necessity for a contract, authorisation by EU or member state law, or explicit consent. Even when an exception applies, the data subject retains the right to obtain human intervention, express their point of view, and contest the decision. Article 22 is one of the most operationally specific provisions in GDPR — and the one most directly affected by the rise of AI in enterprise decision systems.

Why it matters

AI tools that draft, score, recommend, or rank are not Article 22 decisions on their own; they become Article 22 decisions when no human meaningfully reviews the output before it produces legal or similarly significant effects on a person. Internal AI used for hiring, credit decisions, performance reviews, fraud detection, or risk scoring sits closest to the line. Vendors selling AI into regulated industries face a recurring procurement question: does your tool produce Article 22 decisions, and if so, what is the human-in-the-loop architecture? The answer determines whether the buyer's compliance team can deploy the tool at all.

How Norrsent handles it

Norrsent Copilot does not make Article 22 decisions. Every output — proposed risk, suggested control, drafted disclosure, audit response, capital decision recommendation — is routed to a named human for review and approval before it writes to the record or leaves the organisation. There are no autonomous writes, no auto-applied recommendations, no scoring that bypasses review. The architecture is documented and inspectable; the audit trail records every approval.
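The approval gate described above can be sketched in a few lines. This is an illustrative model only, assuming hypothetical names (`PendingOutput`, `AuditTrail`, `commit`); it is not Norrsent's actual API. The point it demonstrates is structural: nothing writes to the record without a named human's approval, and every approval is logged.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PendingOutput:
    kind: str            # e.g. "drafted disclosure", "suggested control"
    payload: dict
    reviewer: str        # a named human, assigned before any write
    approved: bool = False

@dataclass
class AuditTrail:
    entries: list = field(default_factory=list)

    def record(self, output: "PendingOutput", decision: str) -> None:
        # Every approval is recorded with reviewer and timestamp.
        self.entries.append({
            "kind": output.kind,
            "reviewer": output.reviewer,
            "decision": decision,
            "at": datetime.now(timezone.utc).isoformat(),
        })

def commit(output: PendingOutput, trail: AuditTrail) -> dict:
    """Write to the record only after a named human has approved."""
    if not output.approved:
        raise PermissionError("no autonomous writes: human approval required")
    trail.record(output, "approved")
    return output.payload
```

In this sketch the unapproved path is not a silent skip but a hard failure, which is what makes "no autonomous writes" an architectural property rather than a policy statement.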


Common questions

Is profiling automatically Article 22?
No. Profiling is a separate concept (Article 4(4)) and is regulated more broadly. Profiling triggers Article 22 specifically when it leads to a solely-automated decision with legal or similarly significant effects.
Does 'human in the loop' satisfy Article 22?
Only if the human review is meaningful — not rubber-stamping. Regulators have signalled that approval workflows where the human always defers to the algorithm do not qualify as human intervention. The architecture must enable the human to override, contest, and adjust before the decision takes effect.
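One way to make "meaningful review" concrete in code is to give the reviewer explicit actions, including override, and to refuse any review that lacks a recorded rationale. A minimal sketch, assuming hypothetical names (`ReviewAction`, `apply_review`) rather than any real system's interface:

```python
from enum import Enum

class ReviewAction(Enum):
    APPROVE = "approve"
    OVERRIDE = "override"   # human substitutes their own outcome
    REJECT = "reject"

def apply_review(recommendation: dict, action: ReviewAction,
                 rationale: str, override_value=None) -> dict:
    """A decision takes effect only through an explicit human action.

    There is no default path that defers to the algorithm: every call
    names an action and records a rationale, and OVERRIDE lets the
    human replace the recommended outcome entirely.
    """
    if not rationale.strip():
        raise ValueError("meaningful review requires a recorded rationale")
    if action is ReviewAction.REJECT:
        return {"outcome": None, "rationale": rationale}
    if action is ReviewAction.OVERRIDE:
        return {"outcome": override_value, "rationale": rationale}
    return {"outcome": recommendation["outcome"], "rationale": rationale}
```

The design choice here is that approval is one action among several, not the path of least resistance: a workflow where the human can only click "approve" would fail the meaningful-intervention test this function is modelling.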
What's 'similarly significant effect'?
Less defined than 'legal effect', but case law and EDPB guidance treat it as decisions that materially affect a person's circumstances — denial of credit, a hiring decision, an insurance pricing change, a performance rating that affects compensation. Routine decisions (e.g., cookie consent classification) are typically below the threshold.