Article 19 of the EU AI Act: Log Retention for Providers
Article 19 sets the provider-side log retention duty for high-risk AI systems — at least six months, counted per log event, with carve-outs for financial institutions and overlaps with data-protection law. Its deployer-side twin sits in Article 26(6). This explainer covers both and the line between them.
What Article 19 requires
Providers of high-risk AI systems must keep the logs referred to in Article 12(1), automatically generated by their systems, to the extent those logs are under their control. The logs must be kept for a period appropriate to the intended purpose and of at least six months, unless provided otherwise in applicable Union or national law, in particular in Union law on the protection of personal data.
The three scope-defining tests
- "Logs referred to in Article 12(1), automatically generated" — the system's statutory record-keeping capability under Article 12. Business-logic metadata, application-level telemetry, and support-team annotations live outside Article 12 and sit under other retention rules.
- "Under their control" — logs that never reach the provider's systems are not the provider's to retain. Deployer-resident logs sit under the deployer duty in Article 26(6), not under Art 19.
- "At least six months" — the floor. Longer retention from other legal bases (product liability, GDPR lawful basis, contractual obligation) can raise it; a mandatory shorter retention in specific Union or national law can cap it.
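Taken together, the three tests act as a filter over a provider's log inventory. A minimal sketch in Python — the stream names and field names are hypothetical illustrations, not terms from the Act:

```python
from dataclasses import dataclass

@dataclass
class LogStream:
    name: str
    article12_generated: bool     # produced by the Art 12 record-keeping capability
    under_provider_control: bool  # reaches storage the provider operates

def article19_applies(stream: LogStream) -> bool:
    """Art 19 retention covers Article 12 logs under the provider's control."""
    return stream.article12_generated and stream.under_provider_control

streams = [
    LogStream("inference-events", True, True),    # in scope: retain >= 6 months
    LogStream("app-telemetry", False, True),      # out: not Article 12 logs
    LogStream("on-prem-audit-log", True, False),  # out: deployer duty, Art 26(6)
]
in_scope = [s.name for s in streams if article19_applies(s)]
```

The third test (the six-month floor) then applies only to the streams that survive the first two.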
The duty is about keeping the logs, not generating them. The generation side of the duty sits in the Art 12 record-keeping capability that every high-risk AI system must technically enable. Art 19 is the retention companion.
Article 19 vs Article 26(6): provider and deployer, not duplicates
Art 19 is the provider-side retention duty. Article 26(6) is the deployer-side retention duty [src]. Both require at least six months. They cover different log populations: the provider retains logs that flow back into systems the provider operates (telemetry uploads, diagnostic captures, incident reports), while the deployer retains logs generated and stored in the deployer's own environment.
| Dimension | Article 19 (provider) | Article 26(6) (deployer) |
|---|---|---|
| Who | Provider of high-risk AI system | Deployer of high-risk AI system |
| Which logs | Under the provider's control | Automatically generated by the system, under deployer's control |
| Minimum retention | Six months | Six months |
| Financial-institution carve-out | Yes — integrate into sectoral documentation | Yes — integrate into sectoral documentation |
For SaaS architectures the operative question is "whose storage does this log sit in?" If a log lands in both the provider's and the deployer's storage, each side retains its own copy. In practice the two streams rarely overlap, because high-risk AI systems are typically architected so that a given log stream lives in one environment or the other.
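The storage-location question reduces to a small routing rule. A sketch under the assumption that each Article 12 log stream records where its copies land (function and label names are hypothetical):

```python
def retention_duties(stored_by_provider: bool, stored_by_deployer: bool) -> set[str]:
    """Who must retain a given Article 12 log stream, by where copies are stored."""
    duties = set()
    if stored_by_provider:
        duties.add("provider (Art 19)")
    if stored_by_deployer:
        duties.add("deployer (Art 26(6))")
    return duties

# Telemetry uploaded to the provider's cloud and also kept locally by the deployer:
both = retention_duties(stored_by_provider=True, stored_by_deployer=True)
# Logs that never leave the deployer's environment:
deployer_only = retention_duties(stored_by_provider=False, stored_by_deployer=True)
```

The point of the sketch: duties attach per copy, not per stream, so a shared log can put both parties under a six-month floor for their respective copies.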
The data-protection overlay
Logs typically contain personal data: user identifiers, input content, inference outputs. Article 19 carves out applicable Union or national law, in particular Union law on the protection of personal data. In practice this means GDPR retention rules and Art 19 coexist; the six-month floor applies unless a specific Union or national rule sets a shorter mandatory period.
The "GDPR minimisation means short retention" reasoning is wrong
Data minimisation (GDPR Article 5(1)(c)) constrains how much data you collect, not how long you keep it; retention is governed by storage limitation (Article 5(1)(e)), which tolerates keeping data for as long as a legal obligation requires. If Art 19's floor is six months and no specific rule shortens it, GDPR minimisation does not override the AI Act. Document the interaction per log stream and apply the longest period any applicable legal basis requires.
The financial-institution carve-out
Art 19(2) creates a specific carve-out for providers that are financial institutions subject to Union financial-services governance requirements. These providers maintain the AI-Act logs "as part of the documentation kept under the relevant financial services law" — the sectoral retention system subsumes the AI-Act duty rather than running in parallel.
This is meant to prevent parallel retention systems that duplicate cost and drift. If you are in scope, integrate the AI-Act logs into your existing sectoral documentation; do not build a second bucket.
When does Article 19 apply?
Article 19 applies from 2 August 2026 together with the rest of the high-risk regime [src]. If you will have a high-risk AI system on the EU market on or after that date, the retention configuration must be in place — not documented in a runbook but actually enforced by the storage layer.
For systems placed on the market before 2 August 2026, the retention duty attaches from the general-application date onward. Logs generated before that date do not need to be retroactively retained to the six-month floor, but every event logged from the compliance date on is subject to it.
Penalties for inadequate retention
Failing to meet Article 19 is a provider-obligation breach and falls under the operator-obligations fine tier: up to EUR 15 million or 3% of worldwide annual turnover, whichever is higher [src]. SMEs and start-ups benefit from the inverted SME cap — the lower of the absolute figure or the percentage applies.
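The tier arithmetic is simple enough to sketch: whichever is higher in the standard case, whichever is lower under the SME cap (function name hypothetical):

```python
def art99_fine_cap(turnover_eur: float, is_sme: bool) -> float:
    """Operator-obligation fine tier: EUR 15M or 3% of worldwide annual
    turnover. Standard operators: whichever is higher. SMEs and
    start-ups: whichever is lower (the inverted cap)."""
    absolute, percentage = 15_000_000.0, 0.03 * turnover_eur
    return min(absolute, percentage) if is_sme else max(absolute, percentage)

print(art99_fine_cap(2_000_000_000, is_sme=False))  # 60000000.0 (3% exceeds 15M)
print(art99_fine_cap(100_000_000, is_sme=True))     # 3000000.0 (3% below 15M)
```

These are statutory maximums, not fixed penalties; the actual amount is set by the national authority within the cap.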
The practical enforcement route is a reasoned request from the market surveillance authority for logs covering a specific incident window. If the logs were aged out in 30 days by a cost-driven observability policy, the breach is self-evident; no investigation needed.
Check whether your AI stack triggers Article 19
Scan your AI stack to see whether you are a provider of a high-risk AI system — and if so, see the full provider obligation set including log retention. Free, no signup.
This article explains Article 19 of the EU AI Act (Regulation 2024/1689). It is not legal advice. The "under provider control" test varies by architecture, and the interaction between Art 19 and GDPR retention rules per stream requires case-by-case analysis. Consult qualified counsel for formal compliance assessment.