EU Digital Omnibus: Legal Experts Warn of an "Unlimited Special Legal Zone" for AI
The European Commission's planned "digital omnibus" aims to cut red tape. A fresh legal analysis, however, argues it would punch holes in the GDPR and put hundreds of millions of consumers at risk. Commissioned by the Federation of German Consumer Organisations (vzbv), lawyers at Spirit Legal say the draft creates a structural shift away from GDPR's tech-neutral safeguards and toward AI-specific privileges that favor large platforms.
The focal point is Article 88c. With an extremely broad definition of "AI system," companies could relabel almost any automated processing as AI-related and sidestep stricter GDPR requirements. That change would swap neutral, principle-based rules for a technology carve-out that benefits service providers.
Article 88c: From Tech-Neutral GDPR to AI Privilege
The draft relaxes constraints for processing sensitive data, including health data and political opinions. Even worse, it signals that processing becomes more justified as datasets grow. That flips data minimization on its head and rewards mass data extraction as long as it is tied to training models.
Experts Peter Hense and David Wagner call this an "unlimited special legal zone." In practice, the biggest winners would be Big Tech firms with the scale and infrastructure to exploit broadly worded exceptions.
Recitals Can't Do the Heavy Lifting
Key protections are pushed into recitals instead of binding articles. Example: technical opt-out mechanisms for users who object to data use. Without hard obligations, supervisory authorities lack teeth to enforce them.
This gap matters most with web scraping. People whose data is collected often never learn about the practice, let alone their right to object. If opt-outs live in recitals, they're basically optional in the real world.
Proposed Guardrails for AI Training
The expert opinion recommends a distinct legal basis for AI training. Access to personal data should be allowed only if a company proves its objective cannot be met with anonymized or synthetic data. That evidence threshold would protect individuals and reinforce necessity and proportionality.
They also stress preventing data leakage from models. Personal information must not reappear in outputs. That means enforceable technical standards and controls during training, not just policy statements after deployment.
Protecting Minors and the Right to Say "Stop" at 18
Minors often cannot grasp the long-term implications of model training. The authors argue for explicit parental consent before children's data is processed for AI purposes. No gray areas, no implied consent through default settings.
Once an individual reaches majority, they should have an unconditional right to prohibit further use of their data in existing models. Without this, a generation risks losing digital sovereignty before adulthood.
Political and Economic Stakes
According to vzbv board member Ramona Pop, the draft is framed as innovation but functions like a free pass for US platforms. Big Tech would exploit legal gray zones while European businesses and consumers bear the risk. Clear obligations create legal certainty; broad exceptions trigger years of litigation.
Survey results commissioned by vzbv underscore the business case for strong privacy: 87% of consumers say trust is the foundation for using digital services, and over 60% are more likely to trust companies that visibly comply with European rules such as the GDPR. Diluting those standards would jeopardize both the adoption of new technologies and their market acceptance.
What Legal Teams Should Do Now
- Map all AI-related processing and classify where Article 88c could be invoked. Document risks of reclassifying standard automation as "AI."
- Require proof that anonymized or synthetic data cannot achieve the same goal before allowing personal data for training.
- Implement technical measures to prevent model memorization and personal data leakage; validate with red-teaming and output audits.
- Codify enforceable opt-out pathways in product code and contracts; do not rely on recitals alone.
- Prohibit scraping of personal data without a clear, lawful basis and user notice; monitor vendors for illicit collection.
- Set a minors policy: verifiable parental consent for training; auto-prompt and honor a "stop processing" right at age of majority.
- Build deletion/unlearning workflows now; ensure they can be executed without breaking model integrity or risking re-identification.
- Track the legislative process in Council and Parliament; plan for strict compliance scenarios to avoid whiplash later.
- Update vendor and data-sharing agreements: no onward training without necessity proof, leakage controls, and audit rights.
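The output-audit item above can be illustrated with a minimal sketch. The patterns and function names below are assumptions for illustration only; a production audit would use dedicated PII-detection tooling and far broader pattern coverage:

```python
import re

# Hypothetical minimal pattern set for an output audit.
# Real deployments should use a proper PII-detection library
# and cover names, addresses, IDs, and locale-specific formats.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def audit_output(text: str) -> dict:
    """Return any PII-like strings found in a model output."""
    findings = {}
    for label, pattern in PII_PATTERNS.items():
        matches = pattern.findall(text)
        if matches:
            findings[label] = matches
    return findings

# Example: scan a model response before it reaches the user or a log.
sample = "Contact Jane at jane.doe@example.org or +49 30 1234567."
print(audit_output(sample))
```

A check like this would run over red-teaming transcripts and sampled production outputs, flagging responses for review rather than silently dropping them, so auditors can trace leakage back to training data.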
Relevant Texts
General Data Protection Regulation (GDPR): EUR-Lex
The Digital Omnibus will be debated in the EU Council and Parliament next. Civil society is pushing back hard. The central question for legislators: will Europe keep a tech-neutral, enforceable privacy regime, or carve out AI exceptions that create more uncertainty and less trust?
This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.