AI Companies Should

Josef Drexl proposes a flat AI levy to pay authors, reporters, and creators, tackling displacement without direct copying. He also urges a citizen fund for quality reporting.

Published on: Mar 06, 2026

AI Should Pay Back: A Flat-Rate Levy for the Press and Creatives

German legal scholar Josef Drexl proposes a simple, blunt fix: require AI providers and operators to pay a levy that goes to authors, journalists, and creatives. His point is straightforward: high-quality models depend on high-quality, human-made works. If the pipeline of fresh human output dries up, models hallucinate more and start to fail.

This isn't just a technical concern. It's a social one. Democracies rely on journalism to inform, investigate, and hold power accountable. Cultural creators do similar work for our shared values and identity. Generative AI can imitate style and pattern, but it can't replace those civic and cultural functions.

Why a levy now

Copyright law doesn't bite if a model doesn't copy in a way the law recognizes. And yet AI content competes directly with human work. Drexl argues that "AI displaces human-created works from the market without copying them." That gap, displacement without direct infringement, is where current law falls short.

His answer is a flat-rate levy on AI services. Companies could live with it because they rely on human-created training data to build valuable systems. No high-quality human works, no high-quality models.

How a flat-rate levy would work

  • Applies to providers and operators of generative AI services.
  • Collected centrally and distributed to authors, journalists, and creatives.
  • Proof of specific use isn't required, which dissolves the "did you train on my work?" gridlock.
  • Even creators whose works weren't used still benefit, because they're directly exposed to AI-driven market displacement.

This approach cuts out lengthy negotiations and evidence fights that stall licensing discussions. It also aligns incentives: AI growth funds the human creativity it depends on.

Why copyright alone falls short

Existing copyright frameworks struggle with model training and outputs that don't neatly qualify as copies. Recent disputes, such as the GEMA vs. OpenAI conflict, show the legal friction but don't resolve it at scale. The levy sidesteps that by focusing on economic reality: AI competes with human work and should help pay for the inputs and the social infrastructure it draws from.

GEMA and other rights organizations will remain important, but Drexl's view is that case-by-case enforcement won't fix market-level displacement.

A second pillar: a citizen levy for journalism

Drexl also sketches a public funding stream for journalism, similar to a broadcasting contribution. Citizens would allocate funds annually to eligible publishers who meet clear quality standards. A minimum share of publisher costs would need to go to journalist pay, creating a counter-incentive against replacing newsroom labor with generative AI.
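The allocation logic just described can be sketched in a few lines: each citizen directs a fixed contribution to a publisher, and only publishers that meet a minimum journalist-pay share qualify for payout. The 50% threshold, the per-citizen amount, and all publisher figures are hypothetical assumptions, not part of the proposal.

```python
# Illustrative sketch of a citizen journalism fund with a pay-share standard.
# The 0.5 threshold and all figures are hypothetical assumptions.

MIN_PAY_SHARE = 0.5  # assumed minimum share of publisher costs going to journalist pay

def eligible(publisher: dict) -> bool:
    """A publisher qualifies only if enough of its costs are journalist pay."""
    return publisher["journalist_pay"] / publisher["total_costs"] >= MIN_PAY_SHARE

def allocate_fund(citizen_choices: list[str], contribution: float,
                  publishers: dict[str, dict]) -> dict[str, float]:
    """Sum each citizen's fixed contribution toward their chosen publisher,
    paying out only to publishers that meet the pay-share standard."""
    totals: dict[str, float] = {}
    for choice in citizen_choices:
        if eligible(publishers[choice]):
            totals[choice] = totals.get(choice, 0.0) + contribution
    return totals

publishers = {
    "local_daily": {"journalist_pay": 600_000, "total_costs": 1_000_000},
    "content_farm": {"journalist_pay": 50_000, "total_costs": 1_000_000},
}
result = allocate_fund(["local_daily", "local_daily", "content_farm"], 20.0, publishers)
print(result)  # {'local_daily': 40.0} -- the low-pay publisher gets nothing
```

The eligibility gate is what turns the fund into a counter-incentive: replacing newsroom labor with generative AI pushes a publisher's pay share below the threshold and cuts off its funding.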

Think of it as stabilizing the supply of public-interest reporting (investigations, local beats, and specialized coverage) where ad-driven models and AI summarization tend to hollow things out.

What this means for creatives and legal teams

  • Economic recognition: Your work is not just "content"; it's infrastructure for AI systems. A levy formalizes that.
  • Less friction: Distribution without proof-of-use avoids endless detection, takedown, and litigation cycles.
  • Broader coverage: Creators affected by market displacement benefit, even if their specific works weren't in training sets.
  • Journalism safeguard: A citizen levy ties funding to standards and fair pay, protecting reporting from being hollowed out.

Open questions

  • Scope: Which AI services pay? Foundation model providers, app-layer tools, cloud inference?
  • Rates and tiers: Flat fee, usage-based, or revenue-based? Global vs. national frameworks?
  • Distribution: How to weight payouts across journalists, authors, visual artists, musicians, and independent creators?
  • Quality standards: Who sets them for publishers? How are audits run without burying small outlets?

Next steps for stakeholders

  • Publishers and newsrooms: Document AI displacement effects, propose quality and pay metrics, and prepare for fund allocation mechanisms.
  • Creators and guilds: Build cross-sector coalitions to define fair distribution rules and representation on collection bodies.
  • AI providers: Model levy scenarios, map data dependencies, and co-design practical reporting that doesn't expose trade secrets.
  • Policymakers: Start with pilot programs, simple rates, and transparent audits. Iterate with clear sunset and review periods.

The bigger picture

This idea isn't brand-new, but execution has lagged. The market moved first; policy is catching up. A flat-rate levy plus a citizen-backed journalism fund is a pragmatic middle path: less courtroom theater, more steady funding for the human work society actually needs.

Learn more about Drexl's institution, the Max Planck Institute for Innovation and Competition.

