UK's AI copyright clash: no-pay stance, OpenAI deals, and a policy rethink

UK creatives face uncertainty over whether AI companies will pay to train on their work as policy moves beyond the opt-out model. Take steps now: contract clauses, technical opt-outs, and separate fees for training rights.

Categorized in: AI News, Creatives
Published on: Sep 26, 2025

AI Training Without Pay? What UK Creatives Need to Know

A senior government adviser, Kirsty Innes, said AI companies will likely never be legally required to compensate artists, writers, or musicians for training data. Her now-deleted posts on X resurfaced in reporting, igniting debate over how the UK treats creative work in AI training. The timing matters: policy is still being written, and the stakes are your rights and your income.

Where UK Policy Stands

The government is consulting on whether AI firms should pay creators for training on copyrighted content. Labour's proposal has leaned toward an opt-out model, which many creatives say shifts the burden onto them and enables wide unlicensed use by default. Officials now signal that opt-out is no longer the preferred route, and working groups are reviewing alternatives. Expect an iterative process, not an overnight fix.

Publishers Are Cutting Deals

Major publishers are not waiting. The Guardian and Financial Times have licensed content to OpenAI for use in ChatGPT, locking in compensation and guardrails while policy catches up. This highlights a power gap: institutions can negotiate, while most individual creators cannot. It sets a market signal that content has a price, but it doesn't solve compensation for the broader creative class.

Global Pressure Is Building

Legal challenges are rising. Disney and other studios sued Chinese AI firm MiniMax in U.S. federal court over alleged infringement related to protected characters. The case underscores a bigger point: enforcement is messy across borders, and AI training practices face scrutiny worldwide. UK policymakers are under pressure to balance innovation with real protection for creative work.

What To Do Now (Practical Steps)

  • Update contracts and briefs: add explicit "no AI training/use without written permission" clauses for clients, agencies, and platforms.
  • Use technical opt-outs where available: robots.txt rules for known AI crawlers, X-Robots-Tag/noindex headers, and IPTC metadata on images to signal restrictions (see the robots.txt example after this list). None of this is foolproof, but it records your intent and builds evidence for later disputes.
  • Register your work where possible and keep dated originals (one simple way to document them is sketched after this list). Strong documentation increases your leverage in takedowns and disputes.
  • Price and package licenses for AI use separately from standard usage. Make "training rights" an add-on, not a freebie.
  • Join or form collectives to negotiate at scale. Collective bargaining can secure better terms than solo outreach.
  • Track model outputs that mimic your style. Save examples, timestamps, and prompts; this supports enforcement actions if needed.
  • Audit your portfolio on major platforms. Adjust settings, remove uploads you don't want ingested, or watermark strategically without hurting your brand.
  • Ask clients directly about AI use in the brief. If AI is part of their workflow, align on compensation and attribution up front.
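
On the technical opt-out step above, a minimal starting point is a robots.txt file that names known AI crawler tokens. GPTBot (OpenAI), Google-Extended (Google) and CCBot (Common Crawl) are documented tokens at the time of writing, but the list changes and compliance is voluntary, so treat this as a sketch and check each operator's crawler documentation rather than relying on it as a guarantee.

    # robots.txt at the root of your site: ask known AI training crawlers
    # not to fetch anything. Compliance is voluntary, but the file documents intent.
    User-agent: GPTBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: CCBot
    Disallow: /

On top of that, an X-Robots-Tag: noindex response header (set in your server or CDN configuration) keeps pages out of crawler indexes, but apply it selectively: it also removes those pages from ordinary search results. IPTC copyright fields on images can be embedded with most photo editors and metadata tools.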
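For the documentation step, here is a small, generic sketch in Python of one way to keep dated, verifiable records: hash every file in a portfolio folder and write a timestamped manifest you can archive alongside your originals. It is an illustration, not an official tool; the folder and file names are hypothetical.

    import hashlib
    import json
    from datetime import datetime, timezone
    from pathlib import Path

    PORTFOLIO_DIR = Path("portfolio")                 # hypothetical folder of dated originals
    MANIFEST_PATH = Path("portfolio_manifest.json")   # hypothetical output file

    def sha256_of(path: Path) -> str:
        # Hash the file in chunks so large images or videos don't need to fit in memory.
        digest = hashlib.sha256()
        with path.open("rb") as handle:
            for chunk in iter(lambda: handle.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    manifest = {
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "files": [
            {"file": str(p.relative_to(PORTFOLIO_DIR)), "sha256": sha256_of(p)}
            for p in sorted(PORTFOLIO_DIR.rglob("*"))
            if p.is_file()
        ],
    }

    MANIFEST_PATH.write_text(json.dumps(manifest, indent=2))
    print(f"Recorded {len(manifest['files'])} files in {MANIFEST_PATH}")

Re-running it whenever you publish new work gives you a dated record of what existed when, which supports takedowns and disputes.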

Signals to Watch

  • Outcomes from UK government working groups on AI and copyright.
  • New licensing deals between publishers and AI companies; these shape price norms.
  • Platform policy updates (training opt-outs, dataset disclosures, and model usage terms).
  • Major lawsuits and settlements that clarify what's allowed and what gets paid.

Want to build AI literacy without giving away your rights? Explore practical resources for creatives here: Courses by Job and AI Tools for Generative Art.

Further Reading