Back AI Training at Home to Secure Fair Pay, UK Creatives Urged

UK works fuel AI abroad with scant pay or control. Support a UK training framework with lawful sourcing, opt-outs, and fair fees to put leverage and recourse back at home.

Published on: Oct 03, 2025

UK creatives: why backing domestic AI training could be your best path to fair pay

UK copyright rules currently block most commercial AI training at home. That sounds protective, but it pushes training offshore and makes it harder for UK artists and publishers to get paid or opt out.

Dr Andres Guadamuz, a senior lecturer in intellectual property law at the University of Sussex, argues for a strategic shift: support a UK legal framework that allows AI training with clear opt-outs and compensation. His view is simple: join the activity, shape the terms, and get paid.

What's happening now

Works by UK rightsholders are already being used to train models abroad. With training occurring outside the UK, enforcement options are limited and slow, often requiring action under foreign law.

Many UK authors cannot benefit from settlements like the recent Anthropic agreement, because eligibility requires a US copyright registration. That leaves a growing gap between usage and compensation.

Why moving training into the UK helps

Keeping training offshore means fewer levers for UK creatives to pull. A domestic regime with lawful access, opt-outs, and payment terms would put negotiations on home turf.

Court trends also matter. In US cases, judges have indicated that training on copyrighted material can be lawful when the data was acquired lawfully; the issue is often not training itself, but how the data was obtained. A UK framework that permits training while enforcing lawful sourcing, opt-outs, and remuneration would align with that direction and finally give UK authors practical recourse.

The jurisdiction problem, in plain terms

AI models do not respect borders. Even if UK law required permission for every work, developers could train elsewhere under looser rules and still ship models back into the UK.

Consider stock imagery. Litigation between Getty Images and Stability AI hinges partly on where model training occurred. If training happens in the US, UK creators face a tougher path to remedies at home.

What others are doing

  • EU: An opt-out regime exists under the Digital Single Market Directive, and there have been no clear negative shocks to creative industries. See the directive text on EUR-Lex.
  • Japan: Copyrighted works can be used for AI training so long as the use does not unreasonably prejudice rightsholders' interests, a rule that has attracted research while the creative sector remains healthy. Overview from Japan's Agency for Cultural Affairs: bunka.go.jp.

What UK legislators are weighing

AI firms have lobbied for a full text-and-data-mining exception covering any UK work unless creators opt out. They argue that opt-outs make it hard to identify usable content, and that transparency mandates would chill deployment.

Artists push back: a default right to scrape removes the need for companies to negotiate initial access, weakening control and pricing power. The core ask from creators is simple: consent by default or, at minimum, an easy opt-out with clear reporting and fair pay.

In June, Parliament passed a data bill that lets AI models be trained on copyrighted material without rightsholders' knowledge, after supporters argued that disclosure requirements would deter development and expose proprietary sources. Similar arguments have surfaced in the US, where President Donald Trump called it impractical to get permission from every artist whose work is scraped.

The risk of tightening without a plan

If the UK tightens rules without offering a viable training pathway, companies will train models elsewhere, likely still on UK content, while British tech competitiveness suffers. Domestic firms cannot match the scale of the US or China if their training options are capped.

Ironically, that leaves UK creatives with less leverage and fewer payment routes. Allowing training domestically under enforceable terms flips the leverage back.

Practical steps for creatives and legal teams

  • Lobby for a UK framework that pairs access with control: lawful sourcing, a simple opt-out mechanism, standardized reporting, collective licensing options, and clear remuneration pathways.
  • Push for an official opt-out registry: a centralized, machine-readable system that model developers must honor, with penalties for non-compliance.
  • Document and register your works: maintain verifiable records and metadata. Consult counsel on whether registering in the US could help eligibility in foreign disputes or settlements.
  • Use technical opt-outs where available: robots.txt, metadata signals, and platform-level controls. They are imperfect, but they build a record of intent; see the example after this list.
  • Favor lawful data partners: work with platforms that respect opt-outs and offer per-use licensing; Microsoft and Cloudflare are exploring such models.
  • Join collective action: coordinate via trade bodies to negotiate sector-wide terms, rates, and audit rights.
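
To make the technical opt-outs above concrete, here is a minimal sketch of what crawler-level and page-level signals look like. The user-agent tokens below (GPTBot, ClaudeBot, CCBot, Google-Extended) are published by the respective companies at the time of writing, and the meta tag follows the W3C TDM Reservation Protocol (TDMRep); all of these are honored voluntarily, so treat this as an illustration rather than an enforcement mechanism, and verify the tokens against each company's current documentation.

```
# robots.txt: crawler-level opt-out signals (all voluntary).

User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: ClaudeBot         # Anthropic's crawler
Disallow: /

User-agent: CCBot             # Common Crawl, widely used for training corpora
Disallow: /

User-agent: Google-Extended   # Controls use of content for Google AI training
Disallow: /
```

```html
<!-- Page-level reservation under the W3C TDM Reservation Protocol (TDMRep).
     content="1" reserves text-and-data-mining rights for this page. -->
<meta name="tdm-reservation" content="1">
```

Because compliance is voluntary, the main value today is evidentiary: a dated, published signal strengthens the record of intent noted in the list above.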

Bottom line

UK works are already fueling AI abroad, with limited paths to payment. Bringing training home, under clear opt-outs, lawful sourcing, and enforceable compensation, gives UK creatives leverage, transparency, and a seat at the table.
