UK Tech Adviser: AI Firms Will Never Pay Creators as Labour Prioritizes AI Growth

A UK adviser said AI firms will never have to pay creatives for training data, then deleted the posts. Policy favours growth, pushing creators to fight for licensing and pay.

Published on: Sep 26, 2025

UK Tech Secretary's Adviser: "AI Companies Will Never Have to Compensate Creatives"

Kirsty Innes, a special adviser to the UK's Technology Secretary, said AI companies "will never legally have to" compensate rights holders for training data. She posted the comments on X in February, then deleted them. Innes also wrote that IP "scraping can continue to happen outside the UK, whatever our laws say," adding it "might be a bitter pill to swallow for some." She made the posts seven months before joining Liz Kendall's team at the Department for Science, Innovation and Technology.

Innes previously worked at the Tony Blair Institute, which has received large donations from Oracle's co-founder. Oracle has signed a major partnership with OpenAI and is involved in the Stargate AI infrastructure project. OpenAI has opposed UK proposals that would let creators opt out of model training.

Why this matters for creatives

These comments signal a policy mood: growth and AI deployment first, creator compensation later - if at all. If IP scraping keeps happening abroad, local protections may be easy to sidestep. That weakens bargaining power unless creators act collectively or secure licensing deals.

Where UK policy stands

The government is reviewing consultation results on how to protect rights holders while advancing AI. One proposal: allow AI training on online content by default unless creators opt out. Creative industry bodies reject this, saying it shifts the burden onto creators and bypasses consent.

Some policy voices claim that partial opt-outs would skew models. Nick Clegg, Meta's former president of global affairs, called seeking permission from every artist "implausible" at current scale, and warned that strict UK-only rules could "kill" the local AI industry. US President Donald Trump has also called consent at scale "impractical."

By May, Peter Kyle, then the Technology Secretary, was reportedly moving away from opt-out and leaning toward licensing agreements. The government wants to attract tech investment and is in close talks with US AI companies. In June, Parliament passed a bill allowing AI models to be trained on copyrighted material without rights holders' knowledge, and some MPs argued transparency rules would deter releases and reveal proprietary data sources.

What the creative sector is saying

More than 70 artists and organisations - including Sir Elton John, Kate Bush, Sir Mick Jagger, and Sir Paul McCartney - signed an open letter accusing the Labour government of failing to protect copyright from AI misuse. The AI Opportunities Action Plan put growth first and proposed building a "copyright-cleared British media asset training dataset," but it offered little on enforcement or compensation. Meanwhile, AI developers including Anthropic, Meta, Perplexity, Stability AI, Midjourney, OpenAI, and Microsoft face lawsuits from artists, newsrooms, and labels.

What big tech wants

Tech firms prefer broad access to data, minimal friction, and limited disclosure. They argue that content sorting is expensive, that permissions at scale are unrealistic, and that opt-outs harm model quality. The net effect: the default tilts toward training now, negotiating later.

What you can do now

  • Block known crawlers: add rules for AI bots (e.g., GPTBot) in robots.txt - a minimal sketch follows this list. See OpenAI's guidance: Block or allow GPTBot.
  • Mark your site: use meta tags like "noai"/"noimageai" where supported (example below the list), and watermark outputs where possible.
  • Register works and keep evidence: timestamps, originals, and version history strengthen your position in disputes; a small hashing sketch also follows this list.
  • License on your terms: use clear licenses, pricing for training vs. display, and API-only access for datasets.
  • Push for contract clauses: require clients to disclose AI use, forbid training on your work without a separate fee, and define liability for misuse.
  • Join a collective: guilds and rights bodies can negotiate framework licenses and handle enforcement more effectively than going solo.
  • Adopt provenance tools: attach content credentials that track edits and origin. Learn about the standard at the Content Authenticity Initiative.
  • Consider strategic exposure: share low-res or watermarked versions publicly; deliver full-res behind paywalls or contracts.
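
For the crawler-blocking point, here is a minimal robots.txt sketch. GPTBot is documented by OpenAI (see the guidance linked above); Google-Extended and CCBot are other widely published AI-crawler tokens, included here as examples - verify current token names in each vendor's documentation. Note that robots.txt is advisory: compliant crawlers honour it, others may not.

```
# Block OpenAI's crawler site-wide
User-agent: GPTBot
Disallow: /

# Google's AI-training token (does not affect normal search indexing)
User-agent: Google-Extended
Disallow: /

# Common Crawl, a frequent source of training datasets
User-agent: CCBot
Disallow: /
```

For marking pages, the "noai"/"noimageai" values are a community convention (popularised by DeviantArt), not a web standard, so support varies by crawler:

```
<!-- Community convention, not a standard; honoured only by some crawlers -->
<meta name="robots" content="noai, noimageai">
```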
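For the evidence point, a minimal Python sketch that fingerprints original files into a dated manifest; the "originals" folder and manifest filename are hypothetical, and you can swap in whatever layout you already use:

```python
# Sketch: build a simple evidence manifest for original works.
# The folder name and manifest format are illustrative, not a standard.
import hashlib
import json
import time
from pathlib import Path

def fingerprint(path: Path) -> dict:
    """Return a SHA-256 digest of one file plus the time it was recorded."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return {
        "file": path.name,
        "sha256": digest,
        "recorded_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }

if __name__ == "__main__":
    works = Path("originals")  # hypothetical folder of master files
    manifest = [fingerprint(p) for p in sorted(works.glob("*")) if p.is_file()]
    Path("evidence-manifest.json").write_text(json.dumps(manifest, indent=2))
```

A digest alone does not prove authorship, but a dated manifest of unchanged files makes it easier to show what existed, and when, once a dispute starts.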

What to watch next

  • Whether the UK pivots from opt-out to licensing mandates or safe harbors tied to compensation.
  • How courts treat training on copyrighted material without consent, and whether precedent pushes companies toward paid deals.
  • Whether government incentives link public funding or procurement to fair licensing with creators.
  • The growth of per-use marketplaces, as companies like Microsoft and Cloudflare explore models that pay creators per call.

Skill up and protect your leverage

Two tracks matter: legal posture and market value. Tighten your IP posture, then increase the value of what only you can produce - voice, style, and formats that build direct audience demand.

If you want structured training paths and tools that help creatives work with AI without giving away the store, explore our resources: Courses by job and AI tools for copywriting.

The takeaway

Policy is leaning toward growth, and enforcement is patchy across borders. Assume default access to your public content unless you act. Lock down what you can, license what you want, and price in training rights from now on.