Backlash as ministerial aide says AI companies will never compensate creators amid Labour copyright row

A UK ministerial aide said big AI firms will never have to pay creators for training on their work, then deleted the posts. As rules are drafted, artists push for consent, pay, and enforcement.

Published on: Sep 25, 2025

AI Training, Copyright, and the UK: What Creatives Need to Know Now

A senior ministerial aide has argued that big AI firms will "never legally have to" pay creatives for training on their work. That claim, later deleted, has rattled musicians, artists, writers, and the wider creative economy.

The comment came from Kirsty Innes, now a special adviser to Liz Kendall, the secretary of state for science, innovation and technology. Her view lands in the middle of a government process that could shape how your work is used by AI, paid or unpaid, for years.

What was said

In February, months before becoming a ministerial adviser, Innes posted that regardless of beliefs about fairness, major AI companies "in practice will never legally have to" compensate content creators for training data. She also suggested the training could continue abroad "whatever our laws say," calling it "a bitter pill to swallow." The posts were later deleted.

Innes previously worked at the Tony Blair Institute (TBI), which has received large donations from Oracle founder Larry Ellison. Oracle backs the multi-hundred-billion-dollar Stargate initiative to build AI infrastructure alongside OpenAI and SoftBank, intensifying scrutiny over perceived alignment with big tech interests.

Why it matters

The UK government has been consulting on copyright and AI training. Earlier proposals floated an opt-out model letting AI firms use protected work by default unless rights holders say no. After pushback, the government said opt-out is no longer its preferred path and formed working groups with representatives from creative and AI sectors.

Prominent British artists, including Mick Jagger, Kate Bush, and Paul McCartney, have urged the prime minister to protect creators' human rights and their work. Some publishers have struck licensing deals with AI firms, showing there is a market for paid access when terms are clear.

Tension points you should track

  • Default vs. consent: Will use of your work require opt-in, opt-out, or something new?
  • Enforcement: Even with UK rules, training can happen offshore. Remedies must be practical, not just theoretical.
  • Licensing vs. scraping: Deals show a path to payment. Unlicensed training remains a flashpoint.
  • Conflicts and influence: Funding links to major tech and mega-projects raise questions about policy direction.

Reactions from the creative side

Campaigners warn that advice from those who echo big tech talking points risks sidelining public concern and creators' rights. They want advisers who reflect the majority view: people are anxious about unchecked AI use and concentrated power in tech.

Parliamentary voices have framed copyright as a human right. Any policy that weakens that right could face fierce resistance from the creative sector.

What you can do right now

  • Protect your catalog
    • Register your work with relevant collection societies and keep registrations up to date.
    • Use clear copyright notices and assert "no training" terms on your site and in licensing agreements.
  • Reduce unlicensed ingestion (imperfect but useful)
    • Block known AI crawlers via robots.txt (e.g., GPTBot) and set meta tags where platforms support it.
    • Add provenance to your media using open standards like C2PA so claims about origin are easier to verify.
  • Negotiate AI-specific terms
    • Include clauses: no ML training without written permission, dataset transparency, per-use fees, audit rights, and takedown/deletion obligations.
    • For collaborators and clients, specify allowed AI tools, outputs, and licensing scope.
  • Engage in the policy process
    • Respond to government consultations and follow the working groups shaping rules on consent, compensation, and enforcement.
  • Explore paid routes instead of passive exposure
    • Consider direct licensing arrangements where terms and compensation are explicit.
    • Use platforms that respect opt-outs or offer revenue sharing for AI training or synthesis.
  • Skill up for leverage
    • Understand AI workflows so you can set boundaries, price licensing, and spot misuse.
    • If you work with visual or audio synthesis tools, map the tool's data policy before adoption.
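The crawler-blocking step above can be sketched with Python's standard library. GPTBot (OpenAI) and CCBot (Common Crawl) are publicly documented crawler user agents, but the exact directives and the example.com URL here are illustrative; robots.txt compliance is also voluntary on the crawler's side, so treat this as one layer of protection, not a guarantee.

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt that blocks two documented AI crawlers while
# leaving the site open to everything else. Adjust to your own policy.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
"""

def is_allowed(agent: str, url: str, robots_txt: str = ROBOTS_TXT) -> bool:
    """Return True if `agent` may fetch `url` under the given robots.txt."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

print(is_allowed("GPTBot", "https://example.com/portfolio/"))       # False
print(is_allowed("SomeBrowser", "https://example.com/portfolio/"))  # True
```

Checking your live file the same way (point `RobotFileParser.set_url` at `https://yoursite/robots.txt` and call `read()`) lets you verify the rules you think you deployed are the rules crawlers actually see.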

What to watch next

Government working groups will define how consent, compensation, and enforcement could work. Expect heavy lobbying from both sides, proposals for collective licensing, and experiments in dataset disclosure.

Bottom line: Do not wait for a perfect policy. Lock down your rights, set your terms, and get ready to license on your conditions. If firms want your work, they should pay for it, and your contracts should make that unavoidable.