House of Lords committee urges UK to back licensing-first AI, mandate training transparency, and reject opt-out TDM to protect creators

UK Lords push a licensing-first approach to AI and reject a broad TDM opt-out. They want real transparency on training data and new rights over style, voice, and digital replicas.

Published on: Mar 11, 2026

UK Lords Committee to Government: Protect Creators in the AI Era

(The Committee is appointed by the House of Lords in each session to highlight to Parliament and the public areas of concern within the media, digital and creative industries.)

A new 85-page report urges the government to back a licensing-first approach to AI and rule out a broader text and data mining (TDM) exception with opt-out. It calls for statutory transparency about training data, new protections for digital replicas and "in the style of" uses, and support for technical standards that give creators real control.

Baroness Keeley put it plainly: "Our creative industries face a clear and present danger from uncredited and unremunerated use of copyrighted material to train AI models."

The headline message

UK copyright law is not broken. The Committee says the harm comes from unlicensed ingestion of protected works and a lack of transparency, not from gaps in the law. Training on protected content is likely reproduction and needs permission unless a statutory exception applies. The government should champion licensed AI development, not weaken copyright to chase speculative gains.

What the report covers

  • General introduction
  • AI, the creative industries and copyright reform
  • Transparency
  • Emerging technical solutions
  • Licensing

What this means for creators

  • Unlicensed training threatens income across photography, music, publishing, film, and software.
  • UK copyright is described as a global "gold standard" and already covers AI training as reproduction.
  • There's a gap: no clear protection for your voice, style or likeness if an AI imitates you without copying a specific work.
  • The Committee wants a licensing market that pays individual creators, not just big rightsholders.

Key recommendations at a glance

  • Rule out a new commercial TDM exception with opt-out. Provide immediate clarity to end uncertainty.
  • Introduce new rights over identity, style and digital replicas, including "in the style of" outputs.
  • Make training-data transparency a statutory obligation, with granular disclosures and proportionate rules for smaller firms.
  • Support a fair UK licensing market, working with CMOs to ensure revenue reaches individual creators.
  • Back open, interoperable technical standards for rights-reservation, provenance and labelling, and legislate if needed.
  • Prioritise sovereign, UK-governed AI models with strong transparency and built-in respect for copyright.

How the Committee reached its view

Seven oral sessions. Twenty-one witnesses from Google, Meta, Microsoft, OpenAI and leading creator bodies. Evidence from government ministers and academics. Plus 29 written submissions. The result: a well-informed push for licensing, transparency and creator control.

Practical steps you can take now

  • Audit your exposure: search for your works in popular datasets, models and platforms. Keep screenshots and dated records.
  • Use rights signals: apply "do-not-train"/rights-reservation metadata where available; implement robots.txt rules and platform-level controls for content access; consider content fingerprinting and watermarks.
  • Adopt provenance tools: embed signed metadata and C2PA-style content credentials to track origin and edits.
  • Get your paperwork right: update contracts to cover AI training, style imitation and synthetic replicas. Set clear licensing terms and rates.
  • Work with your CMO or professional body: ensure you're set up to receive AI-related royalties as deals emerge.
  • Price your style: if you're open to licensing training or "in the style of" usage, set boundaries and fees; if not, say so clearly in T&Cs.
  • Monitor platforms: report lookalike outputs that confuse audiences or trade on your identity; keep an evidence trail for takedowns or claims.
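For the rights-signals step above, one concrete check is whether your site's robots.txt actually disallows the crawlers that major AI vendors have publicly documented. A minimal sketch using Python's standard-library `urllib.robotparser` (the crawler names listed here are published user agents, but the list changes over time, so verify them against each vendor's current documentation):

```python
from urllib.robotparser import RobotFileParser

# User agents published by major AI crawlers. This list shifts over
# time; check each vendor's docs for the current names.
AI_CRAWLERS = ["GPTBot", "CCBot", "Google-Extended", "ClaudeBot"]

def blocked_crawlers(robots_txt: str, url: str = "https://example.com/") -> list[str]:
    """Return the AI crawlers that this robots.txt disallows for `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [ua for ua in AI_CRAWLERS if not parser.can_fetch(ua, url)]

# Example policy: block every listed AI crawler site-wide.
example = """
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Disallow: /
"""

print(blocked_crawlers(example))
# → ['GPTBot', 'CCBot', 'Google-Extended', 'ClaudeBot']
```

Note that robots.txt is a voluntary signal, not an enforcement mechanism; it documents your reservation of rights but does not technically prevent scraping, which is why the report pairs it with provenance metadata and statutory transparency.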

Timing and what's next

The government is due to publish its economic impact assessment and its copyright-and-AI report (required by the Data (Use and Access) Act 2025) by 19 March 2026. The Committee asks for immediate clarity: confirm now that there will be no new TDM exception with opt-out. It also wants a final policy decision within 12 months.

For now, ministers have signalled more consultation before a firm position. That uncertainty is slowing licensing and investment. The Committee's blueprint gives a clear route: licensing first, transparency backed by law, enforceable rights over identity, and technical standards that make compliance practical.

Why this approach is creator-first

Licensing pays the people who make the work. Transparency exposes who's using what, so deals can be done and enforced. Technical standards reduce friction, so small studios and independents can participate, not just the majors. And sovereign UK models trained responsibly could set a higher bar for the industry.

Want more context? See the UK Parliament's Communications and Digital Committee page, and learn about content provenance standards at C2PA.

Bottom line: keep creating-but protect your catalogue, set your terms, and be ready to license on fair ground. The policy winds are moving your way. Use this moment.

