WGGB backs Lords report on AI and copyright, urges transparency, fair pay and a right to human review

A Lords report on AI and copyright backs creators' rights, real transparency, and fair, paid licensing. The Government must now act or risk more disputes over data, consent and credit.

Categorized in: AI News, Creatives, Writers
Published on: Mar 07, 2026

AI, copyright and the creative industries: what the Lords report means for writers and creators

Published on: Friday March 6, 2026

The House of Lords Communications and Digital Committee has released its report on AI, copyright and the creative industries. WGGB welcomed the findings and urged Government not to overlook the creative sector as an economic "powerhouse" in the rush to embrace generative AI.

At a high level, the message is simple: protect creators' rights, make AI companies transparent, and ensure licensing models actually pay the people who make the work. If you write, design, compose or build IP for a living, this directly affects your income, bargaining power and credit.

Key points WGGB is backing

  • The UK's copyright system is not outdated and does not need reform for AI's sake.
  • Government should rule out any exception that allows commercial text and data mining with an "opt-out" mechanism.
  • AI developers need a clear, statutory transparency framework (what data they train on, how, and for what).
  • Any UK licensing marketplace must be sustainable and include mechanisms that ensure remuneration reaches individual creators.
  • Remuneration through licensing must be fair and equitable.
  • Establish an independent regulatory body to monitor and govern AI expansion.
  • Guarantee a right to human review wherever AI is involved in decision-making.
  • Government should state its position on copyright and AI as soon as possible to reassure creators and the wider industry.

Why this matters to your income

Without strong copyright and licensing, your work can be ingested into training sets without consent or pay. Transparency gives you leverage: if you know how and where your work is used, you can negotiate or say no.

A functioning licensing marketplace is how money flows back to individuals, not just platforms and aggregators. Miss that, and creators carry the cost while others capture the value.

Practical steps to protect your work now

  • Use AI clauses in your contracts. Require explicit consent for any AI training on your work, clear crediting, usage boundaries, and meaningful remuneration. Include a right to human review where automated systems affect hiring, commissioning, or moderation decisions.
  • Keep clean records. Store dated drafts, final files, and publication logs. Good provenance helps you assert rights and prove use.
  • Set platform and site signals. If you publish online, configure robots.txt to block AI crawlers you don't consent to, and review any available opt-out signals offered by major models and platforms.
  • Prepare for licensing. Track where your work is distributed, note contributors and rights splits, and reference AI-related usage on invoices and deal memos.
  • Join relevant collecting bodies where applicable (e.g., authors, visual artists, composers) to strengthen rights management and potential secondary payments.
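The crawler-blocking step above can look like this in practice. The user agents below are ones major AI operators have published (OpenAI's GPTBot, Anthropic's ClaudeBot, Common Crawl's CCBot, and Google-Extended for Google's AI training), but crawler names change, blocking is advisory rather than enforceable, and you should check each operator's own documentation before relying on it:

```
# robots.txt sketch — block known AI training crawlers, allow everything else
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
```

Place the file at the root of your site (e.g. `https://example.com/robots.txt`); well-behaved crawlers check it before fetching pages.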
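The record-keeping step above can be sketched in code. This is a minimal illustration, not anything WGGB or the Lords report prescribes: it fingerprints a draft with SHA-256 and appends a dated entry to a log file, so you can later show that a specific version of your work existed at a specific time. The file paths and log name are hypothetical.

```python
# Minimal provenance log sketch (illustrative, not an official WGGB tool):
# hash each draft and record it with a UTC timestamp in a JSON Lines file.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def log_provenance(work_path: str, log_path: str = "provenance.jsonl") -> dict:
    """Append a dated SHA-256 fingerprint of a file to a JSON Lines log."""
    data = Path(work_path).read_bytes()
    entry = {
        "file": Path(work_path).name,
        "sha256": hashlib.sha256(data).hexdigest(),
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

Because the log is append-only plain text, it is easy to keep alongside your dated drafts and to show to a publisher or collecting body if a use of your work is ever contested.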

What to watch next

The ball is now in the Government's court. Look for concrete movement on a statutory transparency framework, a fair licensing marketplace, and an independent regulator with teeth.

If those pieces land, creators keep control and get paid. If they don't, expect more disputes about training data, consent, and compensation.
