UK Government Signals Shift on AI Copyright: Payment For Creators Back on the Table
The UK government is rethinking its stance on AI training and copyright. Technology Secretary Liz Kendall says the situation needs a "reset," and that transparency is central so photographers, writers, illustrators, and musicians can actually get paid when their work is used.
Under the earlier proposal, creators would have had to opt out to keep their work out of training datasets. Now the tone suggests movement toward compensation and clearer disclosure. An initial report is expected before the end of 2025, with a more detailed plan by March 2026.
Why This Matters If You Create For a Living
If your work ends up inside a model's training set, you could see new paths to payment. That likely hinges on transparency rules: who used what, when, and how.
Expect more focus on dataset disclosures, audits, and licensing frameworks. The government wants both the creative sector and AI companies to grow, which means the money conversation is coming to the surface.
The Money Question: What's a Fair Rate?
A US case involving Anthropic and a group of authors reached a $1.5 billion settlement, working out to roughly $3,000 per book included in training data. It's a settlement, not a court ruling, but it gives negotiators and policymakers a reference point.
How that might translate to images, tracks, articles, or design files isn't set. We could see per-work rates, tiered pricing by usage and scale, or blanket licenses via collecting societies.
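To put those numbers in perspective, here is a minimal sketch of the arithmetic, plus one hypothetical tiering scheme. Every rate and multiplier below is an illustrative placeholder, not a figure from the settlement or from any UK proposal.

```python
# Rough arithmetic around the reported Anthropic settlement figures,
# plus a purely hypothetical tiered rate card for other media.
# None of the per-work rates below come from any ruling or policy.

SETTLEMENT_TOTAL = 1_500_000_000   # reported settlement, USD
PER_BOOK = 3_000                   # reported approximate per-book figure, USD

implied_works = SETTLEMENT_TOTAL / PER_BOOK
print(f"Implied number of works covered: {implied_works:,.0f}")  # ~500,000

# Hypothetical tiered pricing by usage and scale (illustrative only).
HYPOTHETICAL_TIERS = {
    "research_only": 0.25,          # fraction of a base per-work rate
    "commercial_model": 1.0,
    "commercial_plus_resale": 2.0,
}

def quote(base_rate_usd: float, n_works: int, tier: str) -> float:
    """Return a quote for licensing n_works at a given tier multiplier."""
    return base_rate_usd * HYPOTHETICAL_TIERS[tier] * n_works

# Example: 300 images at a hypothetical $40 base rate for commercial training.
print(f"Example quote: ${quote(40.0, 300, 'commercial_model'):,.2f}")
```

The point isn't the specific numbers; it's that once disclosure exists, working out who owes what becomes a spreadsheet problem rather than a guessing game.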
What To Do Now: Practical Steps for Creatives
- Inventory your catalog: what you've made, where it lives, and where it's been licensed.
- Register what you can (copyright/ISBN/ISRC/etc.) and keep paperwork tight.
- Embed metadata and credits; consider C2PA/Content Credentials where supported (a minimal command-line sketch follows this list).
- Publish clear licensing terms on your site, including AI training permissions and rates.
- Create a simple rate card for AI uses (per work, per batch, per time period); a catalog-plus-rate-card sketch follows this list.
- Opt out on platforms that offer it if you don't want your work used, or set terms if you do.
- Join a trade body or collective that's pushing for fair licensing and can negotiate at scale.
- Track disclosures: if companies publish dataset sources, check for your work and document evidence (see the manifest-checking sketch after this list).
- Watermark or use detectable signals if it fits your medium and audience expectations.
- Add AI clauses to client contracts (what's allowed, what's not, who gets paid for training use).
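On the metadata point, one low-friction approach is to write creator and rights fields into your files before publishing. The sketch below shells out to exiftool (assumed to be installed); the tag choices are a reasonable starting point, but the exact values are yours, and full C2PA/Content Credentials signing needs dedicated tooling beyond this.

```python
# Stamp basic creator/rights metadata into image files using exiftool.
# Assumes exiftool is installed and on PATH; this is a starting point,
# not a substitute for C2PA/Content Credentials signing.
import subprocess
from pathlib import Path

USAGE_TERMS = "No AI/ML training without a written licence. Contact: you@example.com"

def stamp(image_path: Path, artist: str, year: int) -> None:
    """Write Artist, Copyright and XMP usage terms into one image file."""
    subprocess.run(
        [
            "exiftool",
            f"-Artist={artist}",
            f"-Copyright=(c) {year} {artist}. All rights reserved.",
            f"-XMP-xmpRights:UsageTerms={USAGE_TERMS}",
            "-overwrite_original",
            str(image_path),
        ],
        check=True,
    )

if __name__ == "__main__":
    for path in Path("portfolio").glob("*.jpg"):
        stamp(path, artist="Your Name", year=2025)
```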
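For the inventory and rate-card items, even a small structured file beats scattered notes. Below is a minimal Python sketch of a catalog with per-work AI-training rates; the fields and figures are placeholders to adapt, not recommended prices.

```python
# A minimal catalog-plus-rate-card structure for tracking works and the
# terms you'd offer for AI training use. All figures are placeholders.
from dataclasses import dataclass, asdict
import json

@dataclass
class Work:
    title: str
    medium: str                   # "image", "track", "article", ...
    identifier: str               # ISBN / ISRC / URL / internal ID
    published_at: str             # where it lives online
    licensed_to: list[str]        # existing licensees
    ai_training_rate_usd: float   # your asking price per work (placeholder)

catalog = [
    Work("Harbour at Dawn", "image", "IMG-0001",
         "https://example.com/harbour", ["StockSiteX"], 40.0),
    Work("Field Notes, Vol. 2", "article", "ART-0042",
         "https://example.com/field-notes-2", [], 60.0),
]

# Export the catalog so you can share it with a collecting society,
# attach it to contracts, or diff it against dataset disclosures later.
with open("catalog.json", "w") as f:
    json.dump([asdict(w) for w in catalog], f, indent=2)

print(f"{len(catalog)} works, total asking rate: "
      f"${sum(w.ai_training_rate_usd for w in catalog):,.2f}")
```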
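If transparency rules do force companies to publish dataset sources, checking them against your own catalog is mostly string matching. The sketch below assumes a disclosed manifest arrives as a CSV with a url column and reuses the catalog.json file from the previous sketch; real disclosures, if they come, may use a different format entirely.

```python
# Check a hypothetical disclosed dataset manifest (CSV with a "url" column)
# against the URLs in your own catalog, and log matches as dated evidence.
import csv
import json
from datetime import date

with open("catalog.json") as f:
    my_urls = {work["published_at"] for work in json.load(f)}

matches = []
with open("disclosed_dataset_manifest.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row.get("url") in my_urls:
            matches.append(row["url"])

# Keep a dated record you can point to in a licensing claim.
with open(f"matches_{date.today()}.txt", "w") as f:
    f.write("\n".join(matches))

print(f"Found {len(matches)} of my works in the disclosed manifest.")
```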
Policy Signals and Dates to Watch
- UK government: initial AI copyright report before end of 2025; fuller plan by March 2026.
- Transparency requirements: watch for rules on dataset disclosures and opt-in/opt-out mechanics.
- Collective licensing: potential frameworks through collecting societies or industry groups.
- International pressure: settlements and rulings abroad will influence negotiations in the UK.
Quick Context
A previous UK proposal leaned on opt-out for text and data mining. That approach triggered pushback from creators and was effectively paused. Current signals suggest a move toward payment and clearer visibility into training data.
Smart Positioning for Creatives
Don't wait for the final report to get your house in order. Label your work, clarify your terms, and prepare numbers you're comfortable with. If the industry shifts to paid training, you'll be ready to license instead of scramble.
Skill Up and Protect Your Edge
If you want to negotiate from strength, understand how models train, what platforms disclose, and where the licensing levers are. A little structured learning goes a long way.
Bottom line: payment is back in the conversation, and transparency is the unlock. Get organized now so you're ready to claim your share when the rules land.