UK copyright review on AI training: what creatives need to do now
The UK is reworking how copyright applies to AI training. Technology secretary Liz Kendall says she supports artists being paid when their work is used to train models, and she's talking with both the creative sector and AI firms to find a way forward. Big names like Paul McCartney and Elton John have called out unlicensed use of their work, and transparency is top of mind.
Here's the short version: UK law already bars unlicensed copying of protected works for commercial model training. The government is exploring a "machine-readable rights reservation" system so creators can set terms or opt out at scale. A preliminary report is due by the end of 2025, with a full report by March 2026.
What's changing (and what's not yet)
- Permission still matters: Developers may need licenses from rightsholders. With fragmented rights, that's messy and slow.
- Proposed fix: A machine-readable way to reserve rights or opt out. Useful in theory, but it needs standards and industry adoption.
- Global split: The EU runs an opt-out model for text/data mining. In the US, fair use for AI training is still being tested in court.
- Public sector pressure: Government buyers are asking suppliers to explain if and how AI is used, and to lock down training on user data.
Why this matters for creatives
You have leverage. If training needs permission, your catalog has value. As frameworks mature, expect more licensing, more provenance tooling, and more demand for clear usage rights. Campaigners are also urging the government to pause deals with AI firms that are in copyright disputes and to require disclosure of training sources.
Practical steps you can take today
- Catalog your work: Keep a clean record (titles, dates, where published, owners, collaborators, ISRC/ISWC/ISBN/DOI where relevant).
- State your terms: Add clear licensing terms on your site and platforms. Use machine-readable signals (robots.txt, meta tags like "noai/noimageai") where they make sense for you (see the sketch after this list).
- Add provenance: Use content credentials (e.g., C2PA) and watermarks for new releases to assert authorship and track usage.
- Join collecting bodies: If relevant, work with your society (e.g., PRS, PPL, DACS, ALCS) to stay plugged into licensing opportunities.
- Set boundaries with vendors: In contracts, ban training on your files unless you explicitly agree and are paid for it.
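To make the machine-readable signals concrete, here's a minimal sketch of a robots.txt and a page-level meta tag that many AI crawlers and scrapers say they honor. The crawler names below are examples of commonly published AI user agents, not a complete list, and support for "noai"/"noimageai" varies by platform, so treat this as a starting point rather than a guarantee.

```
# robots.txt — ask known AI training crawlers not to fetch your site.
# Check each vendor's documentation for its current user agent names.
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```

```html
<!-- Page-level signal: "noai"/"noimageai" are informal directives
     honored by some scrapers and platforms, not a legal control. -->
<meta name="robots" content="noai, noimageai">
```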
If you sell into the public sector (or work with vendors who do)
The Crown Commercial Service extended its AI Dynamic Purchasing System to February 2029, and new procurement guidance asks suppliers to disclose AI use in delivery. Buyers are asking for terms that prevent training on user data by default: think video calls with AI transcription that doesn't feed recordings into a model without permission.
- Ask for disclosure: What AI features are in the service? Where is data stored? Any subcontractors involved?
- Ban model training by default: Add a clause that forbids training on your content unless there's a separate, paid license.
- Demand provenance: Require content credentials and logs that show how your assets are used (see the sketch after this list).
- Check opt-out support: Ensure platforms honor machine-readable rights reservations and bot/crawler identification.
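To show what "content credentials" can carry in practice: C2PA manifests support a training-and-data-mining assertion that states whether a work may be used for AI training. The JSON below is a hedged sketch based on my reading of the C2PA assertion spec; the exact labels and values (e.g. "c2pa.training-mining", "notAllowed") may differ between spec versions and signing tools, so verify against the current specification before relying on it.

```json
{
  "assertions": [
    {
      "label": "c2pa.training-mining",
      "data": {
        "entries": {
          "c2pa.ai_training": { "use": "notAllowed" },
          "c2pa.ai_generative_training": { "use": "notAllowed" },
          "c2pa.data_mining": { "use": "notAllowed" }
        }
      }
    }
  ]
}
```

A declaration like this doesn't enforce anything on its own, but it gives you a signed, portable record of your terms that vendors and buyers can check.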
Contract questions to put on repeat
- List every model/vendor touching our files. Where do models run? What data leaves our environment?
- Did you clear rights for any third-party training data used in your features?
- Will you disclose training data sources on request? How do you handle takedowns or contested sources?
- Can we audit or receive third-party attestations (SOC 2, ISO 27001, C2PA usage) to verify compliance?
- What's the indemnity if our works are used without permission?
Timeline to watch
- Consultation on machine-readable rights reservations: ongoing since late 2024.
- Preliminary UK report on AI and copyright: due by end of 2025.
- Full report: due by March 2026.
- CCS AI Dynamic Purchasing System: extended to February 2029.
How to prepare your catalog for licensing
- Standardize metadata: Consistent titles, contributors, rights splits, contact/licensing email (a sample record follows this list).
- Bundle rights by use case: Training, fine-tuning, and inference are different. Price them differently.
- Set minimums: Per-asset floors, volume tiers, and usage transparency requirements.
- Track distribution: Where your work lives online, and whether those channels respect AI opt-outs.
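If it helps to picture what a licensing-ready catalog entry looks like, here's a hypothetical record. The field names are illustrative rather than an industry standard; the point is that every work carries consistent identifiers, ownership splits, a licensing contact, and separately stated terms for training, fine-tuning, and inference.

```json
{
  "title": "Example Track",
  "identifiers": { "isrc": "GB-XXX-25-00001" },
  "released": "2025-03-14",
  "contributors": [
    { "name": "A. Writer", "role": "composer", "rights_share": 0.6 },
    { "name": "B. Producer", "role": "producer", "rights_share": 0.4 }
  ],
  "licensing_contact": "licensing@example.com",
  "ai_use_terms": {
    "training": "licence required",
    "fine_tuning": "licence required",
    "inference_output_use": "case by case"
  }
}
```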
Resources
- UK Intellectual Property Office - official guidance and consultations.
- Crown Commercial Service: AI Dynamic Purchasing System - supplier/buyer framework details.
Skill up (for creatives using AI tools)
If you're producing with AI, pick tools that respect rights and support provenance. Start with platforms that are transparent about training data and give you control over opt-outs.
- Curated AI tools for generative art - identify options with clearer licensing and usage terms.
The ground is shifting, but the direction is clear: transparency, consent, and payment. Tighten your rights signals now, ask tougher questions of vendors, and line up your catalog so you're ready when licensing demand hits.