UK consultation shows strong public backing for tougher copyright rules on AI
The government's AI consultation drew more than 11,500 responses and delivered a clear message: most respondents want stronger protection for creative work. Ninety-five percent supported either tightening the law to require licensing of training data or keeping the current rules; only 3% backed the government's earlier proposal to make artists opt out.
Respondents included creators and rights holders, AI developers, academics, legal professionals, and cultural heritage organisations. The scale and breadth of responses give policymakers a clear mandate to prioritise existing rights and licensing over blanket data access.
What respondents said
Creators and rights holders overwhelmingly favoured explicit licensing for any use of copyrighted works in training AI systems. Many also backed statutory transparency so rights holders can see what datasets were used and by whom.
Technology respondents were more mixed: they warned against heavy compliance costs and favoured lighter-touch solutions, though most agreed that any framework should be practical to implement.
Transparency and licensing take centre stage
Support for mandatory transparency stood out. With clear reporting on training data, rights holders can identify use and negotiate licences. Without it, enforcement and fair payment remain guesswork.
For developers, that points to an operational shift: data provenance systems, auditable sourcing, and standardised disclosures. For creators, it means a path to remuneration instead of blanket "free use" under vague exceptions.
Creative sector concerns keep building
Musicians, authors, and visual artists argue that AI trained on their work without permission or payment devalues human creativity while concentrating benefits in tech firms. High-profile voices, including Kate Bush, Sam Fender, and the Pet Shop Boys, have urged ministers not to weaken copyright.
Paul McCartney recently released a nearly silent track as a pointed protest against what campaigners call copyright theft by AI firms. The message is simple: pay for the resources you use.
Government stance and timeline
In Parliament, the science, innovation and technology secretary, Liz Kendall, said there was "no clear consensus" and the government would "take the time to get this right." Even so, the consultation response signals significant resistance to opt-out models and broad text-and-data-mining carve-outs.
The government will continue work on a full report and an economic impact assessment under the Data (Use and Access) Act 2025. Both are expected to be laid before Parliament by 18 March 2026.
What this means for creatives
- Decide your licensing position: Define rates, terms, and use-cases you approve (training, fine-tuning, outputs). Put it in writing.
- Use contracts wisely: Add clauses that prohibit AI training without a separate licence. Close "data improvement" or "research" loopholes.
- Prepare for transparency: When disclosures arrive, you'll want a fast process to verify use and issue invoices or notices.
- Act collectively: Consider collecting societies or coordination with peers to set standard terms and reduce friction.
- Mark your work: Adopt content credentials/watermarking and keep detailed records of your catalogue and release dates.
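The record-keeping step above can be sketched in code. This is a minimal illustration only, with hypothetical field names; the consultation does not prescribe any catalogue format, so treat it as one possible shape for records you could later check against developer disclosures.

```python
import csv
import io
from dataclasses import dataclass

@dataclass
class CatalogueEntry:
    """One work in a creator's catalogue (hypothetical fields)."""
    title: str
    identifier: str          # e.g. an ISRC for recordings or ISBN for books
    release_date: str        # ISO 8601 date, useful for proving priority
    licensing_position: str  # your stated terms, e.g. "no AI training without separate licence"

def export_catalogue(entries: list[CatalogueEntry]) -> str:
    """Write the catalogue to CSV so it can be compared against future disclosures."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["title", "identifier", "release_date", "licensing_position"])
    for e in entries:
        writer.writerow([e.title, e.identifier, e.release_date, e.licensing_position])
    return buf.getvalue()
```

A plain CSV export keeps the record portable: it can be shared with a collecting society or diffed against a published training-data disclosure without special tooling.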
What this means for government teams
- Codify transparency: Define practical disclosure standards (datasets, sources, weights, and update logs) with proportionate thresholds for SMEs.
- Enable licensing at scale: Support standard terms, collective licensing options, and simple reporting to cut transaction costs.
- Plan for enforcement: Provide audit powers, penalties for non-compliance, and a clear dispute path that doesn't swamp the courts.
- Safeguard research exemptions: Keep genuine research flowing while preventing commercial models from hiding behind it.
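To make the "codify transparency" point concrete, here is a sketch of what a machine-readable training disclosure might look like. Every field name here is an assumption for illustration; the consultation has not defined a schema, and real standards would come from the IPO or subsequent legislation.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class DatasetDisclosure:
    """One dataset used in training (hypothetical schema)."""
    dataset_name: str
    source_url: str
    licence: str              # e.g. "CC-BY-4.0", "commercial licence", "opt-in"
    date_collected: str       # ISO 8601 date
    update_log: list[str] = field(default_factory=list)  # removals, opt-outs, etc.

@dataclass
class TrainingDisclosure:
    """Top-level disclosure a developer could be required to publish."""
    model_name: str
    developer: str
    datasets: list[DatasetDisclosure] = field(default_factory=list)

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)

disclosure = TrainingDisclosure(
    model_name="example-model-v1",
    developer="Example AI Ltd",
    datasets=[DatasetDisclosure(
        dataset_name="licensed-news-corpus",
        source_url="https://example.com/corpus",
        licence="commercial licence",
        date_collected="2025-06-01",
        update_log=["2025-07-01: removed 120 opted-out works"],
    )],
)
print(disclosure.to_json())
```

A structured format like this is what would let rights holders verify use automatically rather than by guesswork, and proportionate thresholds could simply reduce the required fields for smaller developers.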
What AI developers should plan for
- Data provenance: Track sources, licences, and restrictions end-to-end. Make disclosures exportable.
- Rights-clear datasets: Shift to opt-in, licensed, or public domain sources where terms are unambiguous.
- Budget for licences: Treat training data like any other input cost. Build forecasting and royalty reporting into your ops.
- Minimise exposure: Secure data rooms, limited access, and retention controls to reduce legal and reputational risk.
Useful references
UK Intellectual Property Office: AI and IP consultation
GOV.UK: AI regulation white paper
Upskill your team
If you need a fast, practical briefing on AI policy, licensing, and compliance for creative or public-sector teams, explore focused learning paths here: Complete AI Training - Courses by job.