UK Consultation: 88% Say AI Firms Should License Training Data
The UK government's latest update on copyright and AI carries a clear message from respondents: AI developers should secure licences to train their models on existing works. Fully 88% supported strengthening the law to make that explicit, while only 3% backed a new text and data mining (TDM) exception with an opt-out for rightsholders, and just 0.5% supported a full exception without an opt-out.
The update is an interim step ahead of a full report and economic impact assessment due by 18 March. For public bodies, this is a strong signal of where expectations are heading on data use, licensing, and procurement.
Why it matters for policymakers
Earlier this year, ministers indicated support for a TDM exception for AI companies. That drew a strong backlash from creative industries and many individual creators. While the government has since softened its stance and says it no longer has a preferred option, it has not ruled out a new exception.
This week's update follows commitments, made during debates on the Data Act, to hold off on adding AI-copyright provisions to that legislation. It keeps the government's options open but confirms overwhelming feedback in favour of licensing obligations.
The numbers at a glance
- 88% support strengthening copyright law to clarify licensing duties for AI companies.
- 7% favour doing nothing, on the basis that current law already requires licences.
- 3% support a TDM exception with a rightsholder opt-out.
- 0.5% support a TDM exception without an opt-out.
- More than 10,000 responses arrived via the government feedback form; a further 1,400 email submissions expressed broadly similar views.
Creative sector perspective
Composer and AI expert Ed Newton-Rex, who has campaigned against a new AI copyright exception, says the consultation message "could not be clearer". In his words, "respondents overwhelmingly support the common sense position that AI companies should pay for the resources they use", and reject an "ill-thought-through 'preferred option' that would hand people's work to tech companies for free".
He argues the government should "listen to the people and rule out an AI copyright exception immediately". His stance reflects the dominant view from creative organisations and individual creators in the consultation.
Government's current position
The government notes that responses skew toward the creative industries, reflecting the volume of creators and stakeholders affected, while tech-sector submissions were more supportive of new exceptions. It points to recent roundtables and ongoing engagement with both sides.
The update states: "Copyright laws must protect creative works, whilst also ensuring the UK reaps the transformational benefits of AI... We are continuing to consider all options" and will publish a full summary and impact assessment by 18 March.
What departments and public bodies should do now
- Assume licensing will be required: Plan for procurement and compliance processes that favour AI vendors with clear data provenance and licences.
- Update supplier due diligence: Require attestations on training data sources, consent pathways, opt-out mechanisms, and indemnities.
- Budget for licensing: If your projects depend on model training or fine-tuning with third-party content, include licence costs and audit trails.
- Map datasets: Identify any copyrighted materials used in pilots or live systems. Document rights, retention, and deletion policies.
- Strengthen governance: Extend model risk assessments to cover copyright risk, not just privacy and safety. Ensure legal sign-off before deployment.
- Prepare for scrutiny: Expect FOI interest and media questions on data sources. Have clear, defensible records.
What to watch next
- Full report by 18 March: This will set expectations for licensing, enforcement, and any narrow exceptions.
- Sector-specific guidance: Look for updates from the UK Intellectual Property Office and cross-government frameworks on AI assurance.
- International alignment: Consider how UK policy may interact with moves in Australia, the EU, and the US as you set procurement standards.