UK ministers admit no "workable" AI copyright fix yet as policy gets a reset
UK ministers have acknowledged they don't yet have practical solutions for AI training transparency or rightsholder opt-outs. Culture secretary Lisa Nandy called the earlier preference for reform a "mistake," while technology secretary Liz Kendall said the government is "having a genuine reset moment."
The aim is clear: support the creative industries and enable AI progress, without leaving either side exposed. But ministers signalled that any package must be comprehensive and actually usable, not just well-intentioned.
What changed, and why it matters
The government had indicated a preference to expand text and data mining (TDM) for AI training with a rightsholder opt-out, backed by transparency obligations. After pushback, ministers now say that approach isn't ready for prime time and may never be without major technical advances.
This matters for policy teams because the timeline is real and public expectations are high. The government owes a substantive update by 18 March 2026, and working groups reconvene in February.
The consultation and the original opt-out idea
The 2024 consultation sought to balance control and remuneration for rightsholders with lawful access to high-quality data for AI developers. Trust and transparency were core themes, given the concern that works are being used for training without consent or fair payment.
Government's early preference: expand the current TDM exception for AI training, introduce a rightsholder opt-out, and require transparency from developers to make rights reservation practical.
Why the opt-out is stuck
Officials said the opt-out was "not a terribly popular preferred option" and outlined three blockers:
- It can shift the burden onto rightsholders, especially smaller operators, to actively opt out via potentially technical interfaces.
- If opting out is too easy, everyone may withdraw, defeating the point of expanding the exception in the first place.
- Attribution is messy. Mixed works (e.g., a blog quoting a novel) don't carry metadata that cleanly signals where an opt-out applies.
Conclusion from ministers: there is no "workable opt-out proposal on the table" right now. They've asked industry to help solve the technical pieces, noting other jurisdictions face similar hurdles.
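To make the "technical pieces" concrete: one candidate mechanism discussed in this space is a machine-readable rights reservation, such as the draft W3C TDM Reservation Protocol's `tdm-reservation` HTTP header. The sketch below is illustrative only, assuming that convention; it also shows why a page-level signal cannot resolve the mixed-works problem ministers describe.

```python
# Sketch: checking a machine-readable TDM opt-out, assuming a convention
# like the draft W3C TDM Reservation Protocol ("tdm-reservation" header,
# where "1" means text-and-data-mining rights are reserved). Illustrative
# only; not an implementation of any settled UK scheme.

def tdm_reserved(headers: dict) -> bool:
    """Return True if the publisher has reserved TDM rights for this page.

    Header names are matched case-insensitively, as in HTTP.
    """
    normalised = {k.lower(): v for k, v in headers.items()}
    return normalised.get("tdm-reservation", "0").strip() == "1"

# A page-level signal says nothing about embedded third-party material:
# a blog post that does NOT reserve rights may still quote a novel whose
# rightsholder has opted out. That is the attribution gap ministers cite.
print(tdm_reserved({"Content-Type": "text/html", "TDM-Reservation": "1"}))  # True
print(tdm_reserved({"Content-Type": "text/html"}))                          # False
```

Even if such a header became standard, a crawler sees only the hosting page's signal, not the rights status of every work mixed into it, which is why a purely technical opt-out remains stuck.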
Transparency is still hard
Government intends to legislate on transparency. The challenge is tooling. As Nandy put it, "There are insufficient transparency tools at the moment" to meet the commitments already made.
Expect direction on disclosures, but don't expect perfect traceability in the short term. The technical stack isn't there yet.
Licensing: light touch, unless smaller players get squeezed
Ministers signalled they won't "intervene overly" in licensing where industry is forming deals. However, they drew a clear line: if smaller players are disadvantaged, government may step in.
The creative sector is an ecosystem. Deals that work for large catalogues but shut out independents or SMEs are likely to get scrutiny.
Timelines and next steps
- Working groups meet again in February.
- Government must provide a substantive update by 18 March 2026.
- Ministers want coherence across transparency, licensing, and any copyright exceptions; no piecemeal fixes.
They acknowledge calls for urgency, but won't rush legislation that could "make a mess." The bar is solutions that hold up in practice.
What government teams can do now
- Map exposure: identify where your department or ALBs use AI systems trained on copyrighted data. Pinpoint contracts that require training data disclosures.
- Prep procurement: draft conditions for AI tenders that require transparent training data practices, rights-clearance assurances, and audit-ready logs.
- Plan for transparency laws: anticipate developer reporting duties and ensure your own projects can handle model provenance, dataset traceability, and updates over time.
- Support SMEs in your scope: where you fund or engage smaller creative entities, consider model clauses or frameworks that prevent one-sided licensing outcomes.
- Coordinate across sectors: impacts differ by sub-sector (music, publishing, images, news). Build tailored engagement plans and feedback loops with each.
- Scenario-test exceptions: model what expanded TDM with or without opt-out would mean for your policy area, enforcement risk, and stakeholder incentives.
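The procurement and traceability items above can be made tangible with a minimal dataset-provenance record that tenders could require suppliers to produce. This is a sketch under stated assumptions: the field names (`source_url`, `licence`, `rights_cleared`, `retrieved`) are illustrative, not a published standard or government schema.

```python
# Sketch of a minimal training-data provenance record a procurement team
# could require in AI tenders. Field names are illustrative assumptions,
# not any official UK schema.

from dataclasses import dataclass

@dataclass
class DatasetRecord:
    name: str
    source_url: str
    licence: str           # e.g. "CC-BY-4.0", "commercial licence", "unknown"
    rights_cleared: bool   # supplier attests clearance for AI training use
    retrieved: str         # ISO date the data was collected

def audit_gaps(records: list[DatasetRecord]) -> list[str]:
    """Return names of datasets that would fail a rights-clearance audit."""
    return [r.name for r in records
            if not r.rights_cleared or r.licence == "unknown"]

records = [
    DatasetRecord("news-corpus", "https://example.org/news",
                  "commercial licence", True, "2025-11-01"),
    DatasetRecord("web-scrape", "https://example.org/crawl",
                  "unknown", False, "2025-10-15"),
]
print(audit_gaps(records))  # ['web-scrape']
```

Requiring records like this now means that when transparency duties do land, audit-ready logs already exist rather than needing retrofitting.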
Context and resources
For current UK guidance on copyright exceptions, including text and data mining, see the government's overview. It's useful for grounding contract and procurement language while policy evolves.
- UK guidance: Exceptions to copyright (Text and Data Mining)
- House of Lords Communications and Digital Committee
Bottom line
There's no off-the-shelf fix yet for AI training transparency or opt-outs. Government will likely legislate on transparency, keep a light touch on licensing, and act if smaller players lose out.
Use this window to tighten procurement, line up compliance for future transparency rules, and build sector-specific plans. The reset buys time; use it to reduce risk and keep options open.