Liz Kendall signals reset on AI copyright, backing artists' demand to be paid

UK signals a reset on AI and copyright: pay creators, keep innovation moving. Expect licensing, training-data transparency, and tougher procurement rules to follow.

Categorized in: AI News, General, Government
Published on: Nov 24, 2025

Technology secretary Liz Kendall has opened the door to a new deal between AI firms and the creative sector. Her message is blunt: "people rightly want to get paid for the work that they do," and the government wants "both sectors [to] grow and thrive in future."

It's a notable shift from the previous approach, under which artists would have had to opt out of AI training by default. Kendall says she and culture secretary Lisa Nandy are "having a reset," bringing creatives and AI companies into the same room.

Why this matters for government teams

Public sentiment is firmly with creators. High-profile artists have protested the unlicensed use of their work to train large models. That translates into political pressure, legal exposure, and procurement risk for departments using or buying AI tools.

The government is consulting on an intellectual property framework for AI. For officials, the likely outcomes affect spending controls, contracts, compliance, and communications with the public and rights holders.

Key signals from Kendall

  • Payment and consent: Creators should be paid when their copyrighted works are used to train AI. Expect movement on licensing and compensation mechanisms.
  • Transparency: Kendall acknowledged creators' demand to know whether their work has been used to train AI systems. Traceability will sit at the core of any workable deal.
  • Fresh mandate: She distanced government policy from past private comments by a newly appointed adviser, saying: "Views before you come to work for the government are not the views of the government."

Context: the legal and market backdrop

After a $1.5bn settlement by Anthropic, a database of roughly 500,000 books used to train its models was disclosed so authors could check usage and claim around $3,000 per work. It's a template: transparency first, compensation second.

Meanwhile, campaigners including filmmaker and peer Beeban Kidron welcomed Kendall's stance but warned that trust with the creative community has been badly eroded. They want immediate measures: no new public sector deals with AI firms in copyright disputes, full training-data disclosure, and explicit commitments to respect copyright.

Timeline and what to expect

  • Initial report: due before year-end.
  • Substantial plan: targeted for March 2026.
  • Near-term pressure: Creatives want action now; Kendall says, "we've got to get this right," and "we don't want to have to choose."

What government should do now

  • Audit usage: Catalogue AI tools in use or in procurement. Note any that generate content or rely on broad internet-scale training data.
  • Tighten contracts: Add clauses that require suppliers to disclose training datasets, confirm rights clearance, and provide indemnities for copyright claims.
  • Require transparency: Ask vendors for model cards, data provenance summaries, and sources of licensed content. Make this a pass/fail criterion for new buys.
  • Protect public assets: Identify government-owned creative works and set clear rules on whether they can be used for training. Implement access controls where needed.
  • Prepare for claims: Define a process for notices from creators. Assign contact points in legal and procurement, and set response SLAs.
  • Coordinate centrally: Work with the Crown Commercial Service (CCS) and legal advisers to align boilerplate terms across departments and arm's-length bodies.
  • Pilot licensing: Where AI adds clear value, trial licensed datasets or collective licensing arrangements. Track cost per use and outcomes.
  • Communicate openly: If your service uses generative AI, publish a short notice explaining safeguards, training-data expectations, and complaint routes.
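The "require transparency" step above calls for making vendor disclosure a pass/fail criterion. As a minimal sketch of how a procurement team might encode that gate, the snippet below checks a bid against four mandatory disclosures; the field names and criteria are illustrative assumptions, not an official standard or any department's actual checklist.

```python
# Hypothetical sketch: encoding AI-vendor transparency disclosures as a
# pass/fail procurement gate. Field names and criteria are illustrative
# assumptions only, not an official government standard.
from dataclasses import dataclass

@dataclass
class VendorDisclosure:
    model_card: bool               # vendor supplied a model card
    data_provenance_summary: bool  # summary of training-data sources
    licensed_sources_listed: bool  # licensed content sources identified
    copyright_indemnity: bool      # contractual indemnity for copyright claims

def passes_transparency_gate(d: VendorDisclosure) -> bool:
    """All criteria are mandatory: any missing disclosure fails the bid."""
    return all([
        d.model_card,
        d.data_provenance_summary,
        d.licensed_sources_listed,
        d.copyright_indemnity,
    ])

# Example: a vendor missing a data-provenance summary fails the gate.
bid = VendorDisclosure(
    model_card=True,
    data_provenance_summary=False,
    licensed_sources_listed=True,
    copyright_indemnity=True,
)
print(passes_transparency_gate(bid))  # False
```

Treating every criterion as mandatory (rather than scoring and weighting them) is the design choice implied by "pass/fail": a bid that cannot demonstrate rights clearance or provenance simply does not proceed, regardless of its other merits.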

Policy options being discussed (what to watch)

  • Opt-in or opt-out standards for using copyrighted works in training, with enforceable consent signals.
  • Dataset registries or disclosures so creators can see if they were included and claim payment.
  • Procurement rules conditioning public contracts on transparency and respect for copyright.

Upskilling your team

If you're shaping procurement, policy, or service design around generative AI, structured training can shorten the learning curve and reduce risk.

The direction is clear: consent, transparency, and compensation. The task now is turning that into procurement rules, service standards, and clear communications, so innovation continues and creators get paid.
