Amplify Creators, Not Big Tech: Make AI Training Pay Through UK Licensing

AI runs on human work, but creators are getting cut out. It's time for real licences, transparent deals, and proof of provenance so freelancers and artists actually get paid.

Published on: Dec 16, 2025

AI training needs your work - but you deserve a real deal

A UK government inquiry into copyright and AI training surfaced what many of us already feel: the people building AI need high-quality, human work to train it. At the same time, their past behavior has pushed creators away. As zero-click answers multiply and AI search eats attention, more walls are going up across what's left of the human-authored web.

This is the moment to amplify independent voices - the freelancers, artists, writers, photographers, illustrators - and not leave the market to closed-door deals between Big Tech and big media.

What creators told the Inquiry

Serena Dederding of the Copyright Licensing Agency (CLA) told the House of Lords Inquiry that businesses want lawful, high-quality content for training small language models and are actively looking for compliant ways to do it. There's appetite for licensing, especially collective solutions that are simple, clear, and auditable.

The CLA plans a voluntary, opt-in training licence covering text for training, fine-tuning, and retrieval-augmented generation (RAG). The goal: open the door for the medium and long tail of authors and publishers who can't cut direct deals and need a straightforward way to get paid.

Deals aren't licences - and many creators are left out

Reema Selhi of DACS made a critical distinction: access deals with image banks or publishers are content agreements, not copyright licences. Those arrangements may feed models with high-quality assets and metadata - but they often bypass the actual rightsholders. If AI companies think they don't need to clear copyright, creators don't get paid.

That's extra galling when the UK's own committee work says the law is clear. For context, see the House of Lords Communications and Digital Committee's work on Large Language Models and legal certainty: official summary.

The money reality for visual artists

DACS paid out £60m ($81m) to 100,000 artists in 2024. That averages £600 per person - and payments reached only two-thirds of members. Many artists still face late or missing payments, with fees that don't cover the work.

Publicly funded commissions have paid as little as £2.60 an hour. A 2024 survey of 1,200 artists found median earnings at £12,500 - far below the UK median of £37,430. 81% said their work was unstable, 51% already juggle second jobs, and even then they remain below the national median.

Policy drift invites bad behavior

With the UK drifting on policy, some vendors have pushed boundaries while others hesitate to invest in lawful pathways. Elsewhere, governments are taking clearer positions. Australia's move to rule out a text and data mining (TDM) exception was framed as support for creators and for licensing routes.

Meanwhile, the biggest payouts skew to high-profile conflicts. A notable US case saw authors push Anthropic into a $1.5 billion offer tied to 500,000 in-copyright books scraped from Library Genesis - about $3,000 per book. Useful, but it sets no legal precedent and doesn't solve access for everyone else.

What needs to happen next

  • Licences, not just "access." Content supply deals must not be confused with copyright licences. Creators need opt-in mechanisms, transparent terms, and payment tied to the use of their work.
  • Collective options for the long tail. Expand collective licences (text now; images, audio, video next) with clear distribution rules and easy enrollment for freelancers and small studios.
  • Proof of provenance. Standardize data provenance and model training records. Tools like a content provenance standard can help: C2PA.
  • Real enforcement. Make it costly to ignore copyright. No blanket TDM exceptions that sidestep licensing and remuneration.
  • Enterprise buyer pressure. If a company wants to use AI at scale, require vendors to show licence chains, audit trails, and opt-out honoring. No proof, no deal.
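To make the provenance point concrete, here is a minimal sketch of what a machine-checkable provenance record can look like: a content hash, an author, and a timestamp. This is an illustration only, not the C2PA format - a real C2PA manifest is cryptographically signed and embedded in the asset itself, and the function and field names below are assumptions for the example.

```python
import hashlib
import json
from datetime import datetime, timezone

def provenance_record(data: bytes, author: str) -> dict:
    """Build a minimal provenance record for a piece of work:
    a SHA-256 content hash, the author's name, and a UTC timestamp.
    (Illustrative sketch; a real standard like C2PA adds signatures
    and an edit history bound to the file.)"""
    return {
        "sha256": hashlib.sha256(data).hexdigest(),
        "author": author,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

# Example: record provenance for a finished manuscript or image export.
record = provenance_record(b"final draft of my essay", "A. Creator")
print(json.dumps(record, indent=2))
```

Even this bare-bones version lets a buyer verify that the bytes they licensed match the bytes the creator registered; signed standards extend the same idea so the claim travels with the file.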

Practical steps for creatives today

  • Join your CMO. A collective management organisation can reach deals you can't cut alone. Writers and publishers: watch the CLA's training licence. Visual artists: register with DACS and keep your details current so you're reachable for new schemes.
  • Label your terms. Use clear copyright notices and machine-readable signals where possible. Keep original files, timestamps, and contracts organized.
  • Use provenance tools. Add secure metadata or provenance signals to new work to document authorship and permitted uses.
  • Track infringements smartly. Keep evidence, batch similar cases, and use collective action where available.
  • Choose ethical tools. Prefer AI vendors that prove licensed training data and offer opt-out controls. If they can't show it, assume they don't have it.
  • Skill up with intent. Build AI workflows that protect your IP and save time without giving away your catalogue. Curated options by job can help: AI courses by job.
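One widely used machine-readable signal is a robots.txt block addressed to known AI-training crawlers. The user-agent tokens below (GPTBot, CCBot, Google-Extended) are real published crawler names; note that compliance is voluntary, so treat this as a statement of your terms, not an enforcement mechanism.

```text
# robots.txt -- ask known AI-training crawlers to stay out.
# Honouring these rules is voluntary on the crawler's side.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Pair a signal like this with clear copyright notices on the pages themselves, so your terms are stated both to machines and to people.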

The bottom line

AI needs human work to stay useful. If creators can't rely on payment or consent, we'll see fewer people making the work that keeps culture - and models - alive.

Licensing is the bridge. Build collective options for the long tail, enforce the law, and make vendors prove their inputs. If policymakers want growth, start by keeping creators in the economy - not under it.

