Stolen, Not Scraped: UK Creatives Demand AI Transparency, Licensing and Fair Pay

Creative industry leaders are calling for built-in transparency and paid licensing as AI companies train models on creative work at scale. At a Creative UK panel during the Labour Party Conference in Liverpool, representatives from the BPI, the Publishers Association, and DACS said collaboration is possible, but only if consent, credit, and compensation are guaranteed.
They warned that current data practices resemble unlicensed extraction, not partnership. The message to tech: publish what you train on, pay for the value you capture, and stop shifting the burden onto individual creators.
What's at stake
Reema Selhi (DACS) cautioned that without protection and fair terms, freelancers across visual art and music will lose paid work as generative systems replace entry-level and mid-tier gigs. Mandy Hill (Publishers Association) was blunt: "stolen, not scraped."
Labour's proposed "rights reservation" model, an opt-out-by-default scheme, was criticized as impractical. Sophie Jones (BPI) alleged mass-scale scraping that likely breaches current law and called the situation a serious assault on the foundations of the creative industries.
The ask is simple: transparency obligations for AI developers, enforceable licensing, and clear reporting. That's how you rebuild trust and create room for innovation that benefits both sides.
Context: policy and practice
The UK government has convened working groups on AI and copyright as part of its Modern Industrial Strategy. For reference on current exceptions and limits, see the UK guidance on exceptions to copyright.
Creators and publishers argue that consent-based licensing-not forced opt-outs-will lead to better tools, better data, and fairer outcomes. Trust follows clarity, not ambiguity.
What this means for working creatives right now
- Assert your rights by default. Add clear copyright notices and usage terms to your site, portfolio, and file metadata (IPTC "Copyright" and "Usage Terms").
- Set technical boundaries. Use robots.txt and headers (e.g., X-Robots-Tag: noai, noimageai where supported) to signal no training without a license. It won't stop bad actors, but it strengthens your position.
- License proactively. Offer explicit terms for AI training and synthetic output usage. Join collecting societies where relevant and align with collective licensing efforts to increase leverage.
- Add provenance. Explore content credentials (C2PA) to attach tamper-evident metadata to your work. Start here: Content Authenticity Initiative.
- Tighten contracts. Include clauses that prohibit model training, dataset creation, or embeddings from your work unless separately licensed; require audit logs and deletion on breach.
- Monitor usage. Schedule reverse image/text searches, set alerts for distinctive phrases, and track unusual traffic to your portfolio.
- Choose ethical tools. Prefer AI products that license their data, disclose sources, and pay creators. Vote with your workflow.
- Speak up. Respond to consultations, back your trade bodies, and share case studies of lost work or fair deals that worked.
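The robots.txt signal described above can be generated rather than hand-maintained as new crawlers appear. A minimal sketch in Python: the crawler names listed (GPTBot, CCBot, Google-Extended, ClaudeBot) are real, publicly documented user-agent tokens, but the helper function itself is a hypothetical illustration, not part of any standard tool.

```python
# Hypothetical helper: emit a robots.txt that disallows known AI-training
# crawlers while leaving ordinary search indexing untouched.
AI_CRAWLERS = [
    "GPTBot",           # OpenAI's training crawler
    "CCBot",            # Common Crawl
    "Google-Extended",  # Google's AI-training opt-out token
    "ClaudeBot",        # Anthropic's crawler
]

def build_robots_txt(extra_agents=()):
    """Return robots.txt text blocking AI crawlers and allowing everyone else."""
    lines = []
    for agent in list(AI_CRAWLERS) + list(extra_agents):
        lines += [f"User-agent: {agent}", "Disallow: /", ""]
    lines += ["User-agent: *", "Allow: /"]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    print(build_robots_txt())
```

Pair this with a server-level header such as X-Robots-Tag: noai, noimageai where your host supports it. Remember that robots.txt compliance is voluntary: it documents your terms and strengthens a legal position, but it does not technically prevent scraping.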
What AI companies can do to earn trust
- Publish dataset lists. Name sources, volumes, and timestamps. Provide searchable registries and opt-out that actually works.
- License at scale. Build an opt-in marketplace for training rights with clear rates, revocation, and reporting.
- Pay for quality. Compensate rightsholders for training, fine-tuning, safety data, and evaluation content.
- Prove provenance. Implement C2PA in outputs and maintain audit logs linking outputs to training cohorts.
- Respect contracts. No "fair use" wishful thinking: honor regional law and creator terms, and act fast on takedowns.
Practical learning for creatives who want to work with AI-on your terms
If you plan to integrate AI into your creative workflow, choose tools and methods that respect licensing and credit. For structured options, see AI courses by job to build skills without compromising your rights.
The bottom line
Creatives aren't anti-tech. We're anti-extraction. Build transparency into AI development, pay for what you use, and you'll get willing partners and better data.
Consent. Credit. Compensation. That's the path to useful AI that creators actually support.