UK consultation on AI and copyright: creatives want stronger protections
Creators across the UK have pushed back hard against the idea that their work can be used to train AI models by default. In responses to the government's consultation, the message is simple: more protection for rights holders, not less.
While a slice of the tech sector supported an opt-out approach for training data, most creative professionals argued for tighter control and clearer consent. The interim report, released on Monday, stops short of promising any specific policy in the final report.
What the interim report actually says
The government originally leaned toward allowing model training unless creators opted out through a future mechanism. Feedback shows only a small minority supported that position.
The interim report summarizes responses but doesn't commit to a final policy or timeline. That leaves space for stronger protections, or a return to the original opt-out idea, depending on how the final stage plays out.
What this means for your work
If your livelihood depends on licensing income and a distinctive style, a default opt-out regime would force you to constantly police how your work gets used. Opt-in puts control back in your hands. That's what most creatives asked for.
Until the government decides, assume the ground rules could shift. Use this window to tighten how you publish, license, and track your catalog.
Practical steps to protect your catalog now
- Audit your portfolio. List what's publicly available, where it lives, and which licenses apply. Close gaps where files are posted without clear terms.
- Set explicit licensing. Add clear terms on your site and profiles that restrict AI training without permission. Keep a standard license ready for inquiries.
- Use platform controls. Where available, toggle settings that limit dataset scraping or AI training. If a platform doesn't offer them, reconsider what you publish there.
- Embed signals. Add "no AI training" language in file descriptions and metadata, and use machine-readable signals where you can (see the robots.txt sketch after this list). Keep originals watermarked or at reduced resolution where that makes sense.
- Track usage. Use reverse image and text search tools periodically. Keep a simple log of takedown requests and responses.
- Join forces. Coordinate with agencies, guilds, and collectives for shared licensing frameworks and legal guidance.
- Respond early. If a final consultation opens, submit your position and share templates with peers so more voices are heard.
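For self-hosted sites, the "platform controls" and "embed signals" steps above can be made concrete with a robots.txt file that asks known AI training crawlers to stay away. A minimal sketch: the GPTBot, Google-Extended, and CCBot tokens are real at the time of writing, but crawler names change and compliance is voluntary, so treat this as a signal, not an enforcement tool.

```text
# robots.txt: asks known AI training crawlers not to fetch this site.
# Voluntary. Well-behaved crawlers honor it; bad actors may not.

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

# Ordinary search crawlers remain unaffected.
User-agent: *
Disallow:
```

Place the file at the root of your domain (yoursite.com/robots.txt), and check each vendor's current documentation for the right tokens before relying on it.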
If opt-out returns, prepare now
Few creatives want it, but if a default opt-out regime reappears, being ready helps. Keep a maintained "do not train" list of URLs for your work.
Centralize contact details and rights statements on your website so any opt-out mechanism can reference one place. Document dates, titles, and links for quick submission.
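If you want that one place to be machine-readable too, one emerging option is the W3C Text and Data Mining Reservation Protocol (TDMRep), which publishes rights reservations in a tdmrep.json file under /.well-known/ on your site. A minimal sketch, assuming the draft format's tdm-reservation and tdm-policy keys, with the policy URL as a placeholder for your own rights statement; check the current spec before adopting it.

```json
[
  {
    "location": "/portfolio/",
    "tdm-reservation": 1,
    "tdm-policy": "https://example.com/rights-statement"
  },
  {
    "location": "/",
    "tdm-reservation": 1
  }
]
```

Here tdm-reservation: 1 reserves text and data mining rights for the matched paths, and tdm-policy points would-be licensors to the terms you actually offer.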
What to watch next
- Policy signals. Look for clarity on opt-in vs. opt-out in the final report or follow-up announcements.
- Dataset transparency. Push for disclosure of training sources and simple ways to remove works from future datasets.
- Enforcement. Pay attention to remedies; notice-and-takedown is not enough without consequences for repeat misuse.
- Collective licensing. If the government favors group solutions, ensure creators control terms and pricing, not just intermediaries.
Useful references
Refresh the basics on copyright and exceptions, including the text and data mining rules, via the official guidance on GOV.UK.
Bottom line for creatives
The consultation responses show clear momentum for stronger protection, but nothing is guaranteed. Keep control of your terms, tighten your publishing habits, and organize with your peers.
If policy lands in your favor, you'll be ready to license on your terms. If it doesn't, you'll have the documentation and systems to push back fast.