'Don't Steal This Book': Thousands of authors protest AI at London Book Fair
An "empty" book is making the loudest noise at this year's London Book Fair. Titled "Don't Steal This Book," it contains nothing but thousands of author names - a clear protest against AI firms using books without permission or payment.
Printed on the back cover: "The UK government must not legalise book theft to benefit AI companies." Inside, no chapters. Just a roll call of writers drawing a line in the sand.
Who's involved - and why it matters
Signatories include Kazuo Ishiguro, Richard Osman, Alan Moore, Marian Keyes, Malorie Blackman, Philippa Gregory and Mick Herron; the campaign publishes the full list of participating authors.
The campaign's core message is blunt: AI companies are copying millions of books to train models without consent or compensation. If that continues, the argument goes, we'll be left with empty pages, unpaid writers, and fewer books readers love.
What this means for working writers
- Your backlist and blog might have already been ingested into training datasets without your say.
- Generative outputs can compete with client work and series concepts, eroding advances, options, and freelance rates.
- "Opt-out" regimes shift the burden to you, not the companies using your work.
- If unpaid use becomes the norm, the professional pipeline (debut to midlist to bestseller) gets thinner.
Where policy stands right now
The UK government is due to deliver an economic impact assessment by 18 March and update its consultation on proposed copyright changes. One proposal would allow AI firms to use copyright-protected work unless rightsholders opt out - a move many British authors strongly oppose.
For context on the current law, see the UK IPO's summary of copyright exceptions for text and data mining (non-commercial research only under existing rules).
What you can do this week
- Audit your footprint: check what you've posted publicly (blog archives, sample chapters, PDFs) and what you're comfortable keeping up.
- Block known AI crawlers on your site via robots.txt (e.g., disallow GPTBot and CCBot). It won't stop everything, but it reduces casual scraping.
- Review contracts and boilerplate: add explicit clauses against text/data mining, model training, and derivative AI use without written consent and payment.
- Coordinate with your agent and publisher on licensing terms that forbid training use in ebook, audiobook, and aggregation agreements.
- Join and support author bodies (e.g., ALCS, Society of Authors) to amplify collective bargaining and legal guidance.
- Log suspected infringements and outputs that mirror your work. Keep dated evidence for takedown or claims.
- Use your voice: contact your MP with a short, specific note on how unpaid AI use affects your income and release schedule.
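The crawler-blocking step above can be done with a short robots.txt file at the root of your site. This is a minimal sketch: GPTBot is OpenAI's web crawler and CCBot is Common Crawl's, two of the most commonly blocked AI user agents; the `example.com` address is a placeholder for your own domain. Note that robots.txt is advisory, so it only deters crawlers that choose to honor it.

```
# robots.txt — serve at https://example.com/robots.txt (example.com is a placeholder)

# Block OpenAI's crawler from the whole site
User-agent: GPTBot
Disallow: /

# Block Common Crawl's crawler, whose archives feed many training datasets
User-agent: CCBot
Disallow: /
```

You can add further `User-agent` groups for other crawlers as they are documented, and pair this with your host's access logs to verify which bots are actually visiting.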
The symbol is the point
Composer and campaigner Ed Newton-Rex, who organised the book, says the AI industry is being built on work "taken without permission or payment," and that this is not victimless. The empty pages are a warning: remove income from the system and the pages stay blank.
The message to policymakers is simple: if AI companies want our books, they should license them - like everyone else.
Resources for writers
- Don't Steal This Book - campaign details and participating authors
- AI for Writers - practical workflows, rights-aware practices, and tools that put you in control