AI use by writers and editors becomes publishing industry's open secret

Major publishers have cancelled releases and editors have admitted using AI on work meant for prestigious outlets. The Mia Ballard case brought the issue to a head when Hachette pulled her horror novel after an investigation found heavy AI reliance.

Published on: Apr 29, 2026

Publishing's AI problem moves from secret to scandal

The literary industry faces a crisis it can no longer ignore. In recent months, major publishers have cancelled releases, agents have reported surging AI-assisted submissions, and editors have admitted using generative AI on work meant for prestigious outlets.

The Mia Ballard case crystallised the problem. In March, Hachette cancelled the US release of Shy Girl, Ballard's horror novel, after a New York Times investigation found heavy AI reliance. Ballard denies the claims, saying a freelance editor used AI without her knowledge during self-publishing. The book had already been released in the UK.

Other cases followed quickly. Literary critic Alex Preston admitted using AI to "tidy up" a book review for The New York Times. Reform Party candidate Matt Goodwin faced accusations of reproducing "AI hallucinations" in his self-published book Suicide of a Nation, though he said he used AI only for secondary data analysis and cross-checked results against original sources.

Agents now complain about manuscripts arriving with AI assistance baked in. More troubling: editors upload those manuscripts to ChatGPT to generate quick summaries and competitive comparisons, tasks that once required human judgment.

The scope remains murky

Publishing houses won't admit the scale of the problem. One independent editor said the use is "not a secret within publishing at all," citing instances of AI translation software on foreign-language manuscripts and suspected AI-generated marketing copy.

Hellie Ogden, president of the Association of Authors' Agents, said she's seen pitches from editors to authors written entirely with AI. But she doubted editors would openly admit to uploading full manuscripts to ChatGPT. "The line from the top would certainly be that they are not," she said.

Proving AI use remains difficult. Publishers can claim ignorance when rogue editors or writers introduce it quietly. The industry has created conditions where denial is easy.

How publishing enabled the problem

The crisis stems partly from how the industry operates now. Publishers have shifted toward buying formulaic genre fiction (romantasy, horror, and other trope-heavy categories) to drive revenue. Science fiction and fantasy sales alone generated around £86 million for the British book industry in 2024.

These genres encode character types and plot points predictably. AI can reproduce them at scale. As one agent noted, in a world of "#enemiestolovers" and "#darkromance," combining ingredients to fulfil a recipe has become commercially dominant.

Editing has declined, particularly at larger commercial houses. Editors now split time between editorial work and marketing strategy. Sales and marketing teams drive commissioning decisions. "In the good old days, publishing houses were full of editors who spent all day editing books," one established non-fiction author said. "Now you have vast teams of people in sales and marketing."

Without rigorous editing, AI flourishes. No one asks whether paragraphs are necessary or properly placed. Publishers have enabled the problem through resource cuts and shifting priorities.

Why writers might use it

Self-publishing platforms like Kindle Direct Publishing have democratised authorship. A first-time author told to add more dialogue might face a choice: hire an editor for £1,000 or use AI. The decision becomes obvious.

Money and expedience matter too. A market built on shifting copies in bulk rewards speed over craft. For some writers, AI becomes a shortcut rather than a tool.

What comes next

Booker Prize-nominated writer Sarah Hall issued her novel Helm with a sticker declaring it "human written." The Society of Authors launched the "Human Authored" initiative as an opt-in labelling scheme. Anna Ganley, the society's chief executive, said clear ethical standards are needed. "Because generative AI has seeped into so many areas," she said, "we're going to get more examples like Shy Girl unless both authors and publishers are held to clear ethical standards."

The bigger fear: today's minor efficiency hack becomes tomorrow's standard practice. One leading editor warned of "incremental usage": AI handling translation, then final polish, then full editorial decisions. "There's a real risk that we end up with a huge sea of sludge, with probably a clutch of small independents producing books by humans," the editor said.

Some industry voices remain hopeful. George Walkey, an industry consultant, said AI can produce books but can't discover them. "Writing something that people really care about is a very different thing," he said.

One agent suggested the crisis might force a reckoning. "The rise of AI might, perversely, rehabilitate true creativity among the editors as much as among the real authors," the agent said. Readers clearly care; the backlash over Shy Girl and Goodwin proved that. Whether publishers listen depends on whether they see readers' preferences as a market signal or an obstacle to efficiency.

For writers, the immediate question is clearer: as AI tools proliferate, understanding how they work and where they fail becomes essential professional knowledge.
