Newsrooms Test AI for Writing Tasks, But Trust Remains the Barrier
A newspaper in Cleveland brought on an AI writing tool in January. The software doesn't report; it writes. Reporters now spend an extra day each week gathering information instead of typing up their findings, according to the Cleveland Plain Dealer.
The move sparked immediate backlash. Editors at other outlets voiced deep skepticism about letting AI anywhere near newsroom workflows.
What AI Can Actually Do
ChatGPT can generate headlines. It can write recaps of baseball games. It can speed up routine research tasks. But none of these capabilities addresses what editors see as the core problem: AI makes mistakes it can't explain, and newsrooms can't afford those mistakes.
Mandy Gambrell, editor of Hamilton's Journal-News, tested AI by giving it a dataset with specific instructions. "We asked it not to add or delete things - and then it spit back exactly what we asked it not to do," Gambrell said. "When we asked AI why it added data, it responded that it simply did not know."
Aidan Cornue, editor at the Oxford FreePress, framed the issue differently: "AI ruins the trust that needs to be built between people and news organizations."
Where AI Fits
Some journalists have found narrow uses. Patti Newberry, an enterprise reporter at the Cincinnati Enquirer, uses Microsoft's Copilot software to speed up routine lookups while reporting: when a law passed, where a politician went to school, what previous charges a subject faced.
"I conduct searches all day long as I report a story," Newberry said. "AI helps with that part."
The distinction matters. AI handles research acceleration. Humans handle reporting, verification, and judgment.
The Labor Question
Union organizers are pushing for contracts that give reporters control over their bylines and how AI gets used in newsrooms. A major concern: AI-generated copy added to reporters' stories without approval.
These protections address a real risk. As newsrooms cut staff, the temptation to automate writing grows. But smaller papers in particular need more reporters, not fewer.
The Unregulated Problem
All three editors cited the same issue: AI remains experimental and unregulated. It hallucinates. It invents details. It contradicts its own logic.
Until that changes, AI for writers in journalism means assistance with research and formatting - not reporting, not fact-checking, and not replacing the judgment that separates news from noise.