Inside Cleveland's AI Rewrite Desk: A Working Model Writers Can Use
Cleveland.com and the Plain Dealer hired an "AI rewrite specialist" to turn reporters' notes into clean drafts, then run a human fact-check. The goal wasn't to pump out more content. It was to free reporters to spend more time reporting. By January, the desk was live and producing stories with an in-house version of ChatGPT from Advance Local.
Leadership framed it plainly: AI is an assistant. The journalist is the one who does the work, makes the calls, and owns the accountability.
What the setup looks like
Reporters feed notes, transcripts, or briefs to the AI rewrite desk. The desk drafts. A human checks facts, quotes, names, numbers, and context before anything publishes. Joshua Newman runs point on the desk; editor Leila Atassi oversees the workflow and quality.
Output carries only the reporter's byline unless the reporter's input was minimal (e.g., a press release). In those cases, the byline is shared with "Advance Local Express Desk." As editor Chris Quinn put it, "I look at AI as a tool, like Microsoft Excel is a tool."
The pushback, then the counter
Critics called out the idea of removing writing from reporters' workloads. The American Press Institute argued the case for AI needed more than anecdotes and asked for clearer proof of how it improves journalism. Others worried about quality, voice, and transparency.
The desk stayed the course. The stance: AI drafts; humans report, verify, and publish. Gina Chua, of the Tow-Knight Center, noted AI can write simple pieces "reasonably efficiently," especially with a human edit before it goes out.
What changed for reporters
Output volume stayed flat, but reporters gained roughly a day per week to be in the field. One example: reporter Hannah Drown's coverage of a major land deal in Lorain County. With production time off her plate, she sat with residents, heard stories at kitchen tables, and filed work with more texture and depth.
Her take: an AI-assisted story won't look exactly like her draft, but it works "as long as we do our due diligence" and feed correct, complete notes.
Guardrails that actually matter
- AI drafts are always verified by the desk and the originating reporter.
- Quotes get extra scrutiny; they're the most common point of failure.
- Errors surface in drafts, but none have reached publication, according to the team.
A practical playbook for writers
- Define inputs: detailed notes, links to source docs, clear context. Garbage in, garbage out.
- Use AI for structure: nut graf, lede options, outline, and first draft. You keep the judgment and voice.
- Fact-check like a hawk: names, dates, numbers, and every quote. If it matters, verify it.
- Decide disclosure rules upfront: when is shared byline appropriate? Be consistent.
- Measure time saved: reinvest it in reporting, calls, records requests, and scene-setting.
- Start with low-risk content: briefs, meeting summaries, recaps, routine explainers.
- Keep a prompt library: style notes, beats, recurring formats. Small systems compound. See: Prompt Engineering.
- Protect your voice: revise the AI draft to sound like you. Readers know when it's generic.
- Track failure modes: hallucinated quotes, invented facts, wrong attributions. Build checklists to catch them.
- Level up your capability, not just volume. If AI gives you an extra day in the field, use it. See: AI for Writers.
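For writers comfortable with a little scripting, the prompt-library and failure-mode items above can be systematized. Here is a minimal sketch in Python; all names in it (PROMPTS, CHECKLIST, unverified) are illustrative assumptions, not anything the Cleveland desk actually uses.

```python
# Sketch of a per-beat prompt library plus a pre-publication checklist.
# Everything here is hypothetical; adapt the items to your own beat.

PROMPTS = {
    "council-recap": (
        "Rewrite these meeting notes into a 400-word recap. "
        "Preserve every name, date, and dollar figure exactly as given."
    ),
    "brief": "Turn this press release into a three-paragraph news brief.",
}

# Failure modes the playbook flags: quotes are the most common point
# of failure, alongside invented facts and wrong attributions.
CHECKLIST = ["names", "dates", "numbers", "quotes", "attributions"]

def unverified(checked):
    """Return checklist items a human has not yet signed off on."""
    return [item for item in CHECKLIST if not checked.get(item, False)]

# Example review state for one AI-assisted draft.
draft_review = {"names": True, "dates": True, "numbers": True, "quotes": False}
print(unverified(draft_review))  # -> ['quotes', 'attributions']
```

The point isn't the code itself; it's that small, repeatable systems (saved prompts, explicit checklists) compound, and they make the "fact-check like a hawk" step harder to skip.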
Where this could go next
The desk is exploring assigning the role to recent grads to sharpen their sense of what makes a good story. The philosophy is experimental: try, measure, adjust. The point isn't headcount cuts or more stories; it's better reporting.
For pros eyeing their own version, one more resource worth watching: the American Press Institute's AI coverage. Use the tool. Keep the accountability. Do more reporting.