McClatchy Journalists Refuse Bylines Over AI-Generated Content
More than 30 journalists at the Sacramento Bee have withheld their bylines from stories produced by McClatchy's "content scaling agent," a generative AI tool that creates new articles by repurposing reporters' existing work. The union sent a letter to management on March 27 stating that members will not allow their names to appear on such pieces.
The tool summarizes and rewrites reporter stories under new headlines, sometimes tailoring versions for specific audiences or creating roundups from multiple articles. McClatchy began rolling out the system last month across its 30 markets, which include the Miami Herald and Charlotte Observer.
Why journalists are pushing back
Ariane Lange, an investigative reporter at the Bee and vice chair of its union, said the tool undermines credibility and insults the profession. "We don't want the public to think we have anything to do with it," she told TheWrap. "We think it's a betrayal of the public's trust."
Lange raised a specific concern about sources. "I've written about some really tough things in my career - domestic violence, sexual assault, horrible traumas," she said. "I don't want to have to explain to a trauma victim that they can trust me with their story, but I cannot guarantee that it won't be fed into a glorified chatbot."
The Bee's union cited two contract violations: management failed to give advance notice of the new AI tool, and journalists invoked a clause allowing them to withhold bylines from stories they object to.
How McClatchy is labeling AI content
The company's approach varies by newsroom, apparently based on union agreements. At the Miami Herald, stories are labeled "produced using AI based on original work by" the reporter. The Centre Daily Times in Pennsylvania uses "produced with AI assistance," while one Bee story ran with "edited by Sacramento Bee staff" and "produced with AI assistance."
Bee executive editor Chris Fusco agreed not to attach reporter bylines to AI-generated pieces after meeting with union leaders on April 1, though journalists said this alone doesn't address their broader concerns.
McClatchy's wider AI expansion
The newspaper chain has used automation for years. It launched the "Miami Herald Bot" in 2021 to write real-estate stories, then developed a hurricane-coverage bot. After generative AI tools advanced in 2022, McClatchy expanded into AI-written summaries and, now, the content scaling agent.
The company's job listings increasingly demand candidates know how to "leverage AI tools" for reporting efficiency. McClatchy also encourages reporters to use internal AI for search-optimized headlines.
Michael Lycklama, a sports reporter at the Idaho Statesman and chair of its union, said the rollout feels disconnected from results. "It seems to kind of just be AI for AI's sake," he said. "Anytime we ask, 'Well, how do we know this is working?' we can't really get an answer."
Union negotiations over AI
The Pacific Northwest Newspaper Guild, representing McClatchy papers in Washington and Idaho, negotiated protections during recent contract talks. The tentative agreement bans deepfakes of reporters and requires human involvement when AI content relies "substantially" on a reporter's work - language that would likely restrict the content scaling agent's use.
Bryan Clark, an opinion writer at the Statesman and vice president of the guild, said McClatchy pushed for maximum flexibility. "We thought there were lines in the sand that should be non-negotiable matters of basic journalistic ethics," he said.
Charlotte Observer and Miami Herald union leaders are now discussing the tool's impact with newsroom management. Both outlets have established that using the system is optional for reporters.
Broader industry concerns
A Pew Research survey found that 51% of people believe AI will negatively impact the news they receive. News organizations risk damaging reader trust with even a single misstep in how they deploy the technology.
Other newsrooms are experimenting with AI-generated content. Business Insider publishes AI stories with human editing, while a Fortune editor has produced over 600 articles in eight months using AI. The New York Times and ProPublica have both faced union disputes over AI, with the latter's union authorizing a strike partly over the company's refusal to ban AI-related layoffs.
Gina Chua, executive director of the Tow-Knight Center at CUNY's Craig Newmark Graduate School of Journalism, said history shows companies should implement major changes with employee buy-in rather than impose them on reluctant staff. "Change is always difficult," she said. "When you need to make changes, you have to find ways to move people along."
McClatchy did not respond to requests for comment about the tool or make executives available for discussion.