Kansas Reflector pledges to keep AI out of its stories and columns

Kansas Reflector and its parent States Newsroom will not publish AI-generated stories, images, videos, or audio. The policy allows AI for tasks like transcription and data analysis, but journalists are responsible for any errors.

Published on: May 11, 2026

Kansas Reflector Won't Publish AI-Generated Stories

Kansas Reflector will not publish stories, columns, images, videos or audio created by artificial intelligence. The newsroom's parent organization, States Newsroom, formalized this stance in an official policy released last week.

The policy states: "States Newsroom does not publish stories or commentaries generated by AI. We do not publish any images, videos or audio clips created or altered by AI. If the use of AI is the point of a story in question, AI-generated content may be used, but will be prominently labeled and explained."

A Growing Tension in Newsrooms

The decision comes as news outlets across the country grapple with generative AI's role in journalism. McClatchy, which owns the Wichita Eagle and Kansas City Star, deployed a "content scaling agent" to produce summaries and alternate versions of stories. Reporters pushed back, asking to have their bylines removed from the AI creations.

AI has also become a sticking point in union negotiations at The New York Times and ProPublica.

What AI Can Do Behind the Scenes

States Newsroom's policy allows limited AI use for specific editorial tasks: transcribing audio interviews, analyzing data sets, routine formatting, brainstorming, and reviewing large documents or lengthy meeting videos.

But there's a catch: reporters and columnists remain responsible for accuracy. If a transcription is wrong, the journalist must catch it. If data analysis is flawed, the journalist must verify it. If a meeting summary misses something important, the journalist must review the full recording.

"If we get something wrong, it's our own fault," the policy makes clear.

Readers Want Human Voices

Audiences have shown little appetite for AI-generated content. People recoil at clunky AI features bolted onto search engines and websites. Teenagers have even started calling fake or inauthentic statements "AI."

States Newsroom cited accuracy, privacy, and copyright concerns as reasons for its stance. The organization also noted that AI can spread misinformation and create deceptive content.

Journalists work in the truth business. New technology should help locate and spread truth, not hide or distort it.

AI as Just Another Tool

AI isn't entirely new to newsrooms or daily life. Spell checkers, Photoshop's editing tools, radiology imaging software, and smartphone shortcuts all employ forms of AI. The difference is how they're used.

There's room for AI to assist journalism without replacing journalists. The line is clear: AI helps with grunt work. Humans do the thinking, reporting, and writing.

Kansas Reflector's readers want to hear from other Kansans, not output from neural networks. Those words come from people who live in the state, work in the community, and share responsibility for what gets published.

They might not be perfect. But they're human.
