Label a story as AI-written and readers rate it lower, U-M study finds

U-M researchers found that readers rate identical writing lower once AI use is disclosed: about a 6% hit across 27,000 people. The bias sticks, raising the stakes for how we label creative work.

Categorized in: AI News, Creatives, Writers
Published on: Feb 01, 2026

Readers still prefer human-written stories: U-M study finds a persistent penalty for AI-assisted writing

Readers judge creative writing less favorably the moment they learn AI touched it, whether fully or in part. Same words, different label, lower score.

That's the core finding from new research spanning 16 experiments and 27,000 participants (March 2023 to June 2024). On average, disclosing that AI was involved reduced evaluations by 6.2%.

The team, led by researchers from the University of Michigan, the Wharton School, and NYU, saw a stubborn pattern: people view AI-written or AI-assisted work as less authentic and, in turn, less worthy of appreciation.

"Sticky" bias, even when the writing is identical

Co-author Justin Berg called the penalty "incredibly sticky." The researchers tried many angles: switching narrative perspective, humanizing the AI, framing it as a collaboration. Nothing reliably reduced the bias.

Participants read AI-generated samples created with ChatGPT (the most recognized model at the start of the study). The disclosure alone did the damage.

Why this matters for writers and editors

Creative work triggers an authenticity filter. If readers suspect a machine, they feel less connected, even if the prose is strong. That has practical consequences as the U.S. considers AI disclosure rules for creative content. Mandated labels could depress engagement and perceived quality.

What you can do right now

  • Lead with authorship, not tooling. If you disclose, make your role explicit: "Written and edited by [Your Name]. Draft support from AI." Keep the human as the protagonist.
  • Show process, not prompts. Share research notes, sketches, drafts, or voice memos. Readers value visible effort and messy edges.
  • Dial up specificity. Use lived experience, sensory detail, and precise references. Specific beats generic, and generic is where AI feels obvious.
  • Keep AI backstage. Ideation, outlines, fact checks, and line-level passes can help. The final voice, structure, and argument should be unmistakably yours.
  • Test your disclosure language. A/B test "AI-assisted" vs. "human-edited" vs. no label (where ethical and permitted). Measure completion rate, shares, and qualitative feedback.
  • Set expectations with clients and editors. Agree on acceptable AI use, disclosure norms, and QA steps (fact checks, plagiarism, and style alignment).
  • Build a signature. Recurring themes, POV, humor, and stories from your life make your work recognizably human.
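If you do run a disclosure A/B test, you will end up comparing rates between two groups, e.g. completion rate with an "AI-assisted" label versus no label. A minimal sketch of that comparison using a standard two-proportion z-test (the counts below are hypothetical, not from the study):

```python
from math import sqrt, erf

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int):
    """Two-sided z-test for a difference between two completion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 420 of 1,000 readers finished the unlabeled
# version; 380 of 1,000 finished the "AI-assisted" version.
z, p = two_proportion_z(420, 1000, 380, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value above your threshold (commonly 0.05) means the label's effect on completion is not yet distinguishable from noise; collect more sessions before changing your disclosure language.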

Policy and platform watch

Lawmakers are weighing disclosure requirements for AI involvement in creative work. If labels become standard, expect some drop in perceived authenticity. Plan for it: strengthen human signals in your work, add process transparency, and reinforce your editorial standards.

Limitations worth noting

  • The study focused on creative writing only.
  • It does not claim AI outputs are more or less creative than human outputs.
  • Attitudes could shift as exposure to AI-written content grows.

If you use AI, use it intentionally

AI can speed up drafts and research, but readers still want a human mind on the page. Treat AI like a studio assistant. You're the author of record.

Sources: University of Michigan News, Journal of Experimental Psychology


