Wikipedia bans AI-generated content after volunteer editors vote on accuracy concerns

Wikipedia banned AI-generated content after a community vote by its volunteer editors. Founder Jimmy Wales called current AI output a "mess," saying models aren't reliable enough for encyclopedia entries.

Categorized in: AI News Writers
Published on: May 09, 2026

Wikipedia Bans AI-Generated Content Following Community Vote

Wikipedia has prohibited contributors from using artificial intelligence to generate or rewrite articles, following a vote by the site's volunteer editor community. The policy takes effect immediately across the platform's 7.1 million English-language articles.

Editors cited a core problem: large language models frequently produce text that lacks verifiable sources or subtly distorts facts. Wikipedia's founding principle requires that all content be traceable to reliable sources.

The ban includes two narrow exceptions. Contributors may use AI for machine-assisted translations and for suggesting minor copy edits to their own writing, provided a human reviews the changes and no new, unverified information enters the text.

What This Means for Writers

If you're a writer considering AI tools, Wikipedia's decision reflects a real constraint: automated systems struggle with accuracy and attribution. Generative AI and LLM technologies can produce plausible-sounding text that fails verification, a problem Wikipedia treats as disqualifying.

The platform's founder, Jimmy Wales, described current AI-generated information as a "mess," saying models are "nowhere near good enough" for drafting reliable entries.

For writers working in fields that require sourced material, such as journalism, research, and policy work, this distinction matters. AI can assist with revision and translation, but not with generating factual claims. AI for Writers resources can help you identify where AI tools add value and where human judgment remains essential.

The Broader Context

Wikipedia's choice contrasts with the broader web. ChatGPT and similar tools now exceed Wikipedia in monthly traffic, despite their documented accuracy problems. Wikipedia is betting that human-verified facts will remain more valuable than automated content at scale.

The policy reinforces a simple principle: knowledge work requires accountability. Someone's name goes on the claim. Someone verifies the source. Wikipedia's editors decided that AI generation breaks that chain.

