AI isn't good enough to write an encyclopedia - and why that matters for writers
Wikipedia co-founder Jimmy Wales isn't surprised by the criticism aimed at Grokipedia, the Elon Musk-backed encyclopedia that launched in October. He's looked at it "a little bit" and points to the trust problem: "I'm not sure anyone would trust an encyclopedia that has a biased attitude."
Musk markets Grokipedia as "the world's largest and most accurate source of knowledge… with no restrictions." Wales counters with a simple reality most writers already know: authority comes from process, not claims. If you want readers to trust you, you need verifiable sources, transparent editing, and consistent standards.
What Wales said about Grokipedia
Wales says he starts a dialogue when he doesn't like something about a service. He suggests Musk prefers to change what he doesn't like from the top down. That tracks with reports from former X employees that, after acquiring Twitter, Musk pushed the platform to prioritize his own posts.
The LLM problem: hallucinations kill trust
Wales uses large language models and still sees a constant issue: hallucinations. His take is blunt - "Large language models are really not good enough for writing an encyclopedia." If your work depends on facts, that matters.
Hallucinations aren't a rounding error; they are a structural risk for factual writing. If you use AI for research, you need human verification and source-backed corrections at every step. For background, Wikipedia's own article on AI hallucination is a useful primer.
Grokipedia right now: growth, gaps, and copy concerns
Since launch, Grokipedia has moved from "v0.1" to "v0.2," growing from 885,279 to 1,089,057 articles. Users have noted some pages are almost identical to Wikipedia entries and sometimes mirror Musk's political views. Musk calls it a "modern-day Library of Alexandria" and says he wants to send it into deep space. Big vision, but the work still has to earn trust on the page.
There's also a coverage gap. Grokipedia doesn't offer a Ukrainian-language version. Searching "Ukraine" returns 36 English-only results. For an encyclopedia that aims to be universal, this is a credibility and accessibility problem.
The backstory: Wales vs. Musk
The two have a long-running public clash. Musk once offered a billion dollars to rename Wikipedia to "Dickipedia" and questioned why the Wikimedia Foundation needs significant funding to operate. Wales previously criticized Musk for restricting critics of Turkish President Recep Tayyip Erdogan ahead of key elections, after Musk complied with Turkey's requests to limit content. The BBC covered that episode at the time.
What this means for writers
If you write for a living, treat AI as a powerful assistant, not an oracle. Use it to ideate, summarize, and draft - then lean on your editorial judgment, sourcing, and voice to produce work you can stand behind.
Practical guardrails for your editorial workflow
- Define "AI can help" vs. "human-only" tasks. Research prompts and outlines are fine; final facts and claims require human review.
- Build a source stack. Prioritize primary sources, official documents, and peer-reviewed material over AI outputs.
- Cite as you go. Add links and dates while drafting to reduce lazy errors later (a small script to flag unsourced figures appears after this list).
- Fact-check every claim that could cost you trust, money, or legal pain. Do spot checks even on "obvious" facts.
- Run plagiarism checks. If AI drafting is part of your process, ensure originality and proper attribution.
- Do a bias pass. Ask, "Whose perspective is missing?" and align with your editorial standards.
- Use version control. Log changes and corrections so you can show your work if questioned.
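If your drafts live in Markdown or plain text, the "cite as you go" check can even be scripted. The sketch below is a rough heuristic, not a standard tool: the draft filename, the paragraph-splitting, and the "any figure implies a checkable claim" rule are all assumptions you'd tune for your own workflow.

```python
# Rough heuristic sketch (assumptions: Markdown draft, paragraphs separated by
# blank lines, "contains a number" approximates "contains a checkable claim").
import re
import sys

LINK = re.compile(r"\[[^\]]+\]\([^)]+\)|https?://\S+")   # Markdown link or bare URL
CLAIM = re.compile(r"\b\d[\d,.%]*\b")                     # crude: any figure or year

def flag_uncited(path: str) -> None:
    """Print paragraphs that cite a figure but carry no link."""
    text = open(path, encoding="utf-8").read()
    for i, para in enumerate(text.split("\n\n"), start=1):
        if CLAIM.search(para) and not LINK.search(para):
            preview = " ".join(para.split())[:80]
            print(f"Paragraph {i}: figure with no source -> {preview}...")

if __name__ == "__main__":
    flag_uncited(sys.argv[1] if len(sys.argv) > 1 else "draft.md")
```

It won't catch everything and it will flag some false positives, so treat its output as a to-do list for manual sourcing, not a verdict.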
A simple workflow you can ship daily
- Prompt for outline ideas and angles.
- Collect 3-5 credible sources per section.
- Draft in your voice; use AI for expansions, examples, and trims.
- Replace every AI-generated fact with a verified source.
- Run a bias and clarity pass; cut anything you can't support.
- Plagiarism and link check; add dates and context (a minimal link-check sketch follows this list).
- Publish, monitor feedback, and update fast when new facts land.
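For the link-check step, a few lines of Python go a long way when the draft is a text file. This is a minimal sketch under stated assumptions (the draft filename, a 10-second timeout, and HEAD requests that some servers reject); it only confirms a URL loads, not that it actually supports your claim.

```python
# Minimal link-check sketch (assumptions: Markdown/plain-text draft, HEAD requests,
# 10-second timeout). A reachable URL is not the same as a supporting source.
import re
import sys
import urllib.error
import urllib.request

URL = re.compile(r"https?://[^\s)\"'>]+")

def check_links(path: str) -> None:
    """Report every unique URL in the draft that fails to load."""
    text = open(path, encoding="utf-8").read()
    for url in sorted(set(URL.findall(text))):
        req = urllib.request.Request(url, method="HEAD",
                                     headers={"User-Agent": "draft-link-check/0.1"})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                status = resp.status
        except urllib.error.HTTPError as e:
            status = e.code
        except OSError as e:  # DNS failures, timeouts, refused connections
            print(f"UNREACHABLE {url} ({e})")
            continue
        if status >= 400:
            print(f"BROKEN ({status}) {url}")

if __name__ == "__main__":
    check_links(sys.argv[1] if len(sys.argv) > 1 else "draft.md")
```

Run it before you hit publish and again whenever you update the piece.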
Grokipedia's promise vs. practice
Ambition is easy. Trust is expensive. An encyclopedia - or any reference-grade content - can't skip the slow, sometimes boring parts: sourcing, review, and accountability. That's the gap Wales is pointing at, and it's the same gap writers have to close in their own process.
Further learning for writers using AI
- Curated tools for copywriters that actually save time without wrecking quality: AI tools for copywriting
Bottom line: AI can help you write faster. It can't own the responsibility of being right. That's on us.