Grammarly pulls "Expert Review" after backlash: what writers need to know
Grammarly has shut down its "Expert Review" AI feature after it mimicked the style and voice of well-known writers and academics without their consent. A class-action lawsuit in New York targets Superhuman, Grammarly's parent company, alleging unlawful commercial use of real names, with claimed damages exceeding $5m (£3.7m).
The tool generated editing suggestions "inspired by" people such as Stephen King, Neil deGrasse Tyson, and the late Carl Sagan. Critics say it let the model hallucinate advice while using real identities as a selling point.
Tech journalist Casey Newton, who was included, wrote: "[Grammarly] curated a list of real people, gave its models free rein to hallucinate plausible-sounding advice on their behalf, and put it all behind a subscription … That's a deliberate choice to monetise the identities of real people without involving them, and it sucks."
Vanessa Heggie noted that the historian David Abulafia, who died in January, appeared in the product, calling it "obscene". Investigative journalist Julia Angwin is the lead plaintiff. "Editing is a skill … it's my livelihood," she said, adding she hadn't considered it something that could be stolen. Her lawyer, Peter Romer-Friedman, said more than 40 people reached out within 24 hours of filing.
Grammarly, launched in 2009 as a grammar checker, added generative features last year, including Expert Review, pitched as subject-matter feedback for academic and professional writing. Superhuman CEO Shishir Mehrotra apologised, saying experts raised valid concerns that the agent misrepresented their voices. He said the feature was already being taken down for a redesign and had little usage, while also calling the legal claims "without merit".
Why this matters to working writers
Your name and voice are assets. If an AI claims to channel "your" style to sell feedback, that's a direct hit to your brand, your rates, and your relationship with readers and clients.
There's also law in play. New York's right of publicity (Civil Rights Law §§50–51) protects against unauthorized commercial use of a person's name, portrait, picture, or voice; §50-f extends certain protections to deceased individuals.
Immediate steps to protect your name, voice, and style
- Audit your tools: Review settings in writing apps and AI assistants. Disable any "style transfer," "voice," or "persona" features that could map or imitate you.
- Update your contracts: Add clear language banning use of your name, likeness, voice, or "style" in products, training, demos, marketing, or agents without written permission and compensation.
- Add a training opt-out: State that your drafts, edits, and deliverables cannot be used to train models or to build personas that resemble you.
- Control your byline: Prohibit vendors from labeling automated feedback as "inspired by" you or presenting AI advice under your name.
- Track provenance: Keep timestamps, drafts, and edit logs. If your name shows up inside a tool, screenshots and logs matter.
- Monitor mentions: Set alerts for your name + "AI," "expert review," "style," "persona," and "voice." Include alternate spellings.
- Escalate fast: If you're impersonated, document evidence, file a written complaint, request removal, and consider legal counsel for right-of-publicity or false endorsement claims.
- For estates: If you manage a deceased writer's rights, review state protections for post-mortem publicity and add explicit licensing terms.
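The provenance and monitoring steps above can be partly automated. Here is a minimal sketch, assuming a local workflow: it fingerprints a draft with a timestamped SHA-256 hash (useful evidence that a given version existed at a given time) and builds search-alert query strings pairing your name variants with the keywords suggested earlier. The function names, labels, and keyword lists are illustrative, not part of any product.

```python
import hashlib
import json
from datetime import datetime, timezone

def fingerprint_draft(text: str, label: str) -> dict:
    """Record a timestamped SHA-256 hash of a draft for a provenance log."""
    return {
        "label": label,
        "sha256": hashlib.sha256(text.encode("utf-8")).hexdigest(),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

def alert_queries(names: list[str], keywords: list[str]) -> list[str]:
    """Pair each name variant with each keyword as a quoted search query."""
    return [f'"{n}" "{k}"' for n in names for k in keywords]

entry = fingerprint_draft("Draft text of my essay.", "essay-v1")
queries = alert_queries(["Jane Q. Writer"], ["AI", "expert review", "persona"])
print(json.dumps(entry, indent=2))
print(queries)
```

Appending each fingerprint to a dated log file, alongside your normal drafts, gives you the timestamped trail the "Track provenance" step calls for.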
If you use AI in your workflow
You can benefit from AI without putting your reputation at risk. Set ground rules now so you're covered with clients, editors, and platforms.
- Consent-first: Don't invoke real people's names, personas, or "inspired by" prompts without permission.
- Attribution: Be honest about where feedback comes from. Don't label AI output as advice from a living or deceased expert.
- Boundaries: Use generic style targets ("concise, plain language") instead of specific people.
- Documentation: Note when and how AI assisted in edits, especially for sensitive or high-stakes work.
- Client policy: Add a one-page AI policy covering data use, confidentiality, and no-impersonation clauses.
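The documentation point above can be as lightweight as an append-only disclosure log. A minimal sketch, with field names that are illustrative rather than any standard, serialising one AI-assistance record per line so it can sit next to a deliverable:

```python
import json
from datetime import datetime, timezone

def ai_usage_entry(project: str, tool: str, purpose: str, human_reviewed: bool) -> str:
    """Serialise one AI-assistance disclosure record as a single JSON line."""
    record = {
        "project": project,
        "tool": tool,
        "purpose": purpose,
        "human_reviewed": human_reviewed,
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record)

line = ai_usage_entry("client-report", "generic-llm", "copy-edit suggestions", True)
print(line)
```

One line per edit session is enough to answer a client's "where did AI touch this?" question months later.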
For practical workflows, ethics, and safeguards, see AI for Writers.
Bottom line
Imitation sells, but consent is non-negotiable. If a tool trades on real names and voices, it's a risk to your reputation, income, and legal standing. Protect your identity now, and make sure any AI in your stack respects it.