What Does It Mean To "Use AI" as a Writer?
Writers keep arguing about whether using AI is "right" or "wrong." The bigger miss: few agree on what "using AI" even means. Without a shared definition, we talk past each other, waste time, and fall for bait.
Here's a clear, workable line you can actually use in your craft and your client policy.
The Misunderstanding That Fuels the Fight
Two bad takes dominate the feed:
- "A dictionary is AI." No. Looking up a word in the Oxford English Dictionary is not the same as an LLM writing an ebook for you.
- "Typing in Word is AI." Also no. A word processor is a tool. It doesn't originate your ideas or your sentences.
These takes create fake arguments. They block the real discussion: where is the actual line for ethical, professional use?
The Simple Definition Writers Can Agree On
Using AI = using a model to generate the text that appears under your byline.
If an LLM writes your sentences (even if you edit later), that's the core of what people call "cheating." Everything else is secondary.
Green Light vs. Red Light Uses
Green Light (generally acceptable):
- Brainstorming topics or angles when you're stuck.
- High-level research as a starting point, followed by real verification.
- Spellcheck and grammar nudges.
- Loose outlines or headline ideas for commodity topics (you still write the copy).
Red Light (the line you shouldn't cross):
- Letting AI draft paragraphs you intend to publish under your name.
- Heavily "paraphrasing" AI output instead of writing your own text.
- Using AI to summarize sources, then lifting that summary into your piece.
Note: AI "facts" are unreliable. It still hallucinates. If you use it to orient your research, verify everything with primary sources. For context on hallucinations, see this overview from Nature.
Where AI Helps Without Stealing Your Voice
- Idea generation: Prompt for 20 angles, pick 2, discard the rest, and go write.
- Scope a brief: Ask for key points to cover in "rules of roulette" or "counting cards," then replace boilerplate with your experience, data, and voice.
- Research primer: Use AI to map topics or terms you should investigate, then go source the real thing.
The litmus test: if the final sentences are yours, you're fine. If the model authored them, you've crossed the line.
A One-Sentence AI Policy You Can Share With Clients
I never publish AI-generated text; I may use AI for ideation, basic research orientation, outlines, and quality checks, and then I write the final copy myself.
That's clear, honest, and enforceable.
The Sign to Tape Up for Writers
- AI can inform. It can't author your work.
- Your name, your sentences. If it's your byline, it must be your words.
- Verify everything. Treat AI outputs like rumors until proven true.
Practical Workflow That Respects the Line
- Kickstart: Generate angles and questions you should answer.
- Research: Collect sources, data, and quotes; verify everything manually.
- Outline: Draft your own structure. If you borrow an AI outline, rewrite it to fit your thesis.
- Write: From scratch. No AI drafting.
- Edit: Use tools for grammar, readability, and catch-alls, but keep your style intact.
For Writers Who Want to Use AI Without Losing Their Craft
If you want a responsible framework for integrating tools into your process, explore AI for Writers. To get better control over prompts for research, outlines, or editing (not drafting), see Prompt Engineering.
Bottom Line
Strip away the noise and the ragebait, and it's straightforward: using AI to create the text is the problem. Using AI to think better and work faster, while you write the words yourself, keeps your ethics, your edge, and your craft intact.