Business Insider lets reporters draft with AI, won't tell readers
Business Insider will allow AI-written first drafts without routine disclosure; reporters remain responsible for the final piece. The policy may spread to other newsrooms and test reader trust.

Business Insider reportedly OKs AI for first drafts, without reader disclosures
Business Insider has reportedly told its journalists they can use AI to draft stories and won't routinely disclose that use to readers. The guidance, shared in an internal memo cited by the media newsletter Status, positions the outlet among the first to formally approve AI as a writing aid at scale.
The memo from editor-in-chief Jamie Heller reportedly frames AI as "like any other tool" for research, writing assistance, and image editing. An accompanying FAQ reportedly permits AI-assisted first drafts but requires that the final story be the reporter's own work. Disclaimers would be reserved for fully AI-generated or unvetted content, according to the report.
Why this matters for working writers
Policies like this will spread. Editors want speed, consistency, and lower production costs. If major publications normalize AI-assisted drafting without disclosures, freelancers and staffers will be pushed to match output while keeping quality high.
The catch: accountability doesn't change. Your name sits on the byline. If a model fabricates facts, misquotes sources, or lifts phrasing too closely, you own the error. That means you need a tight process: faster drafts, stronger verification.
What BI reportedly changed
- AI is allowed for research, outlining, drafting, and image editing.
- Reporters can use AI for first drafts, but the final copy must be theirs.
- No routine reader disclosure for AI assistance; labels only for fully AI-generated or unvetted material.
- Writers remain fully responsible for accuracy and originality.
A practical workflow that protects your name
- Set intent and angle: Write a one-paragraph brief with the thesis, target reader, must-include sources, and non-negotiable facts.
- Outline with AI, not facts: Ask for structure and questions to answer. Prohibit invented citations. List real sources you will verify.
- Draft fast, then rewrite slow: Let AI produce a rough pass. Rewrite in your voice. Change structure. Replace generic claims with reported details.
- Verify every claim: Cross-check names, dates, numbers, and quotes with primary or reputable sources. If you can't verify it, cut it.
- Source hygiene: Add links to original documents or credible outlets. Avoid circular citations and untraceable "reports say" language.
- Originality check: Run a plagiarism/overlap scan. Replace boilerplate phrasing. Attribute unique ideas.
- Hallucination sweep: Search for any source, study, or quote mentioned by the model. If it doesn't exist, remove and replace.
- Image integrity: For any AI-edited visuals, follow your client's standards and keep edit notes. Label synthetic images where required.
- Process log: Keep a simple note of what AI did (outline, draft, headlines) and what you did (reporting, rewrites, edits). This protects you in edits or disputes.
Freelancers: contract checkpoints
- Ask if AI use is permitted and whether disclosure is required.
- Clarify responsibility for errors from AI suggestions; indemnity and kill-fee terms matter.
- Confirm expectations on sourcing, originality thresholds, and image manipulation.
Risks to manage right now
- Source theft and attribution: Models can echo phrasing from training data. Attribute ideas and quotes properly.
- Confidentiality: Don't paste embargoed or sensitive notes into third-party tools without permission.
- Style drift: AI tends to flatten voice. Schedule a human pass dedicated to rhythm, specificity, and story logic.
- Speed traps: Faster drafts can hide shallow reporting. Set a minimum list of sources and documents per story.
Industry backdrop
Newsrooms are split on AI. Some see efficiency; others see legal and trust risks. Business Insider has already been experimenting: it appointed an AI newsroom lead, tested an AI search tool, and operates under parent company Axel Springer's licensing deals with tech firms including OpenAI and Microsoft.
The stakes are clear: AI has raised questions about business models, training data, and liability. And BI itself reportedly dealt with AI-generated stories from a supposed freelancer earlier this summer, proof that governance, not just access, decides outcomes.
Questions to ask your editor or client
- Which story parts can use AI (outline, draft, headlines, translation, image edits)? Which cannot?
- Do we disclose AI assistance to readers? If yes, where and how?
- What are the sourcing and verification requirements for AI-suggested claims?
- Which tools are approved, and how is data handled?
- What is the policy on synthetic images or AI-edited photos?
Bottom line
AI can shorten the distance from idea to draft. It does not reduce your duty to report, verify, and write with a distinct voice. If your name is on it, own the process end to end.
Helpful resources
- Society of Professional Journalists Code of Ethics
- Associated Press guidance on AI use