Sci-Fi Writers and Comic-Con Are Drawing a Hard Line Against Generative AI
Two big signals just landed: SFWA's Nebula Awards and San Diego Comic-Con have moved to block AI-generated work. If awards, shows, or pro credits matter to you, treat this as policy, not noise.
What changed at SFWA and the Nebulas
In December, SFWA updated the Nebula Awards rules. An initial draft would have let authors disclose LLM use and left the decision to voters. Backlash was immediate, and SFWA apologized: "Our approach and wording were wrong, and we apologize for the distress and distrust we caused."
The final revision is clear: any work created wholly or partially by LLMs is ineligible, and any use of AI at any stage disqualifies a submission. Writer Jason Sanford supported the move and flagged the real gray area: many "normal" writing tools now embed AI features. The policy is firm; the definitions still need care so honest writers aren't hit by collateral damage.
Comic-Con's art show closes the door
Comic-Con initially allowed AI art to be displayed but not sold. After artist pushback, the rule was tightened: "Material created by Artificial Intelligence (AI), either partially or wholly, is not allowed in the art show."
Glen Wooten, who runs the show, said stricter language was now needed given the surge in AI content. His stance: "NO! Plain and simple."
Why this matters for working writers
Originality and authorship are the currency of awards, juried shows, and pro markets. The signal from SFWA and Comic-Con is direct: if AI touched the work, it's out. Expect more organizations to align with similar rules this year.
That means your process is now part of your pitch: not just the prose on the page, but how you made it.
Action steps to protect your submissions
- Audit your workflow. List every tool you use. Turn off AI features in editors, grammar checkers, and search add-ons for award or show submissions.
- Keep drafts human-written. No LLM drafting, paraphrasing, rewriting, or line edits. For art, avoid AI-generated covers or concept pieces if the market bans them.
- Document your process. Save dated drafts. Keep a simple log of sessions and major changes. Screenshots beat debates later.
- Verify rules before you submit. Read guidelines line by line. If anything is unclear, email the organizer and keep the reply. Build a one-page policy sheet per market.
- Separate tasks. If you use AI for admin work (market research, formatting), keep it outside any file that will be submitted. If a market says "any use," don't use it at all.
- Update contracts and collab norms. Add a "no AI" clause for co-authors, editors, and designers when targeting awards or shows with bans.
- Strengthen human feedback loops. Critique groups, beta readers, and editors replace what some sought from AI tools, and they leave a clean audit trail.
Where this is heading
Policies will tighten, then clarify. Expect more venues to ban AI-assisted work for awards and juried showcases while allowing it elsewhere. The safest approach: if you plan to submit for recognition, keep AI out of the creative stream entirely.
If you're exploring AI for non-submission work (marketing, research, office tasks), learn the rules and keep your files clean and separate. For skill development and policy awareness, you can review AI courses by job to decide what belongs in your workflow, and what doesn't.
The Takeaway
Creative institutions are drawing clear lines. SFWA and Comic-Con say AI-generated or AI-assisted work won't qualify for awards or shows. If credit and career capital matter, write and submit human-only work, document your process, and keep your tools honest.