Artists Against the Slop Beast: Molly Crabapple's Fight to Stop AI From Cannibalizing Creativity
Artists push back as AI mimics their styles and replaces paid work. Organize, demand consent and transparency, add AI-off clauses to contracts, and sell the process and community that machines can't fake.

In 2022, artist Molly Crabapple saw images across the web mimicking her Aleppo skyline drawings and portraits of protesters. Text-to-image tools like DALL·E, DreamStudio, and Stable Diffusion had scraped her work, along with billions of other images, to spit out sloppy facsimiles. As she put it, the point isn't to match the art; it's to be "good enough" to replace the worker.
That attitude has spread. AI summarizes search results, ghostwrites homework, and poses as a therapist. Critics argue it lifts copyrighted work without consent and erodes our ability to think by outsourcing the creative grind that makes ideas real.
A Growing Pushback
Tech executives claim AI will soon be unavoidable, wiping out a large share of entry-level white-collar jobs and pushing unemployment much higher. But advancement isn't destiny. Artists are organizing to slow deployment, demand consent, and keep human work valued.
On a rainy night in the Lower East Side, Crabapple joined tech editor Edward Ongweso Jr and the DSA Tech Action Working Group for a workshop: "Artists Against the Slop Beast: How AI is destroying creative work and how to fight back!" Their case: AI is being pushed for profit at public cost, through mass surveillance, data extraction, and the replacement of labor that people actually want to do.
The fallout is already here. Illustrators are losing gigs as companies test prompts instead of paying professionals. In 2023, Crabapple published an open letter urging media to reject generative AI; over 4,000 signed. As she said, it takes a special contempt for craft to treat human effort as friction.
Meanwhile, companies are using AI to edit stories, summarize sports, and even justify layoffs. Political operators have embraced AI-generated memes for campaigns and messaging. The common thread: scale content, shrink payroll, and hope no one notices the quality drop.
What Creatives Can Do Right Now
- Put "AI-off" clauses in your contracts. Ban training on your work, style mimicry, and AI-generated substitutions without written consent and payment. Add kill fees and penalties for violations.
- Demand dataset transparency. Ask vendors where training data comes from and for provenance logs. Check if your work is in major datasets with Have I Been Trained, then opt out where possible.
- Pass internal rules. In studios, collectives, publications, and classrooms: bar AI for drafts, statements, marketing assets, or pitches. Make the default "human-first"; exceptions require sign-off.
- Shame bad actors. If a brand pushes AI slop, say so, clearly and publicly: tell them it looks uncool. Public pushback works; companies back down when the work looks cheap.
- Register your work. Timely copyright registration strengthens your position in disputes and takedowns. Read the U.S. Copyright Office's current guidance on AI and copyright.
- Use provenance and watermarks. Keep layered files, timestamps, and visible marks. It won't stop scraping, but it helps you prove authorship and spot fakes.
- Publish with intent. Share finals, not raw bundles. Avoid uploading high-res assets that train well. Stagger releases on platforms that ignore opt-out tags.
- Collectivize your leverage. Coordinate with unions, professional orgs, and tech-worker allies to push platform policies, minimums, and consent standards.
- Sell what AI can't fake. Process, taste, and community. Offer live sessions, commissioned narratives, limited editions, and behind-the-scenes access. Style can be copied; your story cannot.
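If you control your own site, you can also express the opt-out at the domain level. A sketch of a `robots.txt` that asks the major AI training crawlers to stay out; the user-agent names below (OpenAI's GPTBot, Common Crawl's CCBot, Google's Google-Extended) are real but subject to change, and compliance is voluntary, so treat this as one layer, not a shield.

```
# robots.txt: ask AI training crawlers to skip the whole site
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```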
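The provenance advice above can be made concrete with a small script. A minimal sketch, assuming a local file whose name you supply (the log filename `provenance.log` is a hypothetical choice): compute a SHA-256 fingerprint and a UTC timestamp, then append them to a log kept alongside your layered files. This won't stop scraping, but it gives you a verifiable record of what you held and when.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def provenance_record(path: str) -> dict:
    """Fingerprint a file so you can later show you held it at this time."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return {
        "file": path,
        "sha256": digest,
        "recorded_utc": datetime.now(timezone.utc).isoformat(),
    }

def append_log(record: dict, log_path: str = "provenance.log") -> None:
    # One JSON object per line: easy to read back, awkward to quietly rewrite.
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")
```

To anchor the timestamp outside your own machine, consider also posting the hash publicly (for example in a dated social-media post) when you publish the finished piece.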
If Your Team Uses AI Anyway
- No style mimicry of living artists. Ban prompts that cite names or "in the style of" references.
- Human review is mandatory. No AI output publishes without a human editor and clear accountability.
- Label AI assistance. Disclose when AI helped and to what extent. Don't pass machine output as solo authorship.
- Protect your data. Disable product "training" on your inputs. Keep sensitive files off third-party tools.
- Prefer licensed or public-domain sources. Build moodboards from cleared materials; keep a paper trail.
The Line to Hold
The message from organizers is simple: you can refuse slop. Pass rules where you work. Push clients and platforms to get consent, pay fairly, and disclose use. And when companies try to cash in with machine-made knockoffs, call it out, loudly.
Creativity is a practice, not a prompt. Treat it that way, and others will have to as well.