800 Creatives Rally Against "AI Slop": Stealing Isn't Innovation
Roughly 800 artists, writers, actors, and musicians have signed on to a new campaign calling out "large-scale theft" by AI companies, according to The Verge. The roster includes George Saunders, Jodi Picoult, Cate Blanchett, Scarlett Johansson, R.E.M., Billy Corgan, and The Roots. The message is blunt: copying creative work without consent or payment isn't progress.
As the campaign states, "Driven by fierce competition for dominance in new GenAI technology, profit-hungry tech companies … have copied vast amounts of creative content from the internet without authorizing or compensating the creators." The result, they warn, is an information ecosystem swamped by misinformation, deepfakes, and an artificial flood of inferior material: "AI Slop."
Who's backing it
The Human Artistry Campaign is supporting the push, bringing together groups across entertainment and sports. That includes the Recording Industry Association of America (RIAA), unions for professional athletes, and performer unions like SAG-AFTRA. Expect full-page ads, social campaigns, and consistent pressure on platforms and policymakers.
What the campaign wants
- Clear licensing agreements for training and use of creative works
- A workable enforcement environment that actually deters abuse
- A true opt-out right for artists who don't want their work used to train generative AI
Why this matters for working creatives
Unlicensed scraping doesn't just hit your pocket; it undercuts your brand when fakes and low-effort replicas saturate feeds. As deepfakes and synthetic content multiply, attribution blurs and trust erodes. That's bad for discovery, bad for rates, and bad for long-term careers.
Licensing is becoming the uneasy truce
At the federal level, President Donald Trump and allied tech voices are pushing to centralize control over how states regulate AI, often pressuring states that pursue stricter rules. At the industry level, former opponents are meeting in the middle: rights holders and tech firms are signing more licensing deals that allow AI training and remixing under defined terms.
Major record labels are partnering with AI music startups to open catalogs for remixes and training. Digital publishers, some of whom have filed suits, are pushing for licensing standards that keep their work out of AI search unless paid for. Several media companies have already cut deals that let chatbots display their content.
Practical steps to protect your work now
- Lock down your terms: add explicit AI training and synthesis clauses to your contracts, briefs, and licensing pages.
- Control access: use robots.txt and server rules to block known AI scrapers and set clear usage permissions where possible.
- Prove provenance: use content credentials/watermarking and keep dated source files; this helps with disputes and takedowns.
- Register your work: formal registration strengthens your position on enforcement and damages. See the U.S. Copyright Office's AI resources.
- Track misuse: set alerts for your name, titles, and distinctive phrases; document evidence before you file complaints.
- Negotiate AI rights upfront: fees, credit, opt-in scope, dataset transparency, and deletion terms if consent is withdrawn.
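For the access-control step above, a `robots.txt` file is the most common starting point. A minimal sketch follows; note that compliance is voluntary, and the user-agent names shown are examples of crawlers that vendors have publicly documented — check each vendor's current documentation before relying on them:

```
# robots.txt — ask known AI training crawlers to stay away
# (user-agent names are illustrative; verify against vendor docs)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Everyone else: normal access
User-agent: *
Allow: /
```

Because `robots.txt` is advisory, pair it with server-level rules (for example, denying requests from those user agents in your web server configuration) for scrapers that ignore it.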
Use AI on your terms
You can experiment with AI and still protect your catalog. Favor tools that disclose sources, offer commercial-safe models, and respect opt-outs. If you're exploring visual workflows, here's a curated starting point: AI tools for generative art.
The bottom line
This campaign isn't anti-tech; it's pro-consent and pro-compensation. Licensing, opt-outs, and enforceable rules don't stifle progress; they make creative ecosystems sustainable. If AI companies want our work, they can ask, license, and pay, like everyone else.