800 Creatives Push Back on AI Training: "Stealing Isn't Innovation"
Smokey Robinson, The Roots, and Yolanda Adams have joined nearly 800 writers, musicians, and actors in an open letter calling AI training on copyrighted work "theft." The statement, organized under the Human Artistry Campaign's Stealing Isn't Innovation movement, challenges tech companies that train models on human-made content without permission or payment.
"Big Tech is trying to change the law so they can keep stealing American artistry to build their AI businesses - without authorization and without paying the people who did the work. That is wrong; it's un-American, and it's theft on a grand scale."
"Artists, writers, and creators of all kinds are banding together with a simple message: Stealing our work is not innovation. It's not progress. It's theft - plain and simple."
What the movement wants
The campaign urges licensing, consent, and clear opt-outs for any AI training. "Real innovation comes from the human motivation to change our lives," said Human Artistry Campaign senior advisor Dr. Moiya McTier. "But AI companies are endangering artists' careers while exploiting their practiced craft, using human art and other creative works without authorization to amass billions in corporate earnings."
Meanwhile, tech firms argue that scraping publicly available material can qualify as "fair use." Some are signing licenses (e.g., OpenAI with major media groups; Warner Music Group with AI music generator Suno), but much training activity remains contested.
Why this matters to working creatives
If models are trained on your catalog, your voice, or your style, they can generate close imitations at scale. That pressures rates, confuses credit, and blurs authorship. The core asks are simple: consent, credit, compensation.
Practical steps you can take now
- Lock down your contracts: Add "no AI training/use" clauses for recordings, stems, manuscripts, artwork, and likeness. Require written permission for any dataset use.
- Set web opt-outs: Update robots.txt to block major crawlers (e.g., GPTBot, CCBot, Google-Extended). Add meta tags like noai/noimageai where supported. Label files with content credentials (C2PA) and visible terms banning model training.
- Register your rights: File copyrights for songs, lyrics, scripts, and visual works. Use takedown procedures if your work is cloned or distributed without permission.
- License on your terms: Offer paid, permissioned access for datasets you're willing to share; explicitly prohibit model training where you're not.
- Monitor your footprint: Track datasets and tools that may include your work (e.g., image/music datasets, voice libraries). Request removals where available.
- Organize: Join coalitions and guilds advocating for consent-based training and fair compensation. The Human Artistry Campaign is a starting point: humanartistrycampaign.com
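The robots.txt opt-out mentioned above can be sketched as follows. This is a minimal example, not a guarantee: robots.txt is advisory, so compliant crawlers will honor it while others may ignore it, and the user-agent tokens shown (GPTBot, CCBot, Google-Extended) should be checked against each operator's current documentation before you rely on them.

```
# robots.txt — placed at the root of your site
# Ask common AI-training crawlers not to fetch any pages.

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Note that Google-Extended is a control token rather than a separate crawler: it tells Google not to use content fetched by its existing crawlers for AI training, and blocking it does not remove your site from ordinary search results.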
The legal horizon
Lawmakers introduced the No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI FRAUD) Act in January 2024 to protect people's voices and likenesses against AI-generated fakes. Follow its progress on Congress.gov. Broader questions around training data and fair use remain unsettled and will likely be shaped by upcoming rulings and new legislation.
Bottom line for creatives
AI isn't a blank check to reuse your life's work. Push for consent-based licensing, make your opt-outs visible, and keep your contracts current. Whether courts lean toward fair use or creator control, protecting your catalog and your likeness starts with clear terms and consistent enforcement.
Want to build practical AI literacy without compromising your IP? Explore resources built for your role: Courses by Job.