Scarlett Johansson, R.E.M., and Vince Gilligan join 700+ creators in push to stop AI theft

Scarlett Johansson and 700+ artists back a campaign urging tech to stop training AI on their work without consent. They want licensing, transparency, and rules against deepfakes.

Published on: Jan 23, 2026

Creatives push back on AI scraping: "Stealing isn't innovation"

Scarlett Johansson, R.E.M., Vince Gilligan, and more than 700 artists have signed on to a new campaign, "Stealing isn't Innovation," calling out tech companies for training AI on their work without consent. Their message is simple: stop using creative labor as free fuel and start licensing.

"America's creative community is the envy of the world and creates jobs, economic growth and exports," the campaign states. "But rather than respect and protect this valuable asset, some of the biggest tech companies are using American creators' work to build AI platforms without authorization."

The group calls it an "illegal intellectual property grab" that's flooding culture with "misinformation, deepfakes and a vapid artificial avalanche of low-quality materials ['AI slop']." Their stance: stealing isn't innovation; it's unpaid extraction at scale.

Why this matters

Some AI leaders have argued it's "impossible" to train models without copyrighted material because "copyright covers virtually every sort of human expression." Creatives disagree, especially when their likeness, voice, or catalog shows up repackaged as slop, spam, or worse.

Johansson previously threatened legal action in 2024 over a ChatGPT voice that closely mirrored hers. Meanwhile, reports continue to surface about AI systems generating deepfakes and sexualized images of real people, underscoring how much harm follows when consent and accountability are missing.

What the artists want

  • Consent-first data use: opt-in by default, not opt-out tricks.
  • Licensing and payment for training and synthetic uses of creative work.
  • Transparency on training data, datasets, and model capabilities.
  • Clear guardrails against impersonation, cloning, and deepfakes.
  • Provenance signals so audiences can tell what's synthetic and what's real.

What working creatives can do now

  • Put it in writing: add "no AI training/use without consent" to contracts, invoices, and site terms. Negotiate AI clauses for voice/image likeness, prompts, and datasets.
  • Register your work and keep clean archives. Paper trails matter for claims and licensing. See the US Copyright Office's AI resources for evolving guidance: copyright.gov/ai.
  • Add technical signals: robots.txt rules, the X-Robots-Tag header, and "noai"/"noimageai" meta tags (a minimal example follows this list). Not all companies honor them, but some do, and it strengthens your position.
  • Use content credentials to attach verifiable provenance to images, audio, and video. Start with the open standard here: c2pa.org.
  • Watermark and monitor. Set up reverse image/video searches and name alerts to catch misuse early.
  • License on your terms. Offer clear paid paths for training, reference, or derivative use; don't leave "maybe" on the table.
  • Report deepfakes and impersonations fast. File takedowns, notify platforms, and loop in unions or legal support where possible.
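
For the technical-signals step above, a minimal sketch might look like this. Treat it as a starting point, not a guarantee: crawler names such as GPTBot (OpenAI), CCBot (Common Crawl), and Google-Extended (Google's AI-training control token) are current examples and may change, and the "noai"/"noimageai" values are an informal convention rather than a formal standard, so check each company's own documentation before relying on any of them.

    # robots.txt - ask known AI crawlers to stay out entirely
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    # HTTP response header, set in your server or CDN config
    X-Robots-Tag: noai, noimageai

    <!-- meta tags in each page's <head> -->
    <meta name="robots" content="noai, noimageai">

None of this blocks a determined scraper on its own, but it puts your refusal on the record, which is exactly the paper trail the contract and registration steps above are meant to build.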

If you choose to use AI, do it on your terms

Plenty of creatives use AI as an assist without giving away rights or voice. The key is consent, attribution, and contracts that pay you for what's yours. If you're exploring AI in client work, learn the tools and the boundaries so you don't undercut your own value.

Start with practical skills that keep you in control: AI courses by job.

The bottom line

This isn't anti-tech. It's pro-creator. Consent, credit, cash: if a model needs our work, it should license it. Stealing isn't innovation.

