Stealing Isn't Innovation: Hundreds of Creators Rally, Nearly 60 Lawsuits Press AI for Fair Licensing

Hundreds of creators launched "Stealing Isn't Innovation," demanding AI licensing, credit, and pay. About 60 U.S. suits raise the stakes: build with consent, or expect pushback.

Categorized in: AI News Creatives
Published on: Jan 26, 2026

"Stealing Isn't Innovation": Creatives Unite to Demand Fair AI Licensing

On January 24, 2026, hundreds of prominent artists, actors, musicians, and writers launched a national campaign with a clear stance: training AI on copyrighted work without permission or payment isn't progress; it's theft.

The campaign, "Stealing Isn't Innovation," lands as nearly 60 lawsuits work through U.S. courts, setting up a decisive moment for creators and AI companies alike.

Key Points

  • Campaign launched January 24, 2026, uniting hundreds of top creators against unauthorized AI training on copyrighted works.
  • Nearly 60 active U.S. lawsuits by January 22, 2026, challenge AI firms for using protected content without consent or compensation.
  • Creators are calling for fair licensing deals as the ethical, sustainable path forward.

What the Campaign Asks For

The goal isn't to block AI. It's to make AI responsible. The coalition calls for licensing agreements and partnerships that respect creator rights and pay for use: simple, practical, enforceable.

As noted on the campaign site via TechRadar: "Some of the biggest tech companies … are using American creators' work to build AI platforms without authorization or regard for copyright law. It's not progress. It's theft - plain and simple."

The Legal Pressure Is Building

By January 22, 2026, nearly 60 lawsuits were active in the U.S., with creators and rightsholders challenging AI companies for training on copyrighted content without consent. The outcomes could define how training data is sourced and licensed for years.

At the core is a simple question: does training on copyrighted material qualify as fair use, or is it a violation? Courts are now being asked to draw the line.

Who's Involved, and Why It Matters

Signatories include Scarlett Johansson, Cyndi Lauper, Common, Joseph Gordon-Levitt, and many others across film, music, literature, and digital media. They're defending more than individual careers: they're defending a sector that fuels jobs, GDP, and American cultural influence.

As quoted by Variety, the message is blunt: "Stealing our work is not innovation. It's not progress. It's theft - plain and simple."

Personal Stakes for Creators

Scarlett Johansson has pushed back on unauthorized use of her likeness and performances, from legal action in 2023 to public criticism in 2024 over AI voice and video imitations. This isn't abstract; it's about control, credit, identity, and livelihood.

Cate Blanchett warned at TIFF 2024 that "innovation without imagination is a very, very dangerous thing." Joseph Gordon-Levitt joined Blanchett in a 2025 open letter urging the White House to uphold copyright law. The creative community is speaking with one voice.

A Better Way Exists

The campaign doesn't reject AI. It asks companies to do what some already do: license content, credit contributors, and build partnerships. This is the clean path that keeps innovation moving while paying the people who fuel it.

Practical Steps for Creatives Right Now

  • Update contracts and rates: Add explicit clauses for AI training, dataset use, and synthetic derivatives. Price them separately.
  • Mark your work: Use clear licensing notices and metadata that state "No AI training without license." Publish terms on your site and portfolios.
  • Control access: Review platforms you use-opt out of AI training where possible, and choose services that respect creator controls.
  • Document everything: Keep timestamps, originals, and version history to prove authorship. Track suspected misuse.
  • Organize: Join guilds, collectives, or associations aligned with the campaign to amplify your leverage with policymakers and platforms.
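One concrete way to act on the "control access" step above is to disallow known AI crawlers in your site's robots.txt. A minimal sketch in Python, assuming you can publish a robots.txt at your site root; the user-agent strings listed (GPTBot, ClaudeBot, CCBot, Google-Extended) are publicly documented AI crawlers, but the helper function itself is illustrative, and robots.txt is advisory rather than enforceable:

```python
from pathlib import Path

# Publicly documented user agents of major AI training crawlers.
# This list is illustrative and will need updating as new crawlers appear.
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "CCBot", "Google-Extended"]

def build_robots_txt(crawlers: list[str]) -> str:
    """Return robots.txt rules disallowing each crawler site-wide."""
    blocks = [f"User-agent: {name}\nDisallow: /" for name in crawlers]
    return "\n\n".join(blocks) + "\n"

if __name__ == "__main__":
    # Write the rules to a local robots.txt for upload to your site root.
    Path("robots.txt").write_text(build_robots_txt(AI_CRAWLERS))
```

Pair this with human-readable licensing terms on your site; robots.txt only signals intent to compliant crawlers, so it complements, rather than replaces, contractual and metadata-based notices.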

What AI Companies Should Do

  • License datasets from rights holders and pay for use.
  • Offer opt-in and opt-out controls with transparent reporting.
  • Share economic upside with creators through royalties or revenue shares.
  • Disclose training sources and respect takedown requests.

What's Next

Courts may set the first guardrails, but creators aren't waiting. The coalition is pressing lawmakers and tech leaders to set clear standards: ask, license, credit, pay.

The signal is clear: build AI with consent and compensation, or expect resistance to grow.
