AI's Free Ride on Creative Labor Is Breaking the Market
Free markets work because ownership means something. You create, you own, you sell. Simple. When AI systems scrape your work to train a model that competes with you, that contract collapses.
OpenAI's "Sora 2" can spin up movie-quality video from a text prompt. Impressive engineering, sure. But if those outputs lean on vast libraries of copyrighted film and TV without licensed access, that's not progress. That's creative arbitrage.
As Hollywood reports continue to suggest, these models were trained on material built by people who financed, produced and protected their work under law. None of it was offered as free fuel. The incentive that funds the next project gets hollowed out when yesterday's work becomes tomorrow's training set for free.
Charles Rivkin of the Motion Picture Association said it straight: "You can't build a new business model on stolen property." Property rights aren't relics. They're the backbone of a free market. Remove them and what's left is digital squatting.
We're told this is "learning," not copying. But when a model ingests millions of works to mimic style and structure, the line gets thin. Call it what it is: replication without permission, scaled to infinity.
The harm is here, not hypothetical. Talent agencies already warn clients about real risks to jobs and income. Independent filmmakers, designers, writers - anyone publishing online - face a market where their work can be cloned in seconds, stripped of context, and monetized by firms that never paid to make it.
This isn't just a Hollywood problem. Face and voice cloning can target anyone. A jealous ex, a bitter coworker, or an opportunist can impersonate, embarrass or damage a reputation. That means protecting creative property and personal identity should sit at the same table.
Where Markets Start to Crack
Markets depend on fair exchange: you build something scarce, and you can sell it because no one can simply take it first. When models copy at scale, scarcity collapses. Small studios can't compete. Individual artists can't license what's already been scraped.
Consumers lose, too. Quality follows incentive. Remove the reward and you don't get better art - you get noise.
If a drug company copied a competitor's formula and called it "learning chemistry," we'd call it theft. If a startup mined a carmaker's blueprints and claimed fair use, same story. Why should creative work be treated differently?
Technology should grow markets, not drain them. A healthy economy pays creators, respects ownership and plays by the same rules for everyone. No one should have to compete against their own unpaid clone.
Accountability Beats Bans
We don't need a regulatory sledgehammer. We need clear lines. Copyright applies whether infringement is done by a person or by code. If you train on protected works, you pay for access - the way studios pay for music rights or image libraries. And transparency isn't optional. Creators and consumers deserve to know when "original" outputs ride on unlicensed inputs.
For context and ongoing policy updates, see the U.S. Copyright Office's resources on AI and copyright. For provenance tooling, explore open standards like C2PA content credentials.
What Creatives Can Do Right Now
- Register your works. It strengthens your position if you need to enforce rights.
- Add provenance and rights metadata (e.g., content credentials) and visible or invisible watermarks to new releases.
- Use "no-train/no-scrape" clauses in contracts and platform terms. Ask clients and distributors to honor them.
- Negotiate synthetic use rights separately (face, voice, style). Put compensation and consent in writing.
- Prefer platforms that offer dataset opt-outs and publish training disclosures. Vote with your uploads.
- Monitor for misuse with reverse image/video search and content tracking tools. Document, then act.
- Diversify revenue with direct-to-fan channels and memberships to reduce platform risk.
- Join collectives or guilds pushing for fair licensing frameworks and model transparency.
- Upskill on rights-aware AI workflows so you can use the tools without giving away the store.
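The "no-train/no-scrape" advice above can be partially automated for anyone who controls a website. Major AI crawlers publish the user-agent names they honor in robots.txt; a minimal sketch, noting that compliance is voluntary on the crawler's side and is no substitute for contract terms:

```
# robots.txt — ask known AI training crawlers to skip this site.
# Honoring these directives is voluntary; pair with contractual
# no-train clauses for anything enforceable.

User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: CCBot             # Common Crawl, widely used in training sets
Disallow: /

User-agent: Google-Extended   # Google's AI-training opt-out token
Disallow: /
```

Placing this file at the site root costs nothing and creates a documented, machine-readable statement that the work was not offered as training data.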
The Line We Have to Hold
We can celebrate useful tools and reject a system that treats creators as raw material. Letting AI companies rewrite ownership rules is a shortcut to monopoly, not growth.
Work has value. Protect that principle and we keep markets free, accountable, and human. Abandon it and value becomes whatever an algorithm can copy.