WME Opts All Clients Out of OpenAI's Sora Amid Hollywood Likeness Rights Backlash
WME pulled its roster from OpenAI's Sora, rejecting cameos and likeness use by default. The move pressures platforms on consent, control, and pay as Hollywood tests AI video.

WME has told OpenAI to remove all of its clients from Sora's latest update. The talent agency is drawing a clear boundary: no likeness use, no cameos, no assumptions.
Sora now supports dialogue, sound effects, and a "cameos" feature that places a real person into AI-generated scenes. WME's position is simple: artists should choose how their image shows up, and who profits from it.
Why this matters
Tension around AI in Hollywood is rising. Actors, writers, VFX artists, and other creatives see both efficiency gains and real job risk.
Recent flashpoints include Tilly Norwood, an AI-created character whose debut sparked pushback over models trained on professional performers' work without permission or compensation. SAG-AFTRA has been vocal about consent and pay for AI uses of members' work and likenesses.
OpenAI's stance, and what's changing
OpenAI says users control their likeness in Sora's cameos. You can grant or revoke access and remove videos with your cameo at any time.
Following pushback, Sam Altman said rights holders will get "more granular control" over character generation and hinted at a revenue-sharing model for those who opt in. He also acknowledged edge cases will slip through and that policy and tooling will need iteration.
Prior to the update, OpenAI contacted agencies and studios, signaling that IP holders would need to explicitly opt out to prevent inclusion. WME's move flips that default for its roster.
The bigger fight
Tech companies lean on fair use arguments for training on publicly available content. Some studios are experimenting with AI for preproduction tasks like storyboarding, while others, including major studios, have filed lawsuits alleging copyright infringement.
Action steps for creatives
- Assert likeness rights in writing. Update contracts, NDAs, and licenses to include explicit AI-use clauses covering voice, image, movement, and biometric data.
- Set a default policy: opt out by default unless pay, control, and revocation rights are in place. Apply the same standard to all vendors and platforms.
- Centralize approvals. Route any AI-related requests through your rep or a designated manager. Don't rely on platform settings alone.
- Track usage. Monitor social and video platforms for unauthorized AI versions of your work or image. Use takedown flows early.
- Decide your "cameo" rules. If you ever allow AI cameos, define context, tone, brand guidelines, and where the content can live. Require the ability to revoke.
- Know your union and guild updates. If you're a member, follow official guidance for consent, compensation, and enforcement.
- Talk to counsel before signing platform terms. Auto-consent can hide in TOS updates.
- Experiment with guardrails. If you test AI video tools, use licensed assets, stock with model releases, and private workflows.
- Evaluate revenue-share offers carefully. Model the upside against risk to brand and future negotiating power.
Bottom line
WME drew a line so creatives don't have to fight this battle one by one. AI video is moving fast, but control over your likeness and IP is non-negotiable.
Expect more opt-out defaults, tighter controls from platforms, and new licensing models. Until those are real and enforceable, protect your image in writing and choose your use cases with intention.