Jared Harris Issues Cease-and-Desist Over Podcast Deepfake, Citing Creators' AI Concerns

After a podcast ran an AI deepfake, Jared Harris sent a cease-and-desist to protect his likeness. Bottom line: get consent, tighten contracts, and clearly label synthetic media.

Categorized in: AI News, Creatives
Published on: Mar 05, 2026

Jared Harris Issues Cease-and-Desist Over Podcast Deepfake - A Clear Line for Creatives

Jared Harris has moved to protect his likeness after an AI-generated deepfake of him appeared in a new podcast. His lawyers issued a cease-and-desist to the team behind "Films Not Made," hosted by Oscar-nominated producer Amy Hobby ("What Happened, Miss Simone?") and director Avi Zev Weider ("Welcome to the Machine").

"My image is protected by contract and settled State and Federal law. I have not ceded that right," Harris said. The message is simple: consent isn't optional, and your name, face, and voice are not free assets for anyone's content pipeline.

Why this matters for creatives

AI has made cloning a person's voice and likeness cheap and fast. That convenience doesn't cancel legal rights. In many places, your "right of publicity" covers how your identity is used commercially - including synthetic versions meant to sound or look like you.

Framing a deepfake as "homage" or "fair use" is not a reliable shield; you still need explicit permission for likeness and voice use. Contracts, platform policies, and state laws can all apply - and the penalties can outweigh any short-term reach or novelty.

If you're a performer or creator: protect your likeness now

  • Get it in writing: Require clear, written consent for any use of your image, voice, or a synthetic version of either. No implied permissions.
  • Update your contracts: Add clauses that forbid training on your work and any synthetic use of your likeness or voice without approval. Include revocation rights and penalties.
  • Control your voice data: Limit high-quality raw samples you release. Watermark where possible and keep originals archived.
  • Own your brand signals: Register trademarks for your name/stage name where applicable and monitor for impersonation.
  • Have a response plan: Save a cease-and-desist template, track evidence (timestamps, URLs), and use platform takedown tools for impersonation or unauthorized likeness use.

If you're a podcaster or producer: reduce your legal risk

  • Always secure consent for voices and likenesses. Synthetic or "parody" isn't a shield.
  • Use licensed voices and models, and keep the license terms on file. Log prompts, outputs, and approvals (a minimal logging sketch follows this list).
  • Label synthetic media clearly and avoid implying endorsement where none exists.
  • Audit episodes pre-release for potential likeness or brand confusion. If in doubt, cut it or get written approval.
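
For production teams that want a concrete starting point on that logging step, here is a minimal sketch in Python using only the standard library. The file name, field names, and record shape are illustrative assumptions, not an industry standard; adapt them to whatever your legal counsel asks you to retain.

    import json
    import hashlib
    from datetime import datetime, timezone
    from pathlib import Path

    # Illustrative append-only log: one JSON record per generated asset.
    LOG_FILE = Path("synthetic_media_log.jsonl")

    def log_synthetic_asset(prompt: str, model: str, output_path: str, approver: str) -> None:
        """Record what was generated, with what, and who approved it."""
        digest = hashlib.sha256(Path(output_path).read_bytes()).hexdigest()
        record = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prompt": prompt,            # what was requested
            "model": model,              # which licensed model produced it
            "output_sha256": digest,     # fingerprint of the exact file used
            "approved_by": approver,     # who signed off before release
        }
        with LOG_FILE.open("a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")

    # Example (hypothetical values):
    # log_synthetic_asset("intro narration in host's style", "licensed-tts-v2",
    #                     "ep12_intro.wav", "producer@example.com")

An append-only JSONL file like this is easy to diff, back up, and hand to a lawyer, and the hash ties each approval to the exact file that shipped.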

The takeaway

Harris's hard line is a preview of how this will play out across film, TV, audio, and advertising. Consent, contracts, and clarity are the new baseline. If your work touches someone's identity - real or synthetic - treat it like licensed IP.

Next steps for creatives

  • Review your current agreements and add AI/likeness language where missing.
  • Set a simple policy for inbound requests: what you approve, what you refuse, and how you want credit and compensation handled.
  • Educate your team so no one "tests a model" with your voice or face without clearance.

Want deeper, practical training on this shift? Explore AI for Creatives and learn how to protect your work while using AI with intent. For voice-specific tools and defense strategies, see Voice Modulation.

