New York Enacts AI Disclosure and Postmortem Likeness Laws Aimed at Ads and Entertainment
New York Gov. Kathy Hochul signed what her administration calls the nation's first AI-focused bills aimed at the media and entertainment sector. One law requires clear disclosure when an advertisement includes AI-generated synthetic performers. A second, related law requires consent from heirs or executors before using a deceased individual's name, image, or likeness for commercial purposes.
What the laws cover
The advertising measure is straightforward: if an ad uses an AI-generated synthetic performer, the producer or creator must disclose it. Expect this to apply across TV, digital, social, and out-of-home placements where a "performer" appears to be a real human.
The postmortem provision governs commercial use of an individual's identity after death. Use without consent from heirs or an authorized representative is prohibited, aligning with the broader push to curb deceptive digital replicas.
Context: Why this matters now
The move follows months of lobbying and mounting concern about generative tools in production workflows and advertising. California passed a different AI law earlier this year, but its focus is broader AI safety rather than the specific needs of media and entertainment.
At a signing event hosted at SAG-AFTRA's New York headquarters, Gov. Hochul framed the balance this way: "We do want to embrace innovation… But not to the detriment of people. That has to be the dividing line." SAG-AFTRA National Executive Director Duncan Crabtree-Ireland added that guardrails on "digital replicas and synthetic creations" are essential to maintain public trust.
Immediate takeaways for legal teams
- Audit creative pipelines: Identify any use of AI-generated performers in advertising assets (including background characters, voice doubles, or composites).
- Implement disclosure workflows: Add a disclosure step to creative briefs, production checklists, and media trafficking so notices ship with the asset.
- Update agreements: Insert representations, warranties, and indemnities around AI use for agencies, post houses, VFX vendors, and media partners.
- Talent provisions: Clarify consent for digital replicas, voice models, and body doubles. Add separate consent captures for synthetic uses.
- Recordkeeping: Maintain audit trails for prompts, models used, training data provenance (where feasible), and approvals that support disclosures (see the sketch after this list).
- Claims review: Align ad copy and disclosures with truth-in-advertising standards to avoid deception, especially where synthetic performances could mislead.
- Postmortem rights: Build a process to verify authority from heirs or executors before using deceased individuals in campaigns, titles, or promotions.
- Escalation: Create a cross-functional review (legal, production, brand, privacy) for any campaign involving synthetic performers or digital replicas.
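For teams that want a concrete starting point, the sketch below shows one way an asset-level record could carry both the audit trail and the disclosure flag so the notice ships with the creative. The field names, types, and the SyntheticAssetRecord class are illustrative assumptions, not terms drawn from the statute.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class SyntheticAssetRecord:
    """Illustrative audit-trail record attached to a single creative asset."""
    asset_id: str                                   # internal identifier for the ad or cut
    uses_synthetic_performer: bool                  # triggers the disclosure requirement
    disclosure_text: Optional[str] = None           # the notice that ships with the asset
    models_used: List[str] = field(default_factory=list)   # generative tools or vendor models
    prompts: List[str] = field(default_factory=list)        # prompts, where retained
    training_data_provenance: Optional[str] = None           # provenance notes, where feasible
    approvals: List[str] = field(default_factory=list)      # legal / brand sign-offs
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def disclosure_complete(self) -> bool:
        """A synthetic performer requires a non-empty disclosure before trafficking."""
        return (not self.uses_synthetic_performer) or bool(self.disclosure_text)
```

Storing a record like this alongside each deliverable makes it easier to answer both the disclosure question at trafficking time and a later audit or discovery request.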
Contract language to consider
- Definition of "synthetic performer" and "digital replica," tracked to statutory language once published.
- Mandatory disclosure obligations on vendors when AI tools are used to generate a performance or likeness.
- Consent terms for talent (including scope, duration, revocation, compensation triggers for synthetic use).
- Postmortem use: proof of authority, scope of permitted exploitation, territorial limits, and takedown obligations.
- Indemnities for unauthorized use of name, image, or likeness, and for failure to provide required notices.
- Media-specific obligations (CTV, social, UGC ads, influencer content) ensuring the disclosure travels with the asset.
Enforcement and risk posture
Details on enforcement and effective dates will be in the final statutory text and any agency guidance. Until then, assume plaintiffs will test claims around deceptive practices, false endorsement, and right of publicity where disclosures are missing or unclear.
For national campaigns, harmonize New York's rules with existing federal guidance on AI-related advertising claims and deception. The FTC's recent posts on AI and advertising offer helpful direction on avoiding misleading representations.
Industry signals
The signing drew strong support from SAG-AFTRA leaders, who emphasized that the laws aim to protect artists while allowing innovation. As Crabtree-Ireland put it, transparency and consent are now baseline expectations, not afterthoughts.
With studios and streamers eyeing 2026 negotiations, the legal standard set in New York is likely to influence contract templates and negotiating positions well beyond the state. Counsel should expect higher scrutiny on digital replicas, deepfake influencers, and AI spokespersons used in marketing.
Action checklist
- Map where AI shows up in your content supply chain (script, voice, performance, post/VFX, localization).
- Create standardized disclosure language for synthetic performers across formats and placements.
- Refresh talent, influencer, and licensing templates to cover replicas and postmortem rights.
- Set a gating review before release for any creative that may trigger the disclosure requirement (see the sketch after this checklist).
- Train marketing, production, and vendor managers on the new obligations and escalation paths.
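One way to operationalize that gating step is a simple pre-release check that blocks trafficking whenever a synthetic performer appears without an attached notice or sign-off. The dictionary keys and the release_gate function below are illustrative assumptions about how a team might structure its own review, not requirements drawn from the law.

```python
def release_gate(asset: dict) -> tuple:
    """Pre-release check: return (approved, reason) for a single creative asset.

    Expects keys such as 'uses_synthetic_performer', 'disclosure_text', and
    'approvals'; these names are illustrative, not statutory terms.
    """
    if asset.get("uses_synthetic_performer"):
        if not asset.get("disclosure_text"):
            return False, "Synthetic performer present but no disclosure attached."
        if not asset.get("approvals"):
            return False, "Synthetic use lacks legal/brand sign-off."
    return True, "Cleared for release."


# Example: a campaign cut with an AI voice double and no notice attached yet
print(release_gate({"uses_synthetic_performer": True, "disclosure_text": ""}))
```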
As the final bill text and guidance become available, legal teams should verify definitions, thresholds, and remedies, then tune contracts and workflows accordingly. Aligning early will reduce rushed fixes later and lower the risk of takedowns or disputes mid-campaign.
Source: New York Governor's press releases.