AI Sings Like Kishore Kumar: Tribute or Digital Grave-Robbing as Millions Tune In and the Law Lags

An AI-cloned Kishore Kumar track went viral, igniting a fight over tribute versus exploitation. Laws lag; experts push consent, licenses, labels, watermarks, and platform checks.

Published on: Jan 06, 2026

Digital grave-robbing or creative tribute? AI-cloned Kishore Kumar vocals rack up millions of views, and a moral bill

An AI-altered version of Saiyaara went viral with a voice that sounded like Kishore Kumar. Not a remix. Not a cover. A recreation of the voice of a legend who passed away decades ago.

The track pulled 20-30 million views on YouTube and spread across social feeds, funneling ad revenue to whoever posted it. That momentum has set off a hard question for India's music scene: where does creative play end and exploitation begin?

The ethical fault line

Veteran singer Shaan called the cloning of iconic voices "cruel" and "unfair." You can copy pitch and texture. You can't copy the life that gave the voice its weight.

That's the heart of it. AI can simulate the sound. It can't carry the history, the imperfections, or the meaning behind it.

The legal gray in India

India doesn't clearly recognise personality or publicity rights in statute. Courts have offered limited protection to living celebrities against misuse of likeness or voice, but that protection weakens after death.

As Dhruv Garg of IGAP notes, posthumous control over a singer's voice is shaky at best, and it's unclear how courts will handle large-scale cloning. That gap invites copycats and confuses platforms.

Innovation vs. consent

Kazim Rizvi of The Dialogue urges a measured view. He sees generative AI opening new creative doors while acknowledging real concerns about consent and legacy.

His stance: don't ban the tech; govern it. Build clear, predictable rules for posthumous rights, and use claim-based notice-and-takedown so tribute, creativity and lawful use can exist without trampling dignity.

"Digital grave-robbing," or a new format?

Cyber and AI laws expert Advocate (Dr.) Prashant Mali is blunt: this is "digital grave-robbing." Recreating a legend's voice without consent or license turns legacy into a commodity and monetises a reputation that can't fight back.

He calls for mandatory licensing, watermarking of AI voices, and platform liability for hosting unverified synthetic content. His bottom line: personality rights shouldn't die with the person; heirs or estates should hold them, or they should be guarded as cultural heritage.

The commercial imbalance

AI-cloned tracks often ship without licensing, attribution or revenue sharing. Original artists, composers and rights holders get nothing while synthetic versions rake in views.

Platforms, meanwhile, struggle to detect what's legitimate, what's a cover, and what's a synthetic impersonation. At scale, that confusion becomes a business model.

The cultural cost

Younger listeners may begin to associate AI-generated performances with real artists. Over time, that blurs the line between historical authenticity and algorithmic imitation.

If we're not careful, collective memory gets edited by whichever version goes most viral.

What working creatives can do right now

  • Use consent as a baseline: get written approval from estates or rights holders before cloning a deceased artist's voice.
  • License everything: clear composition, publishing and master rights. If in doubt, don't release.
  • Label clearly: disclose "AI-generated voice in the style of [Artist]" in titles, descriptions and audio watermarks.
  • Keep provenance: save prompts, models and training sources. You'll need proof if a takedown or dispute hits (see the sketch after this list).
  • Offer revenue share: if an estate exists, propose splits and caps. Treat it like a commercial sample.
  • Avoid identity confusion: never imply endorsement, collaboration or authorship by the deceased artist.
  • Develop your own voice models: train on licensed datasets and vocalists who consent, then brand the sound as yours.
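To make the provenance point concrete, here is a minimal Python sketch of the kind of record a creator could save alongside each release. The field names, the ai_provenance.json filename and the save_provenance helper are illustrative assumptions, not a standard; the idea is simply to capture the prompts, model, training sources, license reference and a hash of the final audio so there is evidence if a dispute hits.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def save_provenance(audio_path: str, record: dict, out_path: str = "ai_provenance.json") -> dict:
    """Write a provenance manifest for an AI-assisted track (illustrative, not a standard)."""
    # Hash the released audio so the manifest can be matched to the exact file later.
    audio_bytes = Path(audio_path).read_bytes()
    record = {
        **record,
        "audio_sha256": hashlib.sha256(audio_bytes).hexdigest(),
        "created_utc": datetime.now(timezone.utc).isoformat(),
    }
    Path(out_path).write_text(json.dumps(record, indent=2))
    return record

# Example usage with hypothetical values:
# save_provenance(
#     "final_mix.wav",
#     {
#         "title": "Tribute track (AI-generated voice in the style of [Artist])",
#         "voice_model": "in-house-voice-v2",              # model actually used
#         "training_sources": ["licensed session vocals"], # datasets with consent
#         "prompts": ["verse 1 melody guide ..."],         # prompts/settings kept as evidence
#         "license_reference": "estate-license-2026-001",  # written approval, if any
#     },
# )
```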

What labels, publishers and estates can put in place

  • Posthumous rights policy: define voice and likeness use, approval flows and pricing for licensing.
  • Standard licenses: fast, transparent terms for AI voice use, including scope, territory, duration and attribution.
  • Whitelist/blacklist: approve certain use cases (tribute, education) and prohibit others (misleading ads, political content).
  • Technical signals: require inaudible watermarks and metadata tags for any licensed AI output (see the tagging sketch after this list).
  • Takedown playbook: clear notice templates, evidence thresholds and escalation paths with major platforms.
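As one possible way to carry the metadata-tag requirement, here is a small sketch that writes disclosure and license tags into an MP3's ID3 metadata using the mutagen library. The AI_VOICE_DISCLOSURE and AI_VOICE_LICENSE_ID tag names are assumptions for illustration, not an industry standard, and inaudible audio watermarking itself would need a dedicated tool on top of this.

```python
from mutagen.id3 import ID3, ID3NoHeaderError, TXXX

def tag_licensed_ai_output(mp3_path: str, artist_style: str, license_id: str) -> None:
    """Attach disclosure and license metadata to a licensed AI vocal track (illustrative tag names)."""
    try:
        tags = ID3(mp3_path)   # load the file's existing ID3 tags
    except ID3NoHeaderError:
        tags = ID3()           # file had no tags yet; start fresh
    # User-defined text frames carrying the disclosure and the license reference.
    tags.add(TXXX(encoding=3, desc="AI_VOICE_DISCLOSURE",
                  text=f"AI-generated voice in the style of {artist_style}"))
    tags.add(TXXX(encoding=3, desc="AI_VOICE_LICENSE_ID", text=license_id))
    tags.save(mp3_path)

# tag_licensed_ai_output("tribute.mp3", "Kishore Kumar", "estate-license-2026-001")
```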

What platforms should do next

  • Mandatory disclosure: creators must label synthetic or cloned vocals; noncompliance triggers reduced reach or removal (a minimal upload-check sketch follows this list).
  • Rights registries: verify estates and rights holders; enable pre-clearance for licensed AI uses.
  • Detection and watermarking: support open watermark standards and invest in voice-clone detection at upload.
  • Revenue holds: escrow ad revenue on disputed tracks until ownership and consent are verified.
  • Repeat-offender penalties: stronger strikes and monetisation bans for serial impersonation.
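To show how the disclosure and revenue-hold items could combine at upload time, here is a minimal platform-side sketch. The Upload fields, the moderate function and the action names are hypothetical; the code only illustrates the decision flow the list describes: undisclosed synthetic vocals lose reach, unlicensed clones are asked for proof, and disputed tracks have ad revenue held until rights are verified.

```python
from dataclasses import dataclass

@dataclass
class Upload:
    title: str
    synthetic_vocals: bool   # detector or creator flags cloned/synthetic vocals
    disclosed: bool          # creator labelled the track as AI-generated
    license_verified: bool   # pre-cleared against a rights registry
    rights_disputed: bool    # an estate or rights holder has filed a claim

def moderate(upload: Upload) -> list[str]:
    """Return the platform actions for one upload (hypothetical decision flow)."""
    actions = []
    if upload.synthetic_vocals and not upload.disclosed:
        actions.append("reduce_reach")            # mandatory disclosure missing
    if upload.synthetic_vocals and not upload.license_verified:
        actions.append("request_license_proof")   # no pre-clearance on record
    if upload.rights_disputed:
        actions.append("escrow_ad_revenue")       # hold revenue until ownership is verified
    return actions or ["publish_normally"]

# moderate(Upload("Saiyaara (AI cover)", synthetic_vocals=True,
#                 disclosed=False, license_verified=False, rights_disputed=True))
```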

A path forward

AI will keep pushing into creative work. The solution isn't fear or blind hype; it's consent, clarity and accountability.

Set rules that respect legacy and still let new formats thrive. Do the paperwork. Label the work. Share the upside with the people who built the culture.


Level up your AI practice without crossing lines

If you're a creator or producer building an AI workflow, get smart on consent, licensing and disclosure before you ship. It's faster than cleanup after a strike or a lawsuit.

Explore courses by job to build ethical, market-ready AI skills that won't put your catalog at risk.

