Javed Akhtar Slams Viral AI Deepfake as Dangerous Fake News, Weighs Legal Action

A viral deepfake falsely recasts Javed Akhtar as having turned to religion; he has called it "rubbish" and is weighing legal steps. Expect swift notices, injunction bids, and platform takedowns to follow.

Categorized in: AI News, Legal
Published on: Jan 04, 2026

Javed Akhtar Denounces Viral Deepfake: Legal Stakes, Platform Duties, and a Playbook for Counsel

New Delhi | January 3, 2026 - A viral AI-generated video falsely portrays Javed Akhtar as having undergone a religious turn, a claim he has publicly rejected as "rubbish." He has indicated he may pursue legal action against those creating and spreading the clip.

For legal teams, this is a clean example of reputational harm amplified by synthetic media, where impersonation, defamation, and intermediary obligations collide. The incident also spotlights enforcement gaps that can be closed with precise notices, rapid evidence work, and targeted injunctions.

What the clip claims, and why it matters

The video uses a computer-generated image of Akhtar in a topi and asserts he has "turned to God." He has flatly denied it, calling the clip fabricated and dangerous. The intent appears to be to mislead viewers and distort his publicly stated positions.

That combination of false attribution, manipulated imagery, and implied belief creates a classic defamation and impersonation scenario, heightened by the perceived authenticity of AI output.

Potential causes of action in India

  • Civil defamation: Suit for damages and permanent injunction against creators, distributors, and unknown parties via John Doe (Ashok Kumar) orders.
  • Criminal defamation: Complaint for reputational injury under the criminal defamation provisions currently in force (the relevant sections were renumbered when the Bharatiya Nyaya Sanhita, 2023 replaced the Indian Penal Code).
  • Information Technology Act, 2000: Section 66C (identity theft) and 66D (cheating by personation using computer resources) can fit fact patterns like deepfakes; Section 66E (privacy) may also be implicated depending on content.
  • Passing off/right of publicity: Misuse of name, likeness, and persona to attribute statements or beliefs he never made.
  • Intermediary liability touchpoints: Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which impose duties on grievance handling, due diligence, and action on court/government orders.

Two quick references: the government's deepfake advisory to platforms and stakeholders, and the IT Rules 2021, which define intermediary obligations.

Liability exposure: creators, amplifiers, and platforms

  • Creators/editors: Primary liability for fabrication, false attribution, and impersonation.
  • Amplifiers: Reposting with knowledge or after notice increases risk; intent and speed of removal matter.
  • Platforms: Safe harbor hinges on due diligence and prompt action on valid court/government orders and compliant grievances. Failure can erode protection.

Immediate counsel checklist

  • Preserve evidence: URLs, account handles, hashes of the files, server and device timestamps, notarized screenshots, and a short chain-of-custody note. Consider a quick OSINT capture and a hash-based archive (see the sketch after this list).
  • Technical review: Retain a forensic expert for media analysis (frame-level artifacts, audio-visual inconsistencies, metadata). Keep the report simple and court-ready.
  • Identify actors: Trace the earliest uploader and high-reach resharing accounts. Capture their profile data and identifiers before they rename or delete.
  • Platform notices: File detailed complaints via Grievance Officers. Include the legal basis (defamation, impersonation, Section 66C/66D), URLs, and hashes. Ask for takedown plus preservation of records.
  • Court relief: Seek an ex parte injunction, a John Doe order against unknowns, and a dynamic injunction to cover re-uploads and mirrors.
  • Law enforcement: File a cybercrime complaint citing applicable provisions. Request subscriber information preservation and expedited assistance.
  • Escalation plan: If spread crosses borders, prepare Mutual Legal Assistance Treaty (MLAT) or Letters Rogatory pathways and narrow, specific data requests.
  • Communications: Issue a concise public statement to reduce speculation; align messaging with filings to avoid inconsistency.
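The first bullet above mentions hashing captured copies and keeping a short chain-of-custody note. The Python sketch below shows one minimal way to do that with the standard library only: it computes SHA-256 digests of locally saved copies of the clip and writes a simple chain-of-custody JSON log. The file paths, URLs, handles, and custodian name are placeholders rather than details from this case, and the script is a starting point, not a substitute for a proper forensic capture tool.

```python
"""Minimal evidence-preservation sketch (placeholder paths and URLs)."""
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def record_evidence(items: list[dict], custodian: str, out_file: Path) -> None:
    """Write a chain-of-custody log: one entry per captured copy."""
    log = []
    for item in items:
        path = Path(item["local_copy"])
        log.append({
            "source_url": item["source_url"],          # where the copy was captured from
            "account_handle": item.get("handle", ""),  # uploader / resharer handle
            "local_copy": str(path),
            "sha256": sha256_of(path),
            "captured_at_utc": datetime.now(timezone.utc).isoformat(),
            "custodian": custodian,
        })
    out_file.write_text(json.dumps(log, indent=2))


if __name__ == "__main__":
    # Placeholder entries; replace with the actual captured copies before running.
    captured = [
        {"source_url": "https://example.com/post/123",
         "handle": "@example_uploader",
         "local_copy": "evidence/clip_copy_01.mp4"},
    ]
    record_evidence(captured, custodian="A. Counsel",
                    out_file=Path("chain_of_custody.json"))
```

Pairing the resulting log with notarized screenshots and the forensic expert's report keeps the same hashes consistent across the platform complaint, the notice, and any affidavit.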

Intermediary duties and timelines to leverage

Under the IT Rules, intermediaries must follow due diligence, run grievance mechanisms, and act on court or government orders "expeditiously." Significant social media intermediaries must maintain compliance officers, publish transparency reports, and offer notice-and-takedown workflows.

  • Grievance flow: Acknowledge user complaints within 24 hours and resolve within 15 days (or faster for specified categories).
  • Orders: Removal must follow lawful court directives or Section 69A government blocking orders. Attach certified copies in your notice.
  • Evidence: Ask platforms to retain logs and metadata to prevent spoliation while the matter is investigated.

Risk controls for clients

  • Provenance and watermarking: Adopt C2PA-style provenance where feasible; maintain original source files and creation logs for quick rebuttals.
  • Account hardening: Verified handles, 2FA, and published media kits to help the public distinguish genuine releases.
  • Monitoring: Set alerts for name, likeness, and keyword combinations; track first-seen copies and high-velocity shares (a minimal monitoring sketch follows this list).
  • Response drills: Pre-approve notice templates, affidavit shells, and a shortlist of expert declarants for fast filings.
  • Training: Brief spokespersons and team leads on how to respond without amplifying the fake.
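The monitoring bullet above calls for alerts on name and keyword combinations and for tracking first-seen copies. As one illustration, the sketch below polls a public RSS search feed for a set of queries and flags links it has not seen before. The Google News RSS endpoint, the example queries, and the seen-links file are assumptions; a real deployment would add platform-specific sources, scheduling, request headers, and alert routing.

```python
"""Minimal keyword-monitoring sketch using a public RSS search feed."""
import json
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET
from pathlib import Path

# Keyword combinations to watch (assumed examples; adjust per client).
QUERIES = ['"Javed Akhtar" deepfake', '"Javed Akhtar" AI video']
SEEN_FILE = Path("seen_links.json")


def fetch_items(query: str) -> list[dict]:
    """Fetch RSS items (title, link, published date) for a search query."""
    url = "https://news.google.com/rss/search?q=" + urllib.parse.quote(query)
    with urllib.request.urlopen(url, timeout=30) as resp:  # may need a User-Agent header in practice
        root = ET.fromstring(resp.read())
    return [
        {
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
            "published": item.findtext("pubDate", default=""),
        }
        for item in root.iter("item")
    ]


def main() -> None:
    # Load previously seen links so only new items are reported.
    seen = set(json.loads(SEEN_FILE.read_text())) if SEEN_FILE.exists() else set()
    for query in QUERIES:
        for item in fetch_items(query):
            if item["link"] and item["link"] not in seen:
                print(f"NEW: {item['published']}  {item['title']}\n     {item['link']}")
                seen.add(item["link"])
    SEEN_FILE.write_text(json.dumps(sorted(seen), indent=2))


if __name__ == "__main__":
    main()
```

Run on a schedule (for example via cron), this gives a rough "first seen" timestamp for each new copy, which feeds directly into the evidence log sketched earlier.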

If your in-house team is building AI literacy for incidents like this, a structured curriculum helps. See curated options by role here: AI courses by job.

What to watch next in Akhtar's case

He has signaled a likely legal course. Expect a combination of platform notices, a request for urgent injunctions, and a complaint with the cyber cell. If a court grants a dynamic order, re-uploads can be taken down faster, and platforms will be on clear notice.

Beyond this incident, the signal is strong: deepfake misuse invites swift civil and criminal exposure, and courts are increasingly receptive to early, sweeping orders where evidence preservation and precise pleadings are in place.

