PIB Fact Check Debunks AI-Altered Video Falsely Attributing Remarks to Army Chief Upendra Dwivedi

PIB fact-checkers flagged an AI-altered clip wrongly attributed to Army Chief Gen Upendra Dwivedi. The clip is fake: rely on official channels and report any links if it surfaces.

Categorized in: AI News, Government
Published on: Nov 05, 2025

Government debunks fake AI video of Army Chief Upendra Dwivedi

A digitally altered video circulating on social media falsely attributed a statement to Indian Army Chief General Upendra Dwivedi. The Fact Check Unit of the Press Information Bureau confirmed the clip was created using artificial intelligence and amplified by Pakistani propaganda accounts. General Dwivedi made no such statement. Officials and citizens are urged to rely on verified, official sources.

What happened

The fake video used AI to fabricate a message and attribute it to the Army Chief. PIB's Fact Check team reviewed the content and labeled it manipulated. The advisory is straightforward: treat the clip as false and refrain from sharing or engaging with it.

Why this matters for government personnel

Misinformation targeting senior leadership can trigger confusion, erode trust, and create avoidable operational noise. Quick verification and a consistent response across departments protect public confidence and support national security objectives.

Immediate steps for departments

  • Do not share, repost, or comment on the fake video from official handles.
  • Route all media queries to your department's media cell or spokesperson.
  • Issue a brief clarification on official channels, referencing PIB Fact Check.
  • Report the content links to platform grievance mechanisms and your cyber cell.
  • Log incidents: source link, time, platform, actions taken, and escalation details (a minimal logging sketch follows this list).
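
For departments that want a consistent record across teams, the sketch below shows one way to capture those incident fields. It is a minimal example, assuming Python and a JSON Lines file named incident_log.jsonl; the field names are illustrative, not an official schema.

```python
import json
from datetime import datetime, timezone

def log_incident(source_link, platform, actions_taken, escalation, path="incident_log.jsonl"):
    """Append one structured record per sighting of the fake clip."""
    record = {
        "logged_at": datetime.now(timezone.utc).isoformat(),  # time the sighting was logged (UTC)
        "source_link": source_link,      # URL where the clip appeared
        "platform": platform,            # hosting platform name
        "actions_taken": actions_taken,  # list of actions, e.g. reporting and takedown requests
        "escalation": escalation,        # who was notified, ticket or reference numbers
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

if __name__ == "__main__":
    # Example entry; the link and platform name here are placeholders.
    log_incident(
        source_link="https://example.com/post/123",
        platform="ExamplePlatform",
        actions_taken=["reported via platform grievance form", "flagged to cyber cell"],
        escalation="media cell and cyber cell notified",
    )
```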

Verification tips for suspected AI videos

  • Look for lip-sync drift, unnatural blinking, or mismatched lighting and shadows.
  • Check audio for robotic tone, abrupt cuts, or background that doesn't match the setting.
  • Cross-verify with official handles before reacting. If it's real, there will be a release.
  • Use reverse image/video search for earlier versions of the clip (see the frame-extraction sketch after this list).
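
Reverse video search typically starts from still frames. Below is a minimal sketch, assuming Python with OpenCV installed (pip install opencv-python) and a placeholder file name suspect_clip.mp4, that samples a frame every few seconds so the stills can be uploaded to a reverse image search.

```python
import cv2

def extract_frames(video_path, every_n_seconds=5, out_prefix="frame"):
    """Save one JPEG every N seconds for manual reverse image search."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if the file reports no frame rate
    step = max(1, int(fps * every_n_seconds))
    saved, index = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            cv2.imwrite(f"{out_prefix}_{saved:03d}.jpg", frame)
            saved += 1
        index += 1
    cap.release()
    return saved

if __name__ == "__main__":
    count = extract_frames("suspect_clip.mp4")  # placeholder file name
    print(f"Saved {count} frames for reverse image search")
```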

Official sources

  • PIB Fact Check on X: @PIBFactCheck
  • Press Information Bureau: pib.gov.in

Recommended capability-building

For teams strengthening AI literacy and misinformation response playbooks, see curated options by role here: Complete AI Training - Courses by Job.

Bottom line: the video is fake, the attribution is false, and the record has been corrected. Keep communications tight, use official sources, and document your response if the clip surfaces in your channels.

