Fake News Exposed: Government Debunks AI-Generated Video Attributing Statement to EAM Jaishankar
On March 10, 2026, government fact-checkers debunked a viral video falsely attributing statements to External Affairs Minister Dr S. Jaishankar. The video claimed he said the nation would not tolerate Muslim countries harassing Israel, and that Israel granted 3 billion dollars to the Afghan Taliban at India's request. The Fact Check Unit of the Press Information Bureau confirmed the video is AI-generated and the claims are fake. No such statement was made by the External Affairs Minister.
What was claimed
- EAM Dr S. Jaishankar allegedly warned against Muslim countries "harassing Israel."
- Israel allegedly gave 3 billion dollars to the Afghan Taliban at India's request.
What the government confirmed
- The circulating video is AI-generated and misleading.
- The statements attributed to the External Affairs Minister are fabricated and were never made.
- Citizens should not share the clip and should rely on verified, official sources.
Why this matters for public servants
Fakes of this nature can distort public understanding, trigger diplomatic confusion, and erode trust in official communications. One misleading clip can outrun formal channels within minutes. Your role: slow the spread, verify fast, and push clear, consistent updates.
Immediate actions for departments
- Route all media queries to your designated communications lead; avoid off-the-cuff comments.
- Share the official correction across your internal channels and social media handles.
- Tag and reference the official fact-check to give staff and the public a single source of truth. See: PIB Fact Check.
- Flag impersonation accounts or edited media to platform trust-and-safety teams through official reporting flows.
- Coordinate with MEA/PIB before issuing any department-level statements that touch foreign policy.
How to vet suspect videos in under 10 minutes
- Source check: Is the clip from an official handle or verified press event? If not, treat as unverified.
- Cross-reference: Look for the same quote on official transcripts, press releases, or verified posts.
- Visual tells: Watch for lip-sync drift, unnatural blinking, warped edges near the mouth and eyes, and mismatched lighting or shadows.
- Audio tells: Listen for a robotic timbre, oddly placed breaths, or background noise that doesn't match the setting.
- Technical checks: Run reverse image/video searches; examine upload timestamps and account history.
- Document and escalate: Save links, screenshots, and hashes; escalate to your fact-checking or cyber team.
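The documentation step above can be semi-automated. The sketch below is a minimal, illustrative example (the function names `fingerprint_file` and `evidence_record` are our own, not part of any official tooling): it hashes a saved copy of the clip with SHA-256 and bundles the link, hash, and capture time into a single record your fact-checking or cyber team can act on.

```python
import datetime
import hashlib
import json


def fingerprint_file(path):
    """Compute a SHA-256 hash of a saved media file, so the evidence
    record can be matched against later re-uploads of the same clip."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files don't exhaust memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def evidence_record(path, source_url, notes=""):
    """Bundle the link, hash, and UTC capture time into one JSON record
    ready to hand to the escalation team."""
    return json.dumps({
        "source_url": source_url,
        "sha256": fingerprint_file(path),
        "captured_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "notes": notes,
    }, indent=2)
```

Because the hash is deterministic, two independently saved copies of the same file produce the same fingerprint, which helps platforms and fact-checkers confirm they are looking at the identical clip.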
If your team needs a quick primer on how these fakes are produced and detected, see Generative Video.
Suggested holding statement for departments
"We are aware of a video circulating online that misattributes statements to [Official/Ministry]. The content is fabricated. Please refer to verified updates from our official channels and the government's fact-check unit. Sharing misinformation harms public trust and policy clarity."
Build resilience inside your organisation
- Stand up a rapid verification cell with clear SLAs (e.g., triage in 15 minutes, decision in 60).
- Keep an updated runbook: escalation tree, legal review triggers, and approved message templates.
- Maintain a single, searchable archive of debunked items for staff reference.
- Train spokespersons on short, repeatable lines that reinforce verified sources and discourage speculation.
For training and frameworks built for the public sector, explore AI for Government.
Bottom line
The claims in the viral video are false, and the footage is AI-generated. Anchor your communications to official fact-checks, move fast on takedowns and clarifications, and keep your teams trained to spot and stop synthetic media before it shapes the narrative.