Behind the Work: How an AI-Powered Ayushmann Khurrana turned YouTube daydreamers into Agoda bookings
Agoda met people exactly where they plan trips: right before travel videos on YouTube. One master performance from Ayushmann Khurrana became 250+ hyper-personalized prerolls. The result: 150M+ impressions, 11M+ views, and a 27% lift in click-through rate.
The insight: hit travel intent in the moment
Travel planners use YouTube to research destinations and itineraries. By placing prerolls in front of popular travel content, Agoda found people in planning mode and gave them a timely nudge. Relevance came from context, not guesswork.
The pivot: from physical "portals" to scalable media
The early concept was an on-ground activation: immersive "Agoda Portals" in busy city spots. Smart, but limited. The team shifted to AI-led, search-triggered YouTube prerolls: digital "portals" that scale to millions with precision.
Why Ayushmann Khurrana
Ayushmann's presence adds warmth and familiarity. He leaned in on the idea of AI-replicated voice for personalization, seeing it as a new form of performance that could live across a large media plan, with consent and oversight baked in.
One master film, hundreds of versions
The team built a single, strong narrative designed to flex by destination. Shot on green screen, Ayushmann sits in a car; the window becomes a canvas that jumps to the viewer's searched location. AI handled the variations (swapping destinations, lines, and gestures) without reshoots.
Solving the hard parts
- Seamless window transitions: shot planning plus post work to make the car-to-destination shift feel effortless.
- Voice and lipsync: AI voice replication and multiple rounds of model tuning to match tone, timing, and mouth shapes.
- Music: an energetic, memorable track that carries the idea and lingers after the skip button.
The AI stack, in plain terms
GAN-based tools generated consistent variations of Ayushmann's voice, expressions, and micro-gestures. That let the team adapt message and visuals to each user's search intent while keeping the performance human and on-brand.
- GAN explainer (for context): IBM: Generative Adversarial Networks
- YouTube prerolls reference: Google Ads: Video ad formats
Human x AI: the guardrails
Automation handled scale. Humans set tone, performance, visual language, and narrative beats. Every version went through creative review to keep it natural, brand-safe, and true to Ayushmann's personality.
Timeline and workflow
Total duration: about two months. Three phases ran in sequence: production, GAN development, and post. The biggest time sink wasn't the model; it was sourcing and locking footage that paired cleanly with the AI output. Once visuals were locked, the pipeline moved fast.
Results that matter
- 250+ personalized assets from one hero film
- 150M+ impressions and 11M+ views
- 27% jump in CTR
What creatives can borrow (and ship next week)
- Start with a single adaptive narrative: write your script to support interchangeable destinations, lines, or props.
- Direct for consistency: lock tone, pacing, and framing so AI variations stay believable.
- Map search intent to scenes: list top destinations/keywords and pre-write dynamic lines for each.
- Shoot for edit: clean plates, steady framing, and clear eye-lines make swaps invisible.
- Set a strict QA loop: check lips, timing, pronunciations, and emotion per version.
- Pick a mnemonic track: audio is memory glue; keep it consistent across versions.
- Lock rights and consent: name, voice, and likeness approvals up front save pain later.
- Measure and iterate: tie versions to queries, watch watch-time and CTR, promote winners.
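The last point, tying versions to queries and promoting winners, can be sketched as a small script. This is a minimal illustration, not the campaign's actual tooling; the field names, sample numbers, and the 2% promotion threshold are all hypothetical:

```python
# Hypothetical sketch: rank personalized versions by CTR and flag winners.
# Stats, version names, and the 2% threshold are illustrative assumptions.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate; 0.0 when a version has no impressions yet."""
    return clicks / impressions if impressions else 0.0

def rank_versions(stats: dict[str, dict[str, int]], threshold: float = 0.02):
    """Return (winners, losers) sorted by CTR, split at `threshold`."""
    scored = sorted(
        ((name, ctr(s["clicks"], s["impressions"])) for name, s in stats.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
    winners = [(n, c) for n, c in scored if c >= threshold]
    losers = [(n, c) for n, c in scored if c < threshold]
    return winners, losers

# Example: per-version stats keyed by the search intent each version targets.
stats = {
    "goa-beaches": {"impressions": 50_000, "clicks": 1_500},  # 3.0% CTR
    "manali-trek": {"impressions": 40_000, "clicks": 400},    # 1.0% CTR
    "jaipur-forts": {"impressions": 30_000, "clicks": 900},   # 3.0% CTR
}
winners, losers = rank_versions(stats)
print([name for name, _ in winners])  # promote these
print([name for name, _ in losers])   # rework or retire these
```

In practice the stats would come from your ads reporting export; the point is the loop, not the plumbing: score every version, promote what clears the bar, iterate the rest.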
Why this matters for media planning
One film for everyone is easy, and forgettable. Personalization based on real behavior (like search) drives attention and trust because it speaks to what the viewer already wants. That shift changes creative, production, and media into one system.
Run this play on your next brief
- Identify 10-20 high-intent queries your audience uses.
- Write a modular script with swappable destinations and lines.
- Shoot one tight performance on green or a clean background.
- Use AI for voice/face/gesture variants; keep humans in review.
- Traffic versions to match keywords and placements on YouTube.
- Optimize weekly: kill low-performers, iterate winners, expand queries.
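The trafficking step above boils down to a keyword-to-version lookup. A minimal sketch, assuming a simple word-overlap match; the keyword sets and version IDs are hypothetical placeholders, not the campaign's real mapping:

```python
# Hypothetical sketch: route a viewer's search query to the creative
# version whose keyword set it best matches, with a generic fallback.
# Keywords and version IDs are illustrative assumptions.

VERSION_MAP = {
    "version_goa": {"goa", "beach", "baga"},
    "version_manali": {"manali", "himalaya", "trek"},
    "version_jaipur": {"jaipur", "rajasthan", "fort"},
}
FALLBACK = "version_generic"

def pick_version(query: str) -> str:
    """Match the query's words against each version's keyword set."""
    words = set(query.lower().split())
    best, best_hits = FALLBACK, 0
    for version, keywords in VERSION_MAP.items():
        hits = len(words & keywords)
        if hits > best_hits:
            best, best_hits = version, hits
    return best

print(pick_version("best beach resorts in goa"))  # matches the Goa version
print(pick_version("weekend city break ideas"))   # falls back to generic
```

Real campaigns would do this inside the ad platform's targeting rather than in code, but the logic is the same: every high-intent query resolves to exactly one version, and everything else gets the generic film.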
Credits
Concept and execution by Blink Digital, led by associate creative director Anuj Rathod and creative director Yogesh Shirke, in collaboration with actor Ayushmann Khurrana.
Build your AI-creative edge
If you want structured, practical upskilling on AI for creative work, explore these resources: