BBC's Muslim Alim urges TV creatives to use generative AI to visualise formats
BBC commissioning editor Muslim Alim is urging TV creatives to get hands-on with AI, specifically to speed up development and make ideas visible fast.
In a recent LinkedIn post, he shared a hypothetical trailer for a horror competition format called "Dead Run" to show what a small stack of tools can produce when paired with focused creative direction.
What Alim showed
"Dead Run" isn't commissioned and wasn't based on a pitch brief. It's a proof of concept: 100 players evade 20 "zombies," with an overseeing "master" controlling power ups and protective shields. The last survivor wins, and contestants are pushed to hide, trade, betray, and survive.
Alim composed the music himself. The build relied on generative tools, followed by hours of human refinement on tone, characters, pacing, and gameplay.
In his words: "It was simply an idea that became so vivid in my head that I wanted to see if I could actually visualise it. So I built a trailer. All created using a small stack of AI tools and a lot of hours refining tone, characters, pacing and gameplay."
He added: "The strongest ideas don't need this level of visualisation, but it can absolutely help bring a world to life and stress-test whether something really works. As we head into 2026, anyone creating content needs to get comfortable with AI tools, not as shortcuts, but as creative accelerators."
Why this matters for creatives
- Faster development: move from logline to watchable proof in days, not weeks.
- Clearer pitches: align stakeholders on tone, stakes, and pacing without long decks.
- Better stress tests: find broken mechanics before writers' rooms or casting.
- Cheaper iteration: test multiple styles and rulesets without full production overhead.
A simple AI stack you can try this week
- Look and feel: generate moodboards and character art with image tools (e.g., Midjourney, Stable Diffusion).
- Motion: assemble concept footage via generative video or edit stock with cinematic prompts (e.g., Runway, Pika).
- Voices and temp score: use AI voiceover for drafts; consider composing or licensing for the final pass.
- Assembly: cut in Premiere, Resolve, or CapCut; layer SFX and motion graphics for clarity of rules and stakes.
48-hour workflow for a format proof
- Hour 0-2: Write the one-pager: premise, stakes, core loop, win condition, and 60-90-second trailer beats.
- Hour 3-6: Build a moodboard and 10-14 frame storyboard; generate key images for tone and locations.
- Hour 7-12: Create temp VO. Cut a rough with stock, generated shots, and animated cards to explain mechanics.
- Hour 13-24: Iterate pacing. Add SFX, overlays for rules, and 2-3 alt endings to test tension curves.
- Hour 25-48: Share with a small group. Log confusion points. Fix clarity first, polish second.
Keep the human core
- Own the taste calls: tone, character intent, and pacing must be deliberate. AI won't decide that for you.
- Make choices on craft: Alim composed the music himself; consider where your own touch matters most.
Practical guardrails
- Rights and likeness: use licensed or self-created assets. Avoid training or prompts that mimic living artists or actors without consent.
- Disclosure: label AI-assisted drafts internally; set expectations for commissioners and talent.
- Data hygiene: don't upload confidential scripts or pitches to public models without protections.
- Union and guild rules: check current guidance before public use or marketing.
If you're ready to upskill
Want a curated view of tools for previz and concept trailers? Explore a living list of generative video tools and resources here: Generative video tools.
Bottom line: treat AI as a promptable assistant. You bring the taste and the rules; it helps you see the format sooner, and decide whether it's worth the budget.