What Lionsgate's New Chief AI Officer Means for Strategy
Lionsgate has created a Chief AI Officer role and named Kathleen Grace to lead it. CEO Jon Feltheimer has put AI on the same level as core business functions, not a side experiment. That's a clear signal: AI decisions will now be tied to P&L, risk and partner trust.
Grace joins the senior leadership team with a remit that touches the full production value chain. Her scope spans production workflows, marketing, distribution and internal operations - where AI can influence cost, speed, quality and compliance.
The mandate, condensed
- Support filmmakers' creative objectives.
- Identify operational efficiencies across business units.
- Establish protective frameworks for studio and partner IP.
Why formal executive ownership matters
"Early AI momentum has rightly come from existing technology teams, but as adoption deepens, dedicated strategic leadership and governance become essential," says Rob King, Director of Business Relationship Management at Sony Pictures. "In a sector where AI can drive huge value, clarity of ownership is no longer a luxury."
That framing captures the move from pilots to scale. Lionsgate is placing AI strategy alongside finance, production and distribution - with decisions made by an accountable leader rather than scattered across IT and innovation labs.
Guardrails front and center
Feltheimer has been explicit: adoption advances only when "appropriate guardrails are established," and with strong "safeguards." Expect policy, approvals and measurement to align with proven standards like the NIST AI Risk Management Framework - including model inventory, data lineage and human oversight in high-impact workflows.
Grace brings operating experience from Vermillio, where she served as Chief Strategy Officer for a GenAI platform, and New Form, where she led development on 40+ pilots and sold nearly 25 series to Netflix, HBO Max, The CW and Freeform. As she noted on LinkedIn: "Working with Vermillio has been incredibly rewarding. I am excited about this next chapter, where I will continue working at the intersection of AI and entertainment, now from the studio perspective."
Track record, criticism and the creative trust gap
This isn't a first step. Since 2024, Lionsgate has partnered with Runway, applying generative video to cinematic production with the goals of creative assistance and cost reduction. The move drew pushback from writers, producers and filmmakers concerned about credit, consent and quality.
Feltheimer maintains the company will keep expanding its AI toolkit: "We continue to find exciting new use cases as we apply AI to more areas of our business, increasing our productivity, generating cost savings and expanding our creative tool kit." Still, industry voices are split on intent. Jorge Lopes notes, "AI for efficiency and AI for creative trust don't naturally align." Aaron Collins adds, "AI is a two-sided sword; it can be used to kill creatives' careers or to protect and help them. You are now positioned to use it the right way."
What success looks like in the first 12 months
- Production wins tied to P&L: 5-10 use cases with owners, targets and post-mortems (e.g., edit lock time, VFX iteration cycles, ADR turnaround).
- Marketing throughput: faster asset versioning and testing, with lift on engagement or ROAS and clear attribution.
- Rights and consent: creator opt-in/opt-out, likeness and voice consent flows, content provenance (e.g., watermarking) and audit trails.
- Governance in practice: model registry, data sourcing standards, safety evaluations and approval gates for release.
- Vendor discipline: consolidated contracts, risk scoring and cost benchmarks; fewer pilots, more standardization.
- Talent trust: transparent usage reporting to partners and compensation mechanisms where AI touches their work.
Operating model that scales beyond pilots
- Executive council (Production, Post, Marketing, Legal, Labor Relations, Security, Finance) meeting on a fixed cadence with clear RACI.
- AI product managers embedded in business units; central platform team for data, MLOps, security and shared services.
- Human-in-the-loop quality gates for editorial, VFX and marketing outputs; clear escalation paths for sensitive content.
- IP safety stack: dataset controls, license enforcement, provenance and detection; zero-tolerance policy for unlicensed use.
- Workforce enablement: role-based training for crews and staff; approved toolkits, prompts and usage guidelines.
- Creator communications: upfront disclosure, consent options and periodic impact reports to guilds and partners.
Immediate actions leaders can take
- Map top-value workflows across production, post, marketing and distribution; rank by ROI and reputational risk.
- Define red lines for generative use (e.g., no synthetic likeness without consent), plus provenance requirements for release.
- Launch three focused pilots with success criteria locked before build; cut anything that can't show value or safety.
- Update contracts: data rights, residuals/participation for AI-assisted work, likeness and voice clauses, audit rights.
- Stand up an incident playbook for dataset contamination, output errors or IP disputes; rehearse it.
- Set a quarterly review of model and vendor risk; include Legal and Security in every go-live decision.
What this signals for media and entertainment
A CAIO seat means AI choices will be measured by business results and trust with talent. Expect tighter governance, fewer experiments without owners and clearer rules on IP and consent. The studios that win will pair cost discipline with creative credibility - and report on both.
For more executive-level guidance on AI governance, operating models and metrics, see AI for Executives & Strategy.