Embracer CEO backs ethical AI as a creative catalyst, not a replacement, as the company targets nine AAA releases
Embracer's new chief backs AI to speed production, human-led and ethical by design. Target: nine AAA releases; transparency, consent, and review are non-negotiable.

Embracer's new chief says AI can boost creativity, if ethics lead the way
Embracer Group expects its Fellowship Entertainment division to ship nine AAA titles over the next two fiscal years. To get there, newly appointed CEO Phil Rogers says the company will lean on generative AI to accelerate production while committing to a strong ethical framework.
The pitch is simple: AI should clear bottlenecks so creatives can spend more time crafting what players remember. The company's stance: empower, never replace.
Where Embracer says AI helps today
Studios inside Embracer are using AI across animation, design, engineering, and asset creation. Rogers claims they're already seeing measurable upticks in productivity and faster iteration cycles.
He highlighted animation breakthroughs, citing recent tech that can produce results close to motion capture and trim a seven-day shoot to roughly half that time. Tools in this area, such as NVIDIA's work on real-time facial and character animation, show where pipelines are heading. See NVIDIA Audio2Face.
Key idea: keep a human in the loop. AI gets you to a strong first pass; human tuning brings the craft, taste, and consistency your audience expects.
Ethics as policy, not PR
Rogers emphasized transparency with players about where and how AI is used. Embracer plans strict governance, with audit logs and workflow approvals to track usage and sign-offs across teams.
That promise faces a real test. Embracer subsidiary Aspyr recently shipped Tomb Raider IV-VI Remastered with unauthorized AI-generated voice work mimicking French Lara Croft actress Françoise Cadol, which was then hotfixed out. It's a clear example of why consent, provenance, and human review aren't optional: they're day-one requirements.
The industry mood is mixed
While leadership teams see speed and scale, many developers remain wary. The latest GDC State of the Game Industry survey showed a more skeptical view of tools like Copilot, ChatGPT, and image models compared to last year.
Fears of replacement are real, fueled by high-profile examples and the industry's painful layoff cycle. Embracer itself became a headline name for studio closures and cancellations under its previous chief, who now serves as executive chair focused on strategic initiatives and capital allocation. Context matters: adoption without trust won't stick. For reference on broader sentiment, see GDC's research hub.
What this means for creatives
If you lead a team-or your own solo practice-treat AI as a draft engine and an accelerator, not an author. Your taste, story sense, and style are the differentiators. Use AI to remove grunt work so you can push quality and experimentation.
- Set a written policy: where AI is allowed, where it isn't, and what must be reviewed by humans.
- Protect people and IP: get explicit consent for voices and likenesses; avoid training on unlicensed datasets; respect union and contract terms.
- Track provenance: keep audit logs, model/version records, and approvals attached to assets.
- Label usage: be upfront with clients and players about AI-assisted content when material.
- Define quality bars: AI output must meet the same standards as human work before it ships.
- Measure impact: pick metrics that matter (iteration time, bug rate, asset throughput) and review monthly.
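The provenance and sign-off bullets above can be sketched as a minimal record attached to each asset. The schema, field names, and `image-gen` tool name here are illustrative assumptions, not Embracer's actual tooling:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AssetProvenance:
    """Minimal provenance record for an AI-assisted asset (illustrative schema)."""
    asset_id: str
    tool: str                     # model or tool used to generate the draft
    tool_version: str             # pin the exact version for reproducibility
    ai_assisted: bool = True
    approved_by: Optional[str] = None
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def approve(self, reviewer: str) -> None:
        """Record the human reviewer who signed off on the asset."""
        self.approved_by = reviewer

    def shippable(self) -> bool:
        """AI-assisted assets need a named human approver before shipping."""
        return (not self.ai_assisted) or self.approved_by is not None

# Hypothetical usage: a generated rock asset gets a human sign-off.
record = AssetProvenance("env_rock_017", tool="image-gen", tool_version="2.1")
record.approve("lead.artist@studio.example")
```

Attaching a record like this to every generated asset makes the audit-log and labeling bullets enforceable rather than aspirational.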
Practical workflows that actually help
- Animation: use AI for previsualization and blocking; finalize with hand-keyed passes and performance notes.
- Writing and narrative: draft alt barks and side content, then edit for voice, tone, and lore consistency.
- Art and assets: generate mood boards and low-fidelity concepts; lock style guides before scaling up.
- Engineering: rely on code assistants for boilerplate and tests; enforce reviews and security checks.
- QA and design: spin up bots for basic playthroughs; keep human testers focused on feel, pacing, and edge cases.
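For the engineering bullet above, "enforce reviews" can start as a simple pre-merge check. The `ai-assisted` commit tag is a hypothetical team convention, not a feature of any existing CI system:

```python
def can_merge(commit_tags: list[str], reviewers: list[str]) -> bool:
    """Block AI-assisted changes that lack a human reviewer (illustrative policy)."""
    if "ai-assisted" in commit_tags and not reviewers:
        return False  # no human has looked at assistant-generated code yet
    return True
```

A tagged change with no reviewer fails the gate; the same change with a named reviewer, or an untagged change, passes.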
Risk checklist for studios and freelancers
- Consent and likeness rights: voice actors and performers need explicit agreements for synthesis and reuse.
- Dataset due diligence: know what your models were trained on; avoid tainted or unclear sources.
- Model choice: favor tools with enterprise controls, logging, and content filters.
- Localization and accessibility: review AI-assisted text and VO with native speakers and specialists.
- Avoid filler: don't ship generic quests, art, or VO that dilutes brand voice; quality beats quantity.
For creatives looking to upskill
AI won't write your story or animate your scene with taste, but it will get you to a better first draft faster. Level up your prompts, guardrails, and review habits, then plug those into your daily workflow. Curated training can speed that transition: explore role-based programs here: AI courses by job.
Bottom line
Embracer is betting that AI can compress schedules and free teams to chase better ideas without sidelining human authorship. That promise only works if consent, governance, and taste stay in the driver's seat.
For creatives, this is a practical invitation: automate the repetitive, double down on the distinctive, and ship work you stand behind.