Min-Liang Tan on AI in Gaming: Augment Creators, Don't Replace Them
Min-Liang Tan, CEO and co-founder of Razer, made his position clear in a public post: AI should assist game developers, not replace them with what he called "Gen AI slop." The post was later deleted, but the message stands. Razer, through its @RazerAI initiative, is focused on tooling that helps teams ship better games while keeping human creativity in charge.
This reflects a broader shift across tech: use AI to accelerate workflows and raise quality, while keeping people responsible for vision, taste, and final calls.
Why this matters for dev and IT teams
AI is great for grunt work: boilerplate, noisy data, repetitive QA, and first-pass content drafts. Humans should still own systems design, art direction, narrative voice, and player feel.
The winning model is simple: AI proposes; people approve.
Use cases that actually help (without wrecking your art style)
- Art pipeline: Upscale, clean, and batch-generate LODs and materials with model constraints tied to your style guide. Final passes stay in the hands of artists.
- Code assist: Autocomplete, test stubs, and refactors for non-critical paths. Core gameplay systems and netcode stay reviewed and owned by senior engineers.
- QA automation: Bot-driven regression, pathfinding stress tests, and fuzzing for UI flows. Humans handle edge cases and experiential polish.
- Localization: LLM first draft + native editors for quality and cultural nuance. Maintain a translation memory of approved phrasing per franchise.
- Live ops: Player telemetry models for difficulty tuning and churn risk, with strict privacy controls and an opt-in framework.
- Docs and support: Summarize changelogs, generate internal FAQs, and draft modding guides that writers then refine.
How to avoid "Gen AI slop"
- Creative guardrails: Maintain a living art bible and narrative canon. Every AI output must reference them.
- Data provenance: Use licensed datasets and keep a ledger of sources. No gray-area texture packs or scraped VO.
- Human in the loop: Mandatory review gates for art, code, and narrative before content hits main.
- Quality metrics: Define acceptance criteria early (artifact rate, shader stability, frame-time variance, localization QA scores).
- Observability: Log prompts, versions, and outputs for audits. Maintain model cards and usage policies per team.
- Player trust: Be upfront about AI-assisted features where it affects the experience.
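The observability guardrail above can start as something very simple: an append-only ledger of every generation. Below is a minimal sketch, assuming a JSON-lines file per team; the field names are illustrative, not a standard schema.

```python
import hashlib
import json
import time

def log_generation(path, *, team, model, model_version, prompt, output):
    """Append one audit record per AI generation to a JSONL ledger."""
    record = {
        "ts": time.time(),
        "team": team,
        "model": model,
        "model_version": model_version,
        "prompt": prompt,
        # Store a hash of the output; keep the asset itself in your
        # asset store so the ledger stays small but verifiable.
        "output_sha256": hashlib.sha256(output.encode()).hexdigest(),
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

rec = log_generation(
    "ai_ledger.jsonl",
    team="env-art",
    model="texture-upscaler",       # hypothetical model name
    model_version="1.4.2",
    prompt="clean 2k brick albedo",
    output="<asset bytes here>",
)
```

With prompts, model versions, and output hashes on record, audits and model-card reviews become a query instead of an archaeology project.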
Accessibility stays front and center
Tan's view on responsible innovation tracks with his push for inclusive tech, including work adapting Project Motoko for visually impaired users. That mindset pairs well with AI: use it to create more options for players, then validate with real accessibility testing.
- Interface: High-contrast themes, remappable inputs, scalable UI, and consistent iconography.
- Senses: Clear audio cues, TTS/STT (text-to-speech and speech-to-text) options, haptics as secondary feedback.
- Support: Screen reader compatibility and descriptive labels for dynamic UI components.
- Validation: Test with visually impaired players and track completion and frustration metrics per feature.
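Tracking completion and frustration per feature doesn't need heavy tooling to start. Here is an illustrative sketch that aggregates a completion rate and a simple frustration proxy (retries per attempt) per feature flag; the event shape is an assumption, not a real telemetry format.

```python
from collections import defaultdict

def summarize(events):
    """Aggregate per-feature completion and retry stats from play events."""
    stats = defaultdict(lambda: {"attempts": 0, "completed": 0, "retries": 0})
    for e in events:
        s = stats[e["feature"]]
        s["attempts"] += 1
        s["completed"] += e["completed"]   # 1 if the player finished the flow
        s["retries"] += e["retries"]       # crude frustration proxy
    return {
        feature: {
            "completion_rate": s["completed"] / s["attempts"],
            "retries_per_attempt": s["retries"] / s["attempts"],
        }
        for feature, s in stats.items()
    }

events = [
    {"feature": "high_contrast_menu", "completed": 1, "retries": 0},
    {"feature": "high_contrast_menu", "completed": 0, "retries": 3},
]
print(summarize(events))
# high_contrast_menu: completion_rate 0.5, retries_per_attempt 1.5
```

Watching these numbers per accessibility feature tells you whether an option actually helps players, not just whether it shipped.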
If you want a solid reference for accessibility fundamentals, the W3C WAI guidelines are a good baseline for engineering and design teams.
What to ship this quarter
- Pick one pipeline stage (e.g., texture cleanup or test generation) and run a 4-6 week pilot. Define success in advance.
- Decide your stack: On-device vs. cloud inference, model size, and cost caps. Keep content within your legal boundary.
- Build review gates: Block merges without human approvals for AI outputs. Automate linting for prompts and metadata.
- Train the team: Short, practical sessions on prompt patterns, failure modes, and security. Document do's and don'ts where people actually work (repos, wikis, CI).
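The "block merges without human approvals" gate can be sketched as a small CI check. This assumes a manifest that flags AI-assisted files and an approvals map from your review tool; both formats are hypothetical, not a standard.

```python
def merge_allowed(ai_manifest, approvals):
    """Pass only if every AI-assisted file has a human sign-off.

    ai_manifest: {file_path: generator_name} for AI-assisted files.
    approvals:   {file_path: reviewer_name} from the review system.
    """
    missing = [path for path in ai_manifest if path not in approvals]
    if missing:
        print("BLOCKED: unapproved AI outputs:", ", ".join(sorted(missing)))
        return False
    return True

manifest = {
    "art/rock_lod2.fbx": "mesh-gen",      # hypothetical generator names
    "src/ui/tooltip.py": "code-assist",
}
approvals = {"art/rock_lod2.fbx": "lead-artist"}

print(merge_allowed(manifest, approvals))   # blocked: tooltip.py unapproved
approvals["src/ui/tooltip.py"] = "senior-eng"
print(merge_allowed(manifest, approvals))   # passes once both are signed off
```

Wire a check like this into CI as a required status, and the review gate stops being a policy document and becomes a hard constraint.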
Razer's angle
Razer's focus on developer tooling signals a practical path: give teams better inputs and guardrails, and let creatives do the hard stuff humans do best. You can expect more APIs and integrations across peripherals, performance tooling, and creator workflows that respect the craft.
For developer-facing resources from Razer, start with the Razer Developer portal.
Level up team skills
If your studio is formalizing AI practices for engineering and content teams, these resources can help:
- AI courses by job role for engineers, data folks, and PMs.
- AI certification for coding focused on practical workflows.
Bottom line: use AI to speed up the boring parts and raise your floor, then let your team set the ceiling. That's the difference between a forgettable, AI-smeared build and a game people talk about.