Tim Sweeney: AI flags on game store pages "make no sense"
Epic Games CEO Tim Sweeney says labeling games as "made with AI" on store pages is pointless. His view: AI will be part of almost every production pipeline going forward, so the label becomes noise rather than signal.
Responding to a post arguing that stores should drop the "Made with AI" tag, Sweeney agreed and clarified where such disclosures do matter. He sees value in authorship and licensing contexts (art exhibits, asset marketplaces) where buyers must assess rights and provenance. But for consumer storefronts, he says it's the wrong layer to solve a process problem.
When pushed that "customers deserve to know," he joked that by that logic stores could also list the developer's shampoo brand. The point: disclosures should be useful, actionable, and tied to real risk, not blanket labels that don't tell players anything material.
Where platforms stand today
Since early 2024, Steam has required developers to disclose whether generative AI is used and explain how. That statement appears on the store page under "AI Generated Content Disclosure."
Typical examples note the use of procedural or AI-based tools during production, with a qualifier that final assets reflect the development team's creative direction and review. This gives players who avoid generative AI a way to factor that into a purchase decision.
The Epic Games Store does not require these disclosures, and based on Sweeney's comments, it doesn't look likely to add them.
Why this matters for engineering and studio ops
- Expect AI everywhere: Treat AI as standard tooling across art, code, QA, and live ops. Labels won't manage risk; internal governance will.
- Governance over banners: Focus on data rights, training sources, model selection, and review workflows. Keep audit trails for who used what, when, and on which assets.
- Platform compliance as code: If you ship on Steam, build disclosure text from your CI/CD metadata (e.g., tags on assets touched by genAI). Automate it so it stays accurate.
- IP and licensing checks: Validate dataset provenance, vendor terms, and output rights. Document approvals for any third-party models or datasets.
- Human-in-the-loop: Require human review on AI-generated art, audio, narrative, and code. Log acceptance criteria and reviewers for traceability.
- Live content safeguards: For UGC or on-device generation, add guardrails, rate limits, and escalation playbooks. Test prompt injection and content filters.
- Player communication: Some customers avoid AI. Offer a concise FAQ explaining where AI is used, what's human-reviewed, and how you handle rights.
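The "platform compliance as code" idea above can be sketched concretely. The snippet below is a minimal illustration, not any platform's actual API: it assumes a hypothetical asset-metadata format where each asset is tagged with the generative AI tool that touched it (if any) and the human reviewer who signed off. From those tags it drafts a disclosure string and surfaces unreviewed AI-touched assets so a CI step can fail the build.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Asset:
    path: str
    category: str               # e.g. "art", "audio", "narrative", "code"
    genai_tool: Optional[str]   # tool name if generative AI touched this asset
    reviewed_by: Optional[str]  # reviewer ID once a human has signed off

def unreviewed_genai_assets(assets: list[Asset]) -> list[str]:
    """AI-touched assets with no human sign-off; CI should fail if non-empty."""
    return [a.path for a in assets if a.genai_tool and not a.reviewed_by]

def build_disclosure(assets: list[Asset]) -> str:
    """Draft a store-page disclosure string from tagged asset metadata."""
    touched = sorted({a.category for a in assets if a.genai_tool})
    if not touched:
        return "No generative AI tools were used in the creation of this game."
    return ("Generative AI tools were used during production of the following "
            "content: " + ", ".join(touched) + ". All AI-assisted assets were "
            "reviewed and approved by the development team.")
```

Regenerating the disclosure on every build keeps it accurate as the pipeline changes, and the unreviewed-asset gate enforces the human-in-the-loop requirement at the same point.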
The practical takeaway
Sweeney's bet is simple: AI will be the default in game production, so store badges won't tell players much. The real work is building clear governance, meeting platform rules, and communicating honestly about process and quality.