Why Take-Two's CEO Thinks AI Can't Build the Next Grand Theft Auto

Take-Two's Strauss Zelnick says AI can assist, but it can't meet Rockstar's creative bar or produce protectable IP on its own. Use it for workflows; keep humans on story, taste, and final calls.

Published on: Oct 29, 2025

AI won't replace human game creators, says Take-Two's Strauss Zelnick

AI is shaking up creative work across gaming, film, and music. It's also stirring up real risk: copyright landmines, deepfakes, and content that feels derivative.

Strauss Zelnick, CEO of Take-Two Interactive, is clear on the line. AI can be a tool, but it can't match Rockstar's creative bar. "There is no creativity that can exist by definition in any AI model, because it is data-driven," he said at a tech executive summit in New York City.

The IP wall: AI-made content isn't protectable

Zelnick's first constraint is legal. "We have to protect our intellectual property… If you create intellectual property with AI, it's not protectable." That has big implications for studios, publishers, and anyone shipping content at scale.

AI companies need vast data sets. Rights holders want consent, credit, and compensation. That clash has produced licensing deals, lawsuits, and policy debates. See the U.S. Copyright Office's updates on AI authorship for where the law stands today: copyright.gov/ai.

The release of OpenAI's Sora, which can generate near-realistic short videos from prompts, added fuel to the fire. It raised fresh concerns about deepfakes and the use of someone's likeness or voice without permission. For reference: openai.com/sora.

Tools vs. taste: where AI helps and where it falls short

Zelnick's second constraint is creative quality. Even setting the legal question aside, pushing a button won't deliver a "Grand Theft Auto" marketing plan, let alone a world that feels alive. "You end up with something pretty derivative," he said.

Why? Models are trained on past data. They predict. That's useful for tasks that benefit from pattern matching. It's weak for original IP, humor, voice, and the kind of taste that defines Rockstar's work.

Practical uses for AI in game development

  • Content pipeline: concept variations, mood boards, temp VO, placeholder art, level grayboxing.
  • Production: code assistants, localization drafts, test case generation, bug triage, analytics queries.
  • Player ops: support macros, moderation queues, dynamic FAQs, basic sentiment tagging.

Where to avoid overreliance: core narrative, character design, key art, brand voice, and marketing strategy. Keep humans in the driver's seat. Use AI to generate options faster, not to define the final cut.

Guardrails for IT, engineering, and product leaders

  • Rights and consent: document training data sources, licenses, and voice/likeness approvals.
  • Provenance: watermark AI assets; track prompts, models, and outputs in an audit log (a minimal sketch follows this list).
  • Human review: mandate editorial sign-off for anything external-facing.
  • Security: isolate model access; restrict sensitive code/content in prompts; rotate API keys.
  • Quality: define "assist" use cases with clear acceptance criteria; test for sameness and bias.
  • Compliance: align with copyright guidance and local privacy laws; prepare takedown workflows.
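
One way to make the provenance guardrail concrete is an append-only log written every time an AI-assisted asset enters the pipeline. The sketch below is illustrative only: it assumes a Python tooling environment, and the field names (asset_id, prompt_sha256, reviewed_by, license_refs) are hypothetical choices, not an industry standard or anything Take-Two has described.

    # Append-only provenance log for AI-assisted assets (illustrative field names,
    # not a standard). The prompt is hashed rather than stored verbatim so the log
    # can be shared widely without leaking sensitive prompt content.
    import hashlib
    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class AssetProvenance:
        asset_id: str        # internal asset identifier (e.g., a repo path)
        model: str           # model name and version used to generate the draft
        prompt_sha256: str   # hash of the prompt used, for later audit
        reviewed_by: str     # human who signed off before external use
        license_refs: list   # IDs of voice/likeness consents or data licenses
        created_at: float    # unix timestamp

    def log_asset(log_path: str, asset_id: str, model: str, prompt: str,
                  reviewed_by: str, license_refs: list) -> AssetProvenance:
        record = AssetProvenance(
            asset_id=asset_id,
            model=model,
            prompt_sha256=hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
            reviewed_by=reviewed_by,
            license_refs=license_refs,
            created_at=time.time(),
        )
        # One JSON object per line (JSONL); appending keeps the history auditable.
        with open(log_path, "a", encoding="utf-8") as f:
            f.write(json.dumps(asdict(record)) + "\n")
        return record

Paired with editorial sign-off, a log like this makes takedowns and rollbacks faster, because every shipped asset can be traced to a model version, a prompt, and a named reviewer.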

Why Take-Two doubles down on originality

Take-Two is one of the last major game publishers still independently traded on public markets, following Microsoft's $69B acquisition of Activision Blizzard in 2023 and Electronic Arts' recently announced $55B take-private deal with the Public Investment Fund of Saudi Arabia, Silver Lake, and Affinity Partners.

The strategy: build franchises that last. Take-Two counts 11 series that have sold at least five million units at launch, plus more than 20 popular mobile titles. "Grand Theft Auto V" generated $1 billion in its first three days back in 2013, and the next GTA is slated for May 2026.

As Zelnick put it, Rockstar aims for work that "approaches perfection." That's taste, judgment, and cultural timing: things models imitate but don't originate.

Action plan: adopt AI without losing your creative edge

  • Define "assist only" zones: code help, QA, localization drafts, internal docs.
  • Ban model-generated final assets for story, characters, key art, and trailers unless fully re-authored by humans (a sample check follows this list).
  • Require proof of rights for voices, faces, likenesses, and datasets.
  • Track every asset's source and model version; enable instant rollback.
  • Measure throughput gains (time saved, defect rates) rather than "originality."
  • Train teams on prompt quality, bias checks, and data hygiene.
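
To keep the "assist only" and "ban" rules from living only in a policy document, teams can enforce them at merge time. The sketch below is a hypothetical pre-merge check, not an established tool: it reuses the JSONL provenance log from the earlier sketch and assumes a project-specific human_reauthored flag and protected-path layout.

    # Hypothetical pre-merge check: fail the build if an AI-drafted asset sits in a
    # protected zone (story, characters, key art, trailers) without human re-authoring.
    # Paths, field names, and the JSONL format follow the provenance sketch above.
    import json
    import sys

    PROTECTED_PREFIXES = (
        "assets/story/",
        "assets/characters/",
        "assets/key_art/",
        "marketing/trailers/",
    )

    def check_protected_assets(log_path: str) -> int:
        violations = []
        with open(log_path, encoding="utf-8") as f:
            for line in f:
                record = json.loads(line)
                in_protected_zone = record["asset_id"].startswith(PROTECTED_PREFIXES)
                if in_protected_zone and not record.get("human_reauthored", False):
                    violations.append(record["asset_id"])
        for asset_id in violations:
            print(f"blocked: {asset_id} is AI-drafted in a protected zone without human re-authoring")
        return 1 if violations else 0  # nonzero exit code fails the CI job

    if __name__ == "__main__":
        sys.exit(check_protected_assets(sys.argv[1] if len(sys.argv) > 1 else "ai_asset_log.jsonl"))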

Level up your team's AI judgment

If you're standardizing AI across product, engineering, and marketing, start with training that focuses on safe use, measurable outcomes, and creative guardrails. Curated catalogs help teams pick the right tool for the job and avoid legal traps.

Explore practical resources and tool roundups here: Generative video tools and AI courses by job.

Bottom line: use AI to speed the work. Keep humans accountable for ideas, taste, and the final story. That's where the value is.

