GTA Creator Just Compared AI to Mad Cow Disease and Said Tech Execs Aren’t Fully Rounded Humans

Dan Houser doesn’t hold back when talking about artificial intelligence. The Rockstar Games co-founder and creative force behind Grand Theft Auto and Red Dead Redemption just delivered one of the gaming industry’s most scathing AI criticisms yet. Speaking on Virgin Radio UK with Chris Evans, Houser compared AI technology to mad cow disease, predicted it would eventually consume itself, and said the people pushing it hardest aren’t the most humane or creative individuals in the room.

His comments carry weight beyond typical developer skepticism. This is someone who spent decades crafting some of gaming’s most culturally significant narratives at Rockstar before leaving in 2020 to found Absurd Ventures. Houser understands storytelling, creativity, and what makes games resonate with players. When he questions the qualifications of tech executives steering humanity’s creative future through AI, people should probably listen.

The Mad Cow Disease Comparison

Houser’s analogy is disturbingly apt. Mad cow disease, scientifically known as bovine spongiform encephalopathy, spread through British herds when farmers fed cattle meat-and-bone meal rendered from other cattle. The practice fueled a degenerative brain disease that devastated UK agriculture in the 1980s and 1990s, killed people who ate infected beef, and sparked a major public health crisis.

AI models face a similar problem called model collapse. These systems train on data scraped from the internet, but as AI-generated content floods online spaces, future models increasingly train on outputs from previous AI systems rather than human-created work. Houser explained his understanding: “The models gather information from the internet, but the internet will increasingly become saturated with information generated by these models. It’s sort of like when we fed cows with cows and got mad cow disease.”

The comparison resonates because it captures how AI degrades when it consumes its own output. Just as mad cow disease attacked cattle brains and rendered them dysfunctional, AI training on AI-generated content produces worse results each generation. Quality deteriorates, errors compound, and eventually the system becomes unreliable for the tasks it was designed to handle.

AI Will Eat Itself

Houser’s central argument is that AI companies are already running out of quality training data. Once they exhaust human-created content to feed their models, they’ll be forced to train on AI outputs, creating that feedback loop he compared to mad cow disease. The result is inevitable degradation.

He stated clearly: “I believe that AI will ultimately consume itself. I can’t see how the information gets better if they’re already running out of data. It will do some tasks brilliantly, but it’s not going to do every task brilliantly.” This counters the narrative from AI companies promising that their technology will eventually handle everything from creative writing to game design to complex problem-solving.

The technical community calls this phenomenon model collapse, and research supports Houser’s concerns. Studies show that when generative AI systems train on synthetic data produced by similar models, they progressively lose diversity and quality. Each generation becomes less capable than the previous one, eventually producing nonsensical outputs that fail to meet basic standards.
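For readers who want to see that feedback loop in miniature, here is a deliberately simplified sketch, not code from any study Houser referenced: a trivial stand-in for a generative model, a Gaussian fit to data, is retrained generation after generation on nothing but its own samples. The sample size, seed, and generation count are arbitrary choices for the demo.

```python
# Toy illustration of model collapse: a Gaussian "model" repeatedly refit to
# samples drawn from its own previous generation. With a finite batch each
# round, the fitted spread performs a downward-drifting random walk and the
# data's diversity collapses toward zero.
import numpy as np

rng = np.random.default_rng(42)
samples_per_generation = 50

# Generation 0: stand-in for diverse, human-made data.
data = rng.normal(loc=0.0, scale=1.0, size=samples_per_generation)

for generation in range(1, 501):
    mu, sigma = data.mean(), data.std()                    # "train" on the current data
    data = rng.normal(mu, sigma, samples_per_generation)   # next model sees only synthetic output
    if generation % 100 == 0:
        print(f"generation {generation:4d}: fitted std = {sigma:.4f}")
```

Run it and the printed standard deviation shrinks by orders of magnitude: each refit on a finite batch of synthetic data loses a little of the original spread, a miniature version of the degradation Houser is describing.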

The People Pushing AI Aren’t Creative

Perhaps Houser’s most provocative statement targeted the executives and tech leaders championing AI in creative fields. During the Virgin Radio interview, he said: “Some of these people trying to define the future of humanity, creativity, or whatever it is using AI are not the most humane or creative people.”

He elaborated on this point, suggesting these individuals position themselves as better at being human than actual creatives: “They’re sort of saying ‘we’re better at being human than you are’, and it’s obviously not true. One of the other things we’re trying to capture is that humanity is being pulled in a direction by a certain group of people who maybe aren’t fully rounded humans.”

This criticism strikes at the heart of an uncomfortable truth in tech. Many executives pushing AI hardest in creative industries have limited creative backgrounds themselves. They understand business metrics, profit optimization, and cost reduction. What they often lack is deep appreciation for the human element that makes art, stories, and games connect emotionally with audiences.

Why This Matters for Gaming

Houser’s perspective matters because he comes from inside the creative machine. He wrote or co-wrote the stories for GTA III, Vice City, San Andreas, IV, V, and both Red Dead Redemption games. These aren’t just commercially successful titles – they’re culturally significant works that influenced how games tell stories and satirize contemporary society.

When someone with that creative pedigree questions whether tech executives understand humanity well enough to define its future through AI, it highlights a fundamental disconnect. The people making decisions about replacing human creativity often have the least experience creating art themselves. They see cost centers to eliminate rather than collaborators whose human perspective creates value that can’t be replicated algorithmically.

This dynamic already plays out across the gaming industry. Electronic Arts reportedly mandates employees use AI for everything from coding to managing sensitive conversations about pay and promotions. Former Square Enix executives claim Gen Z loves AI slop and consumers don’t care about generative content. Meanwhile, developers at studios like Larian and Pocketpair explicitly reject AI-generated games and emphasize human creativity.

Houser’s New Project About AI Gone Wrong

The timing of Houser’s comments is interesting because he’s currently promoting his new book “A Better Paradise Volume One: An Aftermath,” which explores what happens when AI becomes too powerful. The story follows a video game project that goes wrong after developers create an AI that exceeds their control.

Absurd Ventures, Houser’s new studio, is developing a game set in the same universe as the novel. The project addresses AI themes through narrative while being created with traditional human-led development rather than relying on generative AI tools. During his Channel 4 Sunday Brunch appearance, Houser acknowledged that Absurd Ventures is “dabbling in using AI” but emphasized that “the truth is a lot of it’s not as useful as some of the companies would have you believe yet.”

This nuanced position separates Houser from blanket AI rejection. He’s not claiming the technology has zero applications or that developers shouldn’t experiment with tools. His criticism targets the overselling of AI capabilities and the rush to replace human creativity with algorithmic outputs before understanding the long-term consequences.

The Broader Industry Divide

Houser joins a growing chorus of creative voices pushing back against AI enthusiasm from corporate leadership. Larian Studios publishing director Michael Douse argued that AI won’t solve gaming’s real problem, which is a lack of leadership and vision. Baldur’s Gate 3 actor Jennifer English advised AI pushers simply: don’t.

Meanwhile, executives continue doubling down. Epic Games CEO Tim Sweeney raged against Steam requiring AI disclosure labels, claiming AI use doesn’t matter anymore because future games will all involve it anyway. Electronic Arts CEO Andrew Wilson described AI as fundamental to creating richer colors and more brilliant worlds, despite internal reports of EA employees struggling with management demands to use AI for everything.

This divide isn’t just philosophical – it has practical implications. Studios embracing AI report problems with flawed code, decreased quality, and employee frustration. Yet the pressure to adopt the technology intensifies as executives view it as a cost-cutting measure that increases profit margins. Workers who spent years developing expertise suddenly face replacement by systems trained on their own work.

What Makes Games Actually Good

Underlying Houser’s criticism is a fundamental belief about what makes games resonate. During the Virgin Radio interview, host Chris Evans asked if Houser agreed that AI would never fully replace human creativity because it can’t capture the human spirit. Houser agreed, adding that the people pushing generative AI aren’t necessarily well-versed in the creative areas they’re trying to automate.

Games connect with players when they reflect genuine human perspectives, experiences, and creativity. Grand Theft Auto’s satire works because real writers observed society and found absurdities worth exaggerating. Red Dead Redemption 2’s emotional impact comes from artists, writers, animators, and actors collaborating to convey complex themes about loyalty, mortality, and changing times.

AI can analyze patterns in existing games and produce content that resembles what came before. What it struggles with is genuine novelty, emotional authenticity, and the kind of creative risk-taking that produces truly memorable experiences. When Houser says AI will do some tasks brilliantly but not every task brilliantly, he’s acknowledging this limitation that tech executives often ignore.

FAQs

Who is Dan Houser?

Dan Houser is co-founder of Rockstar Games and the primary writer for Grand Theft Auto and Red Dead Redemption franchises. He left Rockstar in 2020 and founded Absurd Ventures, a new studio developing games and multimedia projects.

What did Dan Houser say about AI?

Houser compared AI to mad cow disease, predicted it will consume itself as models train on AI-generated content, and criticized the executives pushing AI as not being the most humane or creative people. He suggested humanity is being pulled in a direction by people who, in his words, “maybe aren’t fully rounded humans.”

What is model collapse in AI?

Model collapse occurs when AI systems train on content generated by other AI systems rather than human-created work. Each generation produces lower quality outputs, progressively losing diversity and capability until the system becomes unreliable.

Why did Houser compare AI to mad cow disease?

Mad cow disease spread when cattle were fed processed feed rendered from other cattle, fueling a degenerative brain disease. Similarly, AI training on AI-generated content creates a feedback loop that degrades quality and functionality over time.

Is Dan Houser against all AI use in gaming?

No. Houser acknowledges AI can do some tasks brilliantly and that Absurd Ventures is dabbling in AI tools. His criticism targets overselling AI capabilities and replacing human creativity without understanding long-term consequences.

What is Absurd Ventures working on?

Absurd Ventures is developing a game set in the universe of Houser’s new book “A Better Paradise Volume One,” which explores AI becoming too powerful. The studio uses traditional development approaches rather than relying heavily on generative AI.

Do other game developers agree with Houser about AI?

Many creative developers share Houser’s concerns. Larian Studios’ Michael Douse, Pocketpair’s John Buckley, and Baldur’s Gate 3 actors have criticized AI pushes in gaming. However, executives at companies like EA and Epic, along with former Square Enix leadership, have embraced AI adoption.

What games did Dan Houser write?

Houser wrote or co-wrote Grand Theft Auto III, Vice City, San Andreas, IV, and V, plus Red Dead Redemption and Red Dead Redemption 2. These games are known for satirical storytelling and emotional depth that defined Rockstar’s creative identity.

Conclusion

Dan Houser’s comparison of AI to mad cow disease cuts through corporate hype with uncomfortable accuracy. His warning that AI will eat itself through model collapse reflects legitimate technical concerns that AI companies downplay while chasing growth and market dominance. More importantly, his criticism of the executives pushing AI hardest – that they’re not the most humane or creative people – exposes a fundamental problem in how the gaming industry approaches technological change.

The people making decisions about replacing human creativity often lack creative backgrounds themselves, viewing art as a cost center rather than understanding what makes games emotionally resonate with players. Houser’s perspective carries weight because he spent decades proving that human creativity, vision, and craft produce games that become culturally significant rather than algorithmically adequate.

The battle over AI in gaming isn’t really about technology – it’s about whether the industry values human expression or sees creativity as just another optimization problem. Based on Houser’s comments and similar pushback from developers across the industry, that fight is far from over. The executives might control the balance sheets, but the creatives understand what actually makes games worth playing. And according to someone who helped define modern gaming storytelling, the people steering AI adoption aren’t qualified to determine humanity’s creative future.
