Posted by itsme — Novice · Joined Apr 22, 2024 · Messages: 66
https://www.bloomberg.com/news/news...9.qopDytLFnUY5oOkR9UB2NLBxokJ4yJH0HqzkZP5_dvw
Why So Many Video Games Cost So Much to Make
Graphical fidelity is only part of the reason that game budgets have swelled to hundreds of millions of dollars
Hi everyone. Today we’re breaking down how video-game budgets got so big, but first...
Wasteful spending
Not too long ago, I had coffee with a video-game developer who told me that work was slow and that they’d been spending half of their days watching Netflix.
For a second I was stunned — this person worked for a major corporation worth billions of dollars — until I remembered how many times I’d heard similar stories.
There was the developer who couldn’t work because the game’s tools weren’t ready. There was the team that had to drop everything they were doing because the creative director had played Breath of the Wild over the weekend and came away with some Great Ideas. There were the artists who were blocked from working as they waited for a colleague to finish a design.
In other words, it’s not uncommon for professional video-game makers to find themselves spinning their wheels for prolonged periods, during which they get paid to do very little work.
I was thinking about this bizarre phenomenon while reading a recent story in the New York Times about how costs in the video-game industry are ballooning. The story, titled “Video Games Can’t Afford to Look This Good,” aptly points out that the industry’s long-standing pursuit of high-fidelity graphics has led, over time, to diminishing returns. But it also pins the recent wave of bloated budgets and mass layoffs on this frenzied quest for greater graphics — an analysis that is a little bit off the mark.
Let’s zoom out for a second. One objective fact is that video-game budgets have grown massively. For example, Naughty Dog’s Uncharted 2: Among Thieves, released in 2009, cost $20 million. The studio’s most recent game, 2020’s The Last of Us Part II, cost $220 million.
But the truth is that graphical fidelity is just one part of the equation. To understand why video-game budgets have grown so rapidly, you have to understand where that money is actually going: paying people’s salaries. A small part of a game’s budget might go to miscellaneous costs like office rent and computer equipment, but the vast majority is earmarked for labor.
Budget estimates vary based on location, but each employee in a pricey city like Los Angeles could cost anywhere from $15,000 to $20,000 a month, a figure that includes salaries, benefits and overhead.
Let’s do some quick napkin math. If you have 100 employees and you’re estimating $15,000 a month (a conservative guess) for each one, you’re spending $18 million a year. But these days, the top game studios are much bigger than that. So if you have 300 employees and you’re estimating $20,000 a month for each one (got to pay good wages to compete in 2025), you’re spending $72 million a year. (The real math is much more complicated, since people move on and off projects all the time, but hey, we’re just estimating.)
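That napkin math is easy to sanity-check. Here's a minimal sketch of the same calculation, with the function name and the flat per-head monthly figure being assumptions of this illustration rather than anything from a real studio's books:

```python
def annual_burn(employees: int, monthly_cost_per_employee: int) -> int:
    """Rough yearly labor cost: headcount x monthly all-in cost x 12 months.

    Ignores the real-world complication of people moving on and
    off projects, exactly as the article's estimate does.
    """
    return employees * monthly_cost_per_employee * 12

# The article's two scenarios:
small_studio = annual_burn(100, 15_000)  # conservative guess
big_studio = annual_burn(300, 20_000)    # competitive 2025 wages

print(f"${small_studio:,} per year")  # $18,000,000 per year
print(f"${big_studio:,} per year")    # $72,000,000 per year
```

Stretch either scenario across a four-to-six-year development cycle and nine-figure budgets fall straight out of the multiplication, before a single graphics upgrade enters the picture.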
Another data point is that games take much more time to make. The gap between Uncharted 1 (2007) and Uncharted 2 (2009) was two years. The gap between Naughty Dog’s most recent two games, Uncharted 4 (2016) and The Last of Us Part II (2020), was four years.
Put those numbers together and it’s easy to see why budgets have grown tenfold. One Activision executive’s deposition in a recent lawsuit, dug up by journalist Stephen Totilo, revealed that Call of Duty: Black Ops III (2015) cost $450 million, Call of Duty: Modern Warfare (2019) cost $640 million and Call of Duty: Black Ops Cold War (2020) cost $700 million — not particularly shocking numbers in the context of Activision’s recent revelation that more than 3,000 people work on the franchise. (Those numbers also appear to include post-release content.)
Budgets for games are now enormous because those two vectors — more people, more time — have grown so significantly over the last decade. Graphical fidelity is a part of that, to be sure. In general, you need more artists and engineers to make games look more detailed and photorealistic. But games also need more time and people because of growing scopes, as games embrace massive levels and sprawling (sometimes bloated) open worlds.
And, perhaps most alarmingly, games are growing more expensive because of rampant mismanagement — because of companies chasing trends, making bad bets and lacking a clear creative vision. Inefficient workflows, technological shifts and insecure executives can all cause wasted time, which translates directly into higher budgets. A common example these days is taking a team with years of experience making single-player games and pivoting it to a multiplayer game as a service.
Everyone who’s worked in the video-game industry for more than a few years has their own horror story. There’s the feature that gets canceled because the CEO’s teenage kid didn’t like it, or the level that everyone knows is going to get axed but that they all have to keep working on because the cancellation hasn’t officially been communicated yet. Or maybe it already has been canceled and nobody told the audio team.
It’s worth noting that video games do need ample iteration to be good, and some of the most successful games have been the result of so-called “wasted” work. Cuts and cancellations are not always a mistake. But there are also countless examples of teams of hundreds floundering in pre-production as they try to figure out what a game’s “core loop” will actually look like. That might seem like welcome news for workers who get to relax for a while — until crunch time comes along and there’s no more leeway for the game to slip.
So yes, it’s true that the chase for better graphics has contributed to making games more expensive. But if game companies are wondering how their budgets really swelled into nine-figure territory, it might be time for some introspection — and less wasteful management.