Exactly. I didn't want to get into details about development costs in my previous post because I'm frankly not that autistic, but anyone with a brain and some knowledge of software development knows this "rising development costs" excuse is a load of bullshit. I would argue that in some ways it is even cheaper to develop games today than it was 20 years ago, for many reasons.
While this might be the case for smaller indie games (depends on the game), this is certainly not the case when it comes to "AAA" games with state-of-the-art graphics (which is where the largest part of the development budget goes).
People confuse better graphics with more labor from the developer. This is stupid. If you are an artist creating a texture, it doesn't matter what resolution the texture is in the game; it is still one texture.
Textures are made in a variety of ways; often more resolution means you need to bring in more detail that needs to be sourced. For a low-resolution texture you can use your cheap-ass mobile phone camera to source materials, but for a higher-resolution texture this won't fly - you need more advanced tools, and you may need to purchase a texture library (which you'll use for sourcing, not as a final texture).
If you made a texture in the PS1 era, and one in the PS5 era, all that changed is that modern hardware is more powerful than it was back then, so it can handle your texture designs without you having to downscale them through a filter first. That's about it.
Far from it. Textures made during the PS1 era were made in a much different way than in the PS5 era. During the PS1 era, and up until around the early 2000s, a texture was a base image drawn or sourced from a photo, with some 2D shading effects applied over it. Nowadays you can't do that: at the minimum you need to remove any lighting (which is in direct contrast with the workflows of the 90s/early 2000s, where you *wanted* lighting as part of the texture) and generate normal maps and roughness maps - all extra work for the artists. If you want to do it properly, often you actually need to create a 3D mesh over the de-lit diffuse/albedo texture so that the normal maps bake out correctly. Back in the mid/late 2000s many artists would try to generate normal and specular maps from the diffuse map by treating it as a height map and deriving normals from there, but this produces flat and often wrong results, and is rarely used nowadays outside of base materials meant for manually adding fine details to an existing texture.

By the way, this is just the minimum - artists often have to create additional textures, like ambient occlusion maps for better lighting and shadows. These are done with dedicated tools that take their own time to work with.
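To make that height-map shortcut concrete, here is a minimal sketch of the technique (assuming Python with numpy; the function name is mine, not from any real tool): brightness is treated as height, slopes are taken with finite differences, and the slopes become a normal map. Since the "heights" are really just photo brightness (i.e. baked-in lighting), the resulting normals come out flat or outright wrong, which is exactly the problem described above.

```python
import numpy as np

def normals_from_height(height: np.ndarray, strength: float = 1.0) -> np.ndarray:
    """height: 2D array of values in [0, 1]; returns an (H, W, 3) normal map in [0, 1]."""
    # Finite differences approximate the surface slope along each axis.
    dy, dx = np.gradient(height * strength)
    # A surface z = h(x, y) has a normal proportional to (-dh/dx, -dh/dy, 1).
    n = np.dstack((-dx, -dy, np.ones_like(height)))
    # Normalize each per-pixel vector to unit length.
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    # Remap from [-1, 1] to [0, 1] so the result can be stored as an RGB image.
    return (n + 1.0) * 0.5
```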
Moreover, the way textures are made nowadays is different: in the past artists would work with tools like Photoshop, but nowadays they use tools like Substance Painter that take into consideration the full set of textures (albedo, normal, etc.) as well as surface and material properties (that doesn't completely invalidate Photoshop use, but it is minimized to specific uses - artists need to know both to be good at their job).
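As a rough illustration of how much the definition of a "texture" has grown, here is a sketch contrasting the two eras (the class and field names are made up for illustration, not any engine's or tool's actual API):

```python
from dataclasses import dataclass

@dataclass
class RetroTexture:
    # PS1-era / early-2000s style: one image, with lighting and shading
    # painted or photographed directly into it.
    diffuse: str  # path to the single hand-authored image

@dataclass
class PbrMaterial:
    # Modern physically-based set: each map is separately authored, baked,
    # or derived - and each one is extra work for the artists.
    albedo: str             # de-lit base color
    normal: str             # usually baked from a high-poly sculpt
    roughness: str          # microsurface detail
    ambient_occlusion: str  # baked self-shadowing
    metallic: str           # conductor vs. insulator mask
```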
For models the same rule applies.
No, there are *way* different approaches when it comes to very low poly (PS1 era) graphics vs. high poly (PS5 era) graphics, especially since the introduction of digital sculpting around the late 2000s, which added a whole new set of steps for making models. The only common aspect between the two is the part where you push vertices around; nothing else is the same. Even the part where you create the mesh involves a different way of thinking - in the past a lot of a model's features were put into the texture, and you pretty much had to know how the final model with the textures applied would look while making the texture, whereas nowadays almost everything, down to the eyeballs, is done on the model as geometry, and textures are used to add the very fine details, act as bake targets (e.g. for ambient occlusion maps), etc. - things that didn't even exist as concepts back in PS1 days.
Shaders are even less work most of the time. Most shaders are reusable between projects. You don't need a gazillion water shaders for each generation of consoles. Yes, each gen sees better quality shaders because the hardware gets more powerful, so the developers get bolder in their algorithms without fear of turning the game into a slide show.
Shaders are almost never made by artists, though they do take more time to create than they did back when they were first introduced, and often there are dedicated programmers for some of them. However, the difference in work for shader use in modern games vs. back when shaders were introduced in the early 2000s is dwarfed by the difference in work for art.
Note that *materials* might be made by artists, and there is even a specialized subcategory of artists called "technical artists" who work with those. This subcategory didn't really exist until the late 2000s, let alone in PS1 days - and guess what, it is a subcategory exactly because there are multiple specialists working on it.
And so on and so on. The point is, improved graphics don't mean more work/hours.
This is absolutely and completely wrong. Nowadays "AAA" games use way more art assets than they ever did, those assets take time to make, and they are made by increasingly larger teams of artists. Back in the late 90s, a state-of-the-art game like Quake would use a few repeating textures slapped all over the place, but this will never happen today, since artists will try to hide (=time) any repeating textures by creating many more of them (=time) as well as relying on environment artists to create models that both spice up the environments (=time) and cover the repetition.

In fact, level design in the 90s was largely the work of a single person doing the layout, placement, lighting, 3D art, etc. for a map, whereas nowadays there are dedicated jobs (i.e. more people, meaning more money spent) for level design (layout), lighting, environment art (often with separate artists working on reusable texture art and 3D assets), etc., as well as people whose entire job is to keep everything consistent now that there are way more people involved in the process.
It means more powerful hardware to run the game, essentially.
Yes, it also means that, but this doesn't affect the number of people needed to make a game; all it affects is the ceiling of what a game can do.
Yes, in 2 decades we are using more stuff, for example more textures for more variety, but we are also reusing more stuff between projects, so it evens out.
No, very few things that actually affect the number of people who'd work on a game are shared between projects, as often the time between these projects is long enough for production and quality standards to have increased to the point where you can't use the vast majority of the existing assets.
For example, New Vegas reused almost everything from Fallout 3, so even if, let's say, Fallout 3 was expensive to make, the fact that the assets were used for 2 games + DLC more than made up for the cost...
New Vegas reusing assets from Fallout 3 is something that rarely happens in the vast majority of "AAA" games - and New Vegas still had about the same number of artists working on it as Fallout 3 did. Besides, using Bethesda as some sort of indicator for team sizes (or development practices in general) in AAA games is misleading: their team size of 100 up until they worked on Skyrim was considered an anomaly for a AAA-budget game - and that was 8 years ago.
I think the video game industry will crash. This is no joke.
This will not happen; there are way too many developers, located all over the world, for any "crash" to happen - even the supposed "video game crash" of the 80s (which happened in a *significantly* smaller industry) only affected consoles and was limited to the US, while computer games and other countries were not affected.
At most, some big publishers will decide to stop making big expensive games (e.g. what Konami decided to do, but at a larger scale) and some developers will close shop (something that already happens to less profitable developers anyway). A natural ceiling might be hit, but nothing that will affect the entire industry at large.
Now, don't take all this as me agreeing with the increasing game prices or anything like that, or even as me agreeing with or liking those huge development costs (IMO the sweet spot is around the level/size of Spiders or Piranha Bytes, though I'm not sure how much crunch they have to endure to make their games as good-looking as they are), but when it comes to "AAA" games with state-of-the-art graphics (i.e. what all big publishers and developers are striving for) there is absolutely an increase in development costs, driven largely by the need for more and more detailed assets.
I mean, check the credits for The Last of Us Part II as an example of a recent "AAA" game (I guess there might be a spoiler in the first minute or so; I haven't played the game or listened to the video with audio, so I don't know - skip to ~1:40 to be sure). By far the most roles were about the game's assets - artists, animators, level designers, voice actors, etc. Even the outsourced work was primarily about assets. All these are people who needed to be paid over the entire game's development (the outsourced workers aside, but that work had a lot of cost as well; it isn't like it was free), and more people means higher costs.