Telengard's post outlining the change in investment attitude is great.
I think another important part of this discussion is hardware. In the 80s and early-to-mid 90s, hardware advancements, especially the jump from VGA to SVGA and beyond plus increases in RAM and storage tech, contributed to the fast pace of advancement in games. This advancement was multi-fold. Increased screen color depth and resolution changed how art was developed for games, and growing data storage densities (hard drives, CDs) influenced the production values of games. With the adoption of high-capacity CDs, developers started to introduce things like voice-over and FMV. This brought gaming closer to a cinematic experience, whether the current gen knew it was happening or not.
Crucial to my armchair analysis is the premise that hardware advances drove game development in the 80s and early-to-mid 90s. This is the era when the SNES and PSX, and eventually the PS2 and original Xbox, became gaming sensations. Once hardware advances stopped driving development, devs quit pushing the envelope, started taking advantage of the static console hardware, and adopted a low-risk development holding pattern.
Think about it: it's 2000 and you want to make a game that will sell well. You do all the monetary analysis that Telengard discussed. You have the benefit of a well-defined install base of console users. Their hardware isn't changing, so you know exactly what memory, storage, and video-rendering capabilities you're dealing with. This console install base is a huge percentage of the market, with the remainder being highly differentiated across a multitude of hardware configs. There's a huge advantage to developing for the console (the lowest common denominator) as compared to the constantly evolving PC enthusiast community. Flash forward to the original Xbox generation, and you can really see the polarization. Xbox and PS2 sales are dwarfing PC gaming. You have a 100% defined hardware platform. You don't have to worry about differing drivers, graphics APIs (DirectX, OpenGL, etc.), sound cards, storage media, or interface (i.e. non-controller) problems. Your market is completely defined and makes up the plurality of potential sales.
Innovation stagnates to suit the environment (consoles). This, compounded with the higher cost of development, makes investors less prone to risk and more likely to go with the slam dunk. What's a slam dunk? A game with a defined sales base, i.e. consoles. I hate to sound like a broken record, but I think the stagnant nature of console hardware has a lot to do with the decline of gaming innovation. Hardware stagnation (i.e. increased specification reliability) combined with a huge market share makes the bean-counters push for safer development. Cue CoD 1-4, slash-em-up series 1 to 8, etc. With a well-defined target platform, huge market share, and tons of existing sales data to support a +1 development mentality, it's easy to see why there's next to no innovation in the market. Of course, I'm talking about games that will/would command any kind of significant market share.