LlamaGod said:
"Great" graphics always devolve into "poor" graphics as tech improves, but great gameplay always stays great; it saddens us that the majority of game reviewers don't seem to care.
(Another overlong WBC screed.)
While I suppose the second half of the statement is true (more on that in a second), the first half certainly isn't. Are Super Mario World's graphics any less great now than they were when the game was released? Are Sacrifice's? Grim Fandango's? Symphony of the Night's? Freespace II's? To be sure, some technical limitations will always get pushed back, but that only means that graphics that were impressive as *tech demonstrations* have a fairly short shelf life. For graphics that were impressive as works of art, that issue doesn't arise, or at least going "stale" doesn't hurt them much. Where an artist develops an art form that is not defined by the size of its palette, the density of its resolution, or the number of polygons it's pushing, the art he creates within that form is likely to have enduring quality.
Moreover, there's something presumptuous (and rather silly) about thinking of games as anything more than ephemera. The hardware on which Atlus's games are played becomes unavailable in a matter of years, a decade at most; the games themselves scratch, break, whatever. Defending your development choices on the basis that you've maximized the game's long-term value is silly economics, then; it's also a shoddy way to treat your fanbase, insofar as you're basically discounting their interests to benefit future, as-yet-nonexistent players. Consider this scenario: Developer says it is making its game graphically "scalable" so that it will improve as hardware improves. Developer concedes that this results in suboptimal framerates on current systems because the game isn't tailored to current hardware. Nevertheless, Developer justifies the game's poor performance on the grounds that "in the future, it will still look good." Surely we would say that was obnoxious.
So, if good graphics now (despite their fading nature) would make for a better game now than good gameplay now would, I would say graphics should receive priority, even if that means the game won't be as cool in ten years.
The point that "great gameplay always stays great" is sort of true. It's true in the same sense that great graphics always stay great (as in the examples I pointed out above, or even older ones, like Contra). But gameplay that *seems* good is often revealed to be lousy with the passage of time. Warcraft I seemed to have great RTS gameplay (to fans of the genre; I don't want to debate the merits of RTS), but that was shown not to be so once Warcraft II and then Starcraft came out. Warcraft I's gameplay is no longer "great" to anyone familiar with contemporary RTS games. Super Mario Bros. seemed to have "great" gameplay, and indeed the gameplay is still very good. But if you play it today, there are lots of frustrating elements, from the weird jumping mechanics to the relatively homogeneous levels. Doom was brilliant and remains solid, but its gameplay is weak compared to what you find in current FPS games (especially on the multiplayer side). If you replicated Doom with today's top-end graphics, I'm sure it would be well received, but I'm also sure that most people wouldn't find its gameplay "great." I could go through this for standout games in every genre.
It's true that games that were innovative still seem remarkable, but they aren't necessarily as fun to play as they once were.
I'll also add that the games where the gameplay seems to have stood the test of time also tend to be the ones where the graphics still look good. (This may be less true on the RPG side.)
--EDIT--
@ headache:
Just noticed your sig. "[A] doubling in polygon count means a doubling in the amount of time an artist needs to spend generating the model . . . ." I assume you're quoting this for comic effect, since it's not just a little bit wrong, but hilariously wrong.