Diablo169
Arcane
One of the main guys from the Dark Souls community rants about all the circle-jerking and fallout from E3.
Yeah, just forget that consoles always put out more performance than their hardware would on a PC. Continue to be a dumbfuck, Skyway.
but... BUT CARMACK SAID
Who gives a shit about the troops. Good thing MS isn't trying to cosy up to the Kwa's most retarded demographic. (Yes, I realise the irony of such a statement)
J_C said: Yeah, just forget that consoles always put out more performance than their hardware would on a PC.

Except when the raw performance is roughly half of what's high-end BEFORE the console is even released, there are serious limits on how much optimization can save it. There's a huge disparity, especially compared to the Xbox 360, which had a GPU comparable to the ATI X1800, one of the higher-end GPUs around at the time of the console's release.
"Eh. Last-generation consoles were pretty much leaving PCs in the dust performance-wise on release. You needed a seriously tweaked-out, several-thousand-dollar PC to get the same performance at the time. Of course, one year later, that advantage was mostly gone."

As I said just one post ago, last gen's consoles actually used fairly high-end components at time of release (the X1800 had only just hit the market in late '05 when the 360 came out). This time, not so much. If there is any advantage upon release, it's going to be gone even faster than last time.
"As I said just one post ago, last gen's consoles actually used fairly high-end components at time of release. This time, not so much. If there is any advantage upon release, it's going to be gone even faster than last time."

Maybe MS and Sony don't care that much this time. If it's true that they no longer lose money on console sales (not sure here), there isn't much incentive to wait for console costs to be amortized before adjusting the technical specs of newly released consoles. Of course, that only works to a limited extent, since they don't want to give up the "one fixed environment" advantage for developers.
If you were to buy a PC for the price of a console in 2005, you definitely would not be able to run the games that they run, on the settings that they run on. The uniformity of hardware plays a huge part in this and allows developers to optimize a great deal.
Seriously, you don't need to one-up consoles on everything.
"Except when the raw performance is roughly half of what's high-end BEFORE the console is even released, there are serious limits on how much optimization can save it. There's a huge disparity, especially compared to the Xbox 360, which had a GPU comparable to the ATI X1800, one of the higher-end GPUs around at the time of the console's release. Expect console limitations to hold games back even sooner this gen than they did last gen. Oh well, at least I won't need to replace my next laptop anytime soon."

Yeah, but take a look at what graphics they can achieve now with shitty 7-year-old hardware. While it falls short of a PC-exclusive graphics-whore game, a Forza 4 or Uncharted 3 looks pretty good for running on shitty hardware. Now I assume that with 10 times more RAM and a 10-times-faster GPU, they could pull off some pretty nice stuff.
"They had a combined operating profit of a billion dollars in FY 2009 and 2010. They wouldn't be in this market for 10 years if they still had to subsidise it."

It took a huge investment to get to this point. They were in the red for more than half of the Xbox division's life.
"If you were to buy a PC for the price of a console in 2005, you definitely would not be able to run the games that they run, on the settings that they run on. The uniformity of hardware plays a huge part in this and allows developers to optimize a great deal. Seriously, you don't need to one-up consoles on everything."
Herp derp, I don't think anyone here has said a ~$500 USD PC will outperform the $500 USD Xbone. What IS being said is that the Xbone isn't future-proof, it's barely present-proof, and that it won't take long at all for its hardware limitations to become a seriously limiting factor. Yes, optimization clearly helps, however certain people are overestimating how much it can do to make up for hardware that's old BEFORE the console even comes out. The GPU closest to the one in the 360 would have set you back $500 USD or more back in '05. The Xbone's today? Try ~$140 USD.
"Yeah, but take a look at what graphics they can achieve now with shitty 7-year-old hardware. While it falls short of a PC-exclusive graphics-whore game, a Forza 4 or Uncharted 3 looks pretty good for running on shitty hardware. Now I assume that with 10 times more RAM and a 10-times-faster GPU, they could pull off some pretty nice stuff."
The best-looking console games are essentially running at 720p, on medium settings, at 30 FPS. A mid-range PC can handle 1080p at high settings and ~45 FPS, while a top-end machine can handle 1440p at high settings and 60 FPS.
"What IS being said is that the Xbone isn't future-proof, it's barely present-proof, and that it won't take long at all for its hardware limitations to become a seriously limiting factor."

"Yeah, but the 'present' is defined by what the consoles can pump out. All of that spare power in our rigs won't mean shit when all the major devs are using the consoles as the baseline. There won't be any future tech."

No shit. However, guaranteeing a stagnant future is a rather twisted meaning for 'future-proof'.
"Just to give a bit of perspective: the 360 had a GPU comparable to one that cost ~$500 USD at the time of its release. If the Xbox One's card were comparable to something at the equivalent current price point (actually a bit cheaper), it would be the third-fastest GPU on that list. Instead, its hardware is way down at the bottom."

IIRC, it featured fancy tech that was beyond what PCs were getting (and actually never got). The ATI GPU had some kind of fancy buffer that was supposed to do 4x anti-aliasing for free, but it turned out you couldn't do some post-processing tricks with it, so most games didn't use it.
Denying that this will have any impact seems rather questionable.
"Yeah, but take a look at what graphics they can achieve now with shitty 7-year-old hardware. While it falls short of a PC-exclusive graphics-whore game, a Forza 4 or Uncharted 3 looks pretty good for running on shitty hardware. Now I assume that with 10 times more RAM and a 10-times-faster GPU, they could pull off some pretty nice stuff."

So fucking what? There are no games on earth in which you can't find a graphical bug, or record a frame where the engine messes up.