
So this is how the Xbox dies...

Discussion in 'General Gaming' started by Bradylama, Jun 11, 2013.

  1. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin

    Gotta pack up boys, we can't provide the hardware before the software is there.

    :retarded:

    AI can be extremely parallel. It's practically one of the biggest examples. Say you play chess against an AI opponent. He has x moves to make. Each branch of moves can be calculated and prioritised in parallel, cutting down on decision-making time.
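    The root-splitting idea above can be sketched in a few lines. This is a toy "take 1-3 stones, last stone wins" game rather than chess, and every name here (minimax, score_move, best_move) is illustrative, not from any real engine — but the shape is the one being claimed: each candidate move at the root is an independent job.

```python
from concurrent.futures import ThreadPoolExecutor

def minimax(stones, maximizing):
    """+1 if the position favours the maximizing side, else -1."""
    if stones == 0:
        # The previous player took the last stone and won.
        return -1 if maximizing else 1
    scores = [minimax(stones - take, not maximizing)
              for take in (1, 2, 3) if take <= stones]
    return max(scores) if maximizing else min(scores)

def score_move(stones, take):
    # After our move, the opponent (the minimizing side) is to play.
    return take, minimax(stones - take, maximizing=False)

def best_move(stones):
    moves = [t for t in (1, 2, 3) if t <= stones]
    # Each root branch is an independent job -- this is the part that
    # parallelises, whether across threads, cores, or GPU lanes.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda t: score_move(stones, t), moves))
    return max(results, key=lambda r: r[1])[0]

print(best_move(10))  # 2 -- leaving 8 stones puts the opponent in a lost position
```

    A real chess engine does the same root splitting at vastly larger scale; whether GPU lanes are a good fit for that workload is exactly what the rest of the thread argues about.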

    I wouldn't be so sure.

    Maturity?
     
    • Brofist Brofist x 1
  2. MetalCraze Arcane

    MetalCraze
    Joined:
    Jul 3, 2007
    Messages:
    21,104
    Location:
    Urkanistan
    Keep waiting then.

    What does parallel computing or "chess" (which is the simplest form of AI and thus is the worst example) have to do with GPUs not being CPUs?
    Bro you do realize that GPUs cannot perform the absolute majority of tasks that CPUs can? AI being one of them.

    After seeing the lineups there's no doubt. Graphics update. Physics moved nowhere. AI is non-existent.


    Maturity?
     
  3. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin
    :hmmm:

    Maturity.
     
  4. MetalCraze Arcane

    MetalCraze
    Joined:
    Jul 3, 2007
    Messages:
    21,104
    Location:
    Urkanistan
    And here I thought you would totally school me by posting examples of advanced AI running on GPU (cuz CPU is just a dated weakass concept that those PC tards keep holding onto for some reason)

    So hold on to that hope that one day xbone and PieceofShit 4 will "mature" (considering that with each year they will fall behind PC even more)
     
  5. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin
    If only you realised I was talking about parallel processing, which, again, AI can use. The example is quite reasonable, and the fact that you think chess AI is simple is laughable. Oh well.

    The HSA capabilities will stay ahead of where the average PC is for a couple of years still. If anything the PC will hold back opportunities in the development industry. The PC has raw power, but the consoles have a revolutionising architecture that PCs simply cannot compete with until they adopt it.
     
  6. Infinitron I post news Patron

    Infinitron
    Joined:
    Jan 28, 2011
    Messages:
    82,836
    Grab the Codex by the pussy Dead State Divinity: Original Sin Project: Eternity Torment: Tides of Numenera Wasteland 2 Shadorwun: Hong Kong Divinity: Original Sin 2 A Beautifully Desolate Campaign Pillars of Eternity 2: Deadfire Pathfinder: Kingmaker
    :lol: By the time developers even learn how to use those "revolutionary architectures" they'll no longer be, well, revolutionary.

    The first generation of games for these consoles will be PS3/X360 games with better textures.
     
  7. Average Manatee Prestigious Gentleman Arcane

    Average Manatee
    Joined:
    Jan 7, 2012
    Messages:
    10,233
    Here's the problem: PCs do it in realtime too. We've had compute-powered physics in games for a long time.

    The hardware has existed for (checks wikipedia) over half a decade.

    Chess is entirely different from most game AI. Chess is a game with very simple rules and a very large search space. Most video games instead have a very complex (compared to chess) ruleset with a smaller search space. Indeed, most video game AI doesn't even use the concept of a search space because it would be so radically inefficient.
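    The distinction above — complex rules, no search — is what most shipped game AI looks like in practice. A minimal sketch of that rule-based style, with all field names and thresholds invented for illustration: the agent maps its current, observable situation straight to an action, never expanding a tree of future states.

```python
def guard_ai(state):
    """Pick an action from the current situation only -- no search."""
    if state["health"] < 25:
        return "flee"             # survival rule overrides everything
    if state["player_visible"]:
        if state["distance"] <= 2:
            return "attack"
        return "chase"
    if state["heard_noise"]:
        return "investigate"
    return "patrol"

print(guard_ai({"health": 80, "player_visible": True,
                "distance": 5, "heard_noise": False}))  # chase
```

    Because each rule encodes the designer's intent directly, there is no large search space to parallelise — which is why the chess analogy transfers poorly to most video games.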

    7790 is already at most a mid-level card that struggles to keep up with console games ported to PC with some extra detail and resolution added in. Now you want it to handle a true PC-level or *next-gen* game AND do GPGPU functions? Yeah, it just can't handle that.

    Look, IDtenT. It's clear you know nothing about systems architecture or program development. You are no different from the *expert journalists* who until 2012 were still repeating facts from their *trusted friends at Microsoft/Sony* about how the 360 and PS3 were still more powerful than most PCs. You're quite literally the console equivalent of a hopeless Apple iTard who masturbates to every Apple commercial they see and actually believes the marketing bullshit. It's really pathetic. I feel embarrassed even responding to you. Please shut up. You are not equipped to take part in this conversation.
     
    • Brofist Brofist x 2
  8. Cowboy Moment Arcane

    Cowboy Moment
    Joined:
    Feb 8, 2011
    Messages:
    4,394
    It's like that thing when Morgoth and Xi talk about hardware. Chess is actually a very bad example of how a GPU can help with AI, because it's essentially tree traversal, a task involving a ton of recursion, which modern GPU compute units are very bad at (a compute unit has in the ballpark of 64 KB of local memory), as opposed to a general purpose CPU, which has all the memory it wants available, and is optimized for call/ret efficiency. Chess is possibly manageable because of the extremely small gamestate memory footprint (64 bytes, and it can be compressed further at the cost of convenience), but, say, AI in a Paradox game would be impossible to run effectively on a GPU.
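    The 64-byte figure comes from the classic "mailbox" encoding: one byte per square of the 8x8 board. A minimal sketch — the piece codes here are arbitrary placeholders, not from any particular engine:

```python
import array

# Illustrative piece codes; a real engine defines its own encoding.
EMPTY, W_PAWN, W_ROOK = 0, 1, 2

board = array.array("B", [EMPTY] * 64)   # 64 unsigned bytes, one per square
board[0] = W_ROOK                        # a1
for f in range(8):
    board[8 + f] = W_PAWN                # white pawns on rank 2

print(board.itemsize * len(board))       # 64 -- the whole game state in 64 bytes
```

    Bitboard engines pack the state tighter still (twelve 64-bit masks plus flags), which is what "can be compressed further at the cost of convenience" refers to.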

    What you can do on a GPU are simple, discrete tasks - pathfinding, collision detection, fish floating around randomly and running away when the player gets close, etc. So basically, you'll see a lot of AssCreed-like crowds in games if this AI-on-GPU thing takes off.
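    The "fish" case is worth spelling out, because it shows exactly the shape GPUs like: every agent's update reads only its own tiny state, so thousands can run in lockstep with no synchronisation. A plain-Python stand-in for the per-agent kernel, with all fields and thresholds invented for illustration:

```python
import random

def update_fish(fish, player_x, player_y, rng):
    """One agent's step: flee the player if close, else drift."""
    dx, dy = fish["x"] - player_x, fish["y"] - player_y
    if dx * dx + dy * dy < 25.0:          # player within 5 units
        fish["x"] += dx * 0.5             # swim directly away
        fish["y"] += dy * 0.5
    else:
        fish["x"] += rng.uniform(-0.2, 0.2)
        fish["y"] += rng.uniform(-0.2, 0.2)
    return fish

rng = random.Random(0)
school = [{"x": float(i), "y": 0.0} for i in range(1000)]
# No fish reads another fish's state, so this map could run one agent
# per GPU thread -- the AssCreed-crowd pattern in miniature.
school = [update_fish(f, 0.0, 0.0, rng) for f in school]
print(school[2]["x"])   # 3.0 -- the fish at x=2 fled half its distance again
```

    The moment agents need to read each other's state or shared global data, this embarrassing parallelism evaporates — which is the Paradox-game objection in a nutshell.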
     
  9. Metro Arcane Beg Auditor

    Metro
    Joined:
    Aug 27, 2009
    Messages:
    26,817
    Oh man, you can get an Nvidia 650Ti for around $160-170ish now. Six months from now I'm sure it'll be closer to $120.
     
  10. Malpercio Arcane

    Malpercio
    Joined:
    Dec 8, 2011
    Messages:
    1,414
    Quite honestly, videogame players are so stupid and eager to get fucked in the ass (see SimCity, always-online, day-one DLC, and everything software houses pulled this gen) that I wouldn't be that surprised if this shit succeeded.
     
    • Brofist Brofist x 2
  11. Average Manatee Prestigious Gentleman Arcane

    Average Manatee
    Joined:
    Jan 7, 2012
    Messages:
    10,233
    Radeon 7790 is at ~$130 now. And the CPU on the PS4 is almost dogshit bad, being a low power laptop model.
     
  12. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin
    Really? So the GPU and CPU share the same address space already? I did not know that. :roll:

    Nope. HSA hasn't.

    It's options that are available to whatever game logic exists. More options is suddenly a bad thing.

    They're going to stick with 1080p at 60 FPS for now, and will probably move to 30 FPS later. The nice thing about consoles is that they can balance the game to whatever the developer's needs might be. Not all games are shooters in any case.

    64 KB? Is that how much memory each compute unit can address? Regardless, the recursion is simple and probably needs far less memory. Each tree has multiple branches, which makes it ideal if you can do them in parallel. The results are then interpreted by the CPU to see which one is more optimal or whatever. Considering the number of characters in a Paradox game and potentially running each one on its own thread, I'm not sure I agree.

    Right. We shall see.
     
  13. Cowboy Moment Arcane

    Cowboy Moment
    Joined:
    Feb 8, 2011
    Messages:
    4,394
    Each compute unit (essentially a cluster of stream processors) has around that much local memory, think of it as a cache of sorts. Graphics processing is typically very local - to use a simple example, MSAA only requires that you know the state of a small set of pixels adjacent to the pixel you're processing. Thus, GPUs are optimized for a workflow where a small amount of data is fetched into the local memory, and the compute unit runs on that in parallel. When you need to constantly query global memory (the GDDR5 in the case of a PS4 for example), the whole thing becomes very slow. This is one of the reasons we don't really have proper global lighting models in games, and why ray-tracing doesn't work very well on GPUs. But I digress.
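    The locality argument above can be made concrete with the simplest possible per-pixel kernel. This is a plain-Python stand-in, not GPU code: a 3x3 box blur needs only the pixel's immediate neighbours, so a work-group can fetch one small tile into its local memory and never touch global memory again.

```python
def box_blur(img, x, y):
    """Average the 3x3 neighbourhood around (x, y); edges are clamped."""
    h, w = len(img), len(img[0])
    total = count = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            # Clamp to the image border so corners still have 9 samples.
            nx = min(max(x + dx, 0), w - 1)
            ny = min(max(y + dy, 0), h - 1)
            total += img[ny][nx]
            count += 1
    return total / count

img = [[0, 0, 0],
       [0, 9, 0],
       [0, 0, 0]]
print(box_blur(img, 1, 1))   # 1.0 -- only the 3x3 tile was ever read
```

    Contrast this with ray tracing or global illumination, where any pixel may need geometry from anywhere in the scene — that's the scattered global-memory access pattern that makes GPUs slow.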

    You can efficiently run AI on a GPU the way you're describing if you can fit the game-state info you need in the local memory. I'm pretty sure that CK2 characters need more info than that to make their decisions. Also, their logic itself is likely complicated enough to make the floating-point-focused GPU processors a bad choice in any case.

    We will. Frankly, even if you could do this, I doubt any AAA developers will. Waste of time considering their audience.
     
  14. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin
    The CPU and GPU will share a cache on HSA (probably L3/L2 only, but it's still a huge improvement). Whether the cache will prove too small for the amount of data being generated remains to be seen.

    The logic can be offloaded onto the CPU once the data has been generated. This can happen organically all the time, with the two sharing the same address space. Basically play to the advantages of each architecture without having to fetch in between. Again the entire point of HSA.
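    A rough analogy for the zero-copy point being made here, with no actual GPU involved: passing a memoryview hands the consumer the *same* buffer (a shared address space), while bytes(...) duplicates it first (the explicit fetch between discrete memory pools that HSA is meant to remove). Illustrative only.

```python
data = bytearray(16 * 1024 * 1024)   # a 16 MB "frame" of work

view = memoryview(data)              # shared: no bytes are moved
copy = bytes(data)                   # discrete: the full 16 MB is duplicated

view[0] = 42
print(data[0], copy[0])              # 42 0 -- the view aliases, the copy doesn't
```

    At 60 FPS, duplicating even a few megabytes per frame between CPU and GPU pools adds up; removing that round trip, not raw throughput, is the claimed win.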

    It's true that the offloading that has happened with AI so far is simple stuff like path-finding (or fixed rule sets, like chess), but that's because it was handled in its entirety by the GPU. HSA gives you more options.

    The failures of the system will be addressed in future revisions (obviously not for the consoles), but I honestly don't understand how people like Skyway can reject it offhand without ever seeing it in action. There will obviously be lots of teething problems and future software will try to shoehorn into the console restrictions - but that's fine.
     
  15. Gurkog Erudite

    Gurkog
    Joined:
    Oct 7, 2012
    Messages:
    1,373
    Location:
    The Great Northwest
    Project: Eternity
    I use a 4870 in my computer and it can process modern ports at 2560x1600 with the highest-resolution textures just fine, but it can't also do AA and/or high-detail shadows at that resolution. I don't use AA at 2560x1600 because it doesn't make a noticeable difference, but the low-quality shadows can be irritating.

    A 7790 should not have any problem rendering 1080p with good shadows, some AA, and some physics shit. Unless there is not much difference between the 4870 and 7790.
     
  16. Black_Willow Arcane

    Black_Willow
    Joined:
    Dec 21, 2007
    Messages:
    1,865,290
    Location:
    Borderline
    According to this, a 4870 should get under 3300 points in 3DMark 11, while a 7790 will get about 5700 points. The difference is substantial.
     
  17. Dickie Arcane Patron

    Dickie
    Joined:
    Jul 29, 2011
    Messages:
    2,392
    I don't see the big deal with paying to play used games. The only people who lose are GameStop. Fuck those guys. Personally, I love the idea of installing a game and then throwing the disc away. That said, I probably won't buy an XBox One.
     
    • Brofist Brofist x 1
  18. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin
    Huh? The point is a decrease in latency (in some cases a massive one) when passing data from the CPU to the GPU (and saving memory space in the meantime). This can enable more complex algorithms than we are used to within a pseudo-real-time tolerance level. Who said anything about doubling speeds? Are you purposefully bringing up a strawman?
     
  19. Kirtai Augur

    Kirtai
    Joined:
    Sep 8, 2012
    Messages:
    1,124
    Having read a bit about HSA, is there any real difference between it and what the Amiga did with its custom chips? (Other than using virtual addresses instead of physical ones that is)
     
  20. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin
    It will give a performance benefit to the system, not the GPU. The act of sharing tasks will happen with lower latency - fact. Whether the system will use it is irrelevant. It won't ever decrease performance, and it only opens up opportunities for increased performance (and in some cases major improvements). I do not presume to know exactly which real-world cases these might be, but it's easy enough to think up test cases.

    Claiming that an on-die GPU is a step backwards is retarded. It's not replacing dedicated GPUs or other slot-in cards. It is not about resource cutting, because today's computers have more resources than ever. It's about creating a hydrogenous design for central computation.
     
  21. LeJosh Savant

    LeJosh
    Joined:
    Feb 23, 2013
    Messages:
    434
    Location:
    Edinburgh
    Hydrogen powah!
     
    • Brofist Brofist x 1
  22. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin
    :lol: Whoops. *cough* Heterogeneous.
     
  23. MetalCraze Arcane

    MetalCraze
    Joined:
    Jul 3, 2007
    Messages:
    21,104
    Location:
    Urkanistan
    Except, for some very obvious reason, your magical AMD Fusion failed to defeat the laws of physics despite what was promised by Sony, M$ and Carmack. It runs next-gen games like shit, and they will have to scale down the graphics from everything they've shown on PC at E3.

    Case closed.
     
  24. IDtenT Contact me for a good time Patron

    IDtenT
    Joined:
    Jan 21, 2012
    Messages:
    10,887
    Location:
    South Africa
    Divinity: Original Sin
    No shit.
     
  25. Frian Bargo Kosmonaut's Alt

    Unwanted
    Joined:
    Mar 20, 2012
    Messages:
    282
    Msoft has gone back on the DRM and always-online policies. Now the Xbox 720 will function similarly to the PS4. Also, renting games will be possible again.

    Sure it's $100 more expensive...but when 2 games cost $120 that doesn't really matter does it.

    Anyhow, we all know when Google's gaming box comes out it will trump all. Not to mention Apple's igaming box...
     