

Decline why are so many games CPU bound?

Discussion in 'General Gaming' started by cretin, Aug 6, 2020.

  1. cretin Learned

    cretin
    Joined:
    Apr 20, 2019
    Messages:
    469
    Right. I can run W3 on ultra at 1080p at the frame cap of 60fps, but occasionally some scenes, especially dense forest in early morning or at night, will drop the game down to 50 or even as low as 45 fps. I have a GTX 1660 Ti with 6GB of VRAM, 16GB RAM, and an i7-9750H (6 cores @ 2.6GHz). The weakest link here is obviously the CPU, but the question is, why does it fucking matter?
     
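For scale, those frame-rate numbers translate into per-frame time budgets like so — a quick back-of-the-envelope sketch (plain arithmetic, nothing game-specific beyond the numbers in the post):

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to produce one frame at a given fps."""
    return 1000.0 / fps

cap = frame_budget_ms(60)   # ~16.7 ms per frame at the 60 fps cap
slow = frame_budget_ms(45)  # ~22.2 ms per frame during the drops

# A drop from 60 to 45 fps means each frame takes ~5.6 ms longer.
# If the GPU still finishes its part on time, that extra time is being
# spent on the CPU side (simulation, AI, draw-call submission) before
# the GPU can even start on the frame.
extra_ms = slow - cap
```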
  2. mk0 Educated

    mk0
    Joined:
    Jun 28, 2020
    Messages:
    113

    >complains about CPU performance
    >has a gaming laptop

    :happytrollboy:
     
  3. daveyarsegallant Cipher Vatnik

    daveyarsegallant
    Joined:
    Jan 29, 2019
    Messages:
    2,618
    Location:
    Niggeria
    I had been using the same quad-core CPU for ages until my entire motherboard burned out. Changing graphics cards was what allowed me to keep gaming. The bottleneck is the 3D card, not the CPU.
     
    • Agree x 2
    • Informative x 1
  4. Citizen Arcane

    Citizen
    Joined:
    Sep 10, 2019
    Messages:
    1,320
    Location:
    iggy bin
    Some shit-coded games still use a single CPU core in 2020, but apart from those, most games are usually GPU heavy while pretty light on CPU usage. IDK what games you're playing - Aurora 4x?
     
    • Agree x 1
  5. Twiglard Savant

    Twiglard
    Joined:
    Aug 6, 2014
    Messages:
    980
    Location:
    wild pale yonder
    The CPU is a bro that can do pretty sophisticated stuff. The GPU is ten thousand deeply retarded illiterate people. Each is good for different purposes, and sometimes the latter is what you need. If you wanted a GPU that could be used as a CPU, it could no longer afford thousands of threads.
     
    • Informative x 2
    • Funny x 1
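That division of labour can be sketched in code — a toy illustration (Python, all names mine): GPU-friendly work is the same dumb operation applied to a huge pile of independent items, while typical CPU work is a chain of dependent, branchy decisions that extra threads can't help with.

```python
def gpu_style(pixels):
    """Embarrassingly parallel: every item is independent, so thousands
    of simple 'threads' could each take one pixel with no coordination."""
    return [min(255, p * 2) for p in pixels]  # same op, no data dependencies

def cpu_style(events):
    """Serial and branchy: each step depends on the state left by the
    previous one, so more cores don't help but smarter cores do."""
    state = "idle"
    for e in events:
        if state == "idle" and e == "enemy_seen":
            state = "alert"
        elif state == "alert" and e == "enemy_lost":
            state = "search"
        elif state == "search" and e == "timeout":
            state = "idle"
    return state
```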
  6. DalekFlay Arcane Patron

    DalekFlay
    Joined:
    Oct 5, 2010
    Messages:
    12,739
    Location:
    New Vegas
    Why wouldn't it matter? The GPU does dominate PC performance most of the time, but some people mistake this for thinking the CPU doesn't matter. Of course it matters, especially in open world games like Witcher 3 that have a lot of items and moving parts working at once. Those are also the kinds of games benefiting more from multi-core nowadays, like Hitman and Assassin's Creed.

    The 1660 Ti isn't some amazing card either, btw. Gets the job done, but it's not built for ultra settings like you're using. While that is a low CPU clock, I wouldn't be surprised if some of the high-end ultra effects are also hitting the GPU a little hard here and there, in complex areas. The 2060 is more built for 1080p ultra settings.
     
    • Acknowledge this user's Agenda x 1
  7. CyberWhale Arcane

    CyberWhale
    Joined:
    Mar 26, 2013
    Messages:
    3,940
    Location:
    Fortress of Solitude
    The 1660 Ti is good enough; people simply need to stop using autistic Ultra settings and use custom ones instead. Volumetric fog/clouds/lights and reflections can be outrageously taxing without showing easily-noticeable improvements (especially in motion and when actually playing the game).
     
    • Yes x 1
  8. DalekFlay Arcane Patron

    DalekFlay
    Joined:
    Oct 5, 2010
    Messages:
    12,739
    Location:
    New Vegas
    Oh, it's certainly good enough to run the game; I was saying exactly what you're saying. It's not made for all-ultra settings.
     
  9. Average Manatee Prestigious Gentleman Arcane

    Average Manatee
    Joined:
    Jan 7, 2012
    Messages:
    10,518
    It's fairly difficult to scale CPU usage in a game up or down. What the CPU is doing actually matters, like tracking the state of the game world and AI decision making. You can't fuck with these things too much or games will break: characters will stop functioning or be retards running into walls, physics will break and things will fall through the floor, and so on.

    For GPUs it's the opposite: you can render in 640x480 or 3840x2160, and the same goes for any other graphical effect. Theoretically speaking, most games don't need a GPU at all to function; they just need the GPU to show what they are doing to you, the dumb meat bag who has to look at a monitor to understand what's going on. Functionally, everything the GPU does is cosmetic, and therefore it's easy to change graphical quality settings without changing the way the game actually functions. This means games end up designed around a specific bare-bones CPU level (typically whatever consoles can handle) and don't scale well above that.

    I'm firmly convinced that lots of Ultra settings literally do nothing nowadays and are strictly there to give the people who paid $1000 for a GPU something to burn cycles on.

    Keep in mind that the video is with SMT off. I'm not sure why they did that, since it would normally be on. This means that rather than having e.g. 4 cores/8 threads you just have 4 cores/4 threads. I bet with SMT on you could shift all of those results at least one spot to the right, if not two.

    Despite this, they had to pair a >$1000 GPU with a $500 CPU and then cut out 1/3rd of the cores just to start noticing a difference in benchmarking.
     
    Last edited: Aug 9, 2020
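The asymmetry described above — fixed CPU simulation, scalable GPU rendering — is baked into the classic fixed-timestep game loop. A minimal sketch (hypothetical code, assuming the common accumulator pattern; real engines add interpolation on top):

```python
TICK = 1.0 / 30.0  # simulation rate is fixed (the console-level CPU budget)

def run(frame_times, update, render):
    """Advance the simulation at a fixed tick regardless of render cost.
    `frame_times` is how long each rendered frame took; `update` runs
    game logic (CPU), `render` draws at whatever quality (GPU)."""
    acc = 0.0
    ticks = frames = 0
    for dt in frame_times:
        acc += dt
        while acc >= TICK:  # CPU work can't be skipped or scaled down
            update()
            acc -= TICK
            ticks += 1
        render()            # GPU work can be cheapened (resolution, effects)
        frames += 1
    return ticks, frames
```

However cheap or expensive rendering is made, the number of `update` ticks depends only on elapsed time — which is why graphics settings scale freely while the CPU workload stays pinned to whatever tick rate the game was designed around.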
  10. DalekFlay Arcane Patron

    DalekFlay
    Joined:
    Oct 5, 2010
    Messages:
    12,739
    Location:
    New Vegas
    I think they usually make a visual difference for things like shadows and whatnot. Playing AC: Origins right now, and shadows definitely pop in closer to the camera with very high selected instead of ultra. The problem with ultra settings IMO is that they're never optimized at all. They're small upgrades for huge performance penalties.
     
  11. Nifft Batuff Arbiter

    Nifft Batuff
    Joined:
    Nov 14, 2018
    Messages:
    1,116
    The bottleneck is the memory bandwidth for both CPU and GPU.
     
  12. cretin Learned

    cretin
    Joined:
    Apr 20, 2019
    Messages:
    469
    I turned the shadows down in Witcher 3 from ultra to high and left everything else maxed out. Couldn't tell the visual difference for the life of me, but gained a minimum of ~10 fps.
     
    • Cheers!! x 1
  13. Twiglard Savant

    Twiglard
    Joined:
    Aug 6, 2014
    Messages:
    980
    Location:
    wild pale yonder
    Latency. Bandwidth is fine.
     
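Whether latency or bandwidth is the bottleneck comes down to whether each memory access depends on the previous one. A toy sketch of the two access patterns (Python, all names mine; it illustrates the dependency structure rather than real timings — on hardware, the chase is latency-bound because the next address is unknown until the current load completes, while the stream is bandwidth-bound because the prefetcher can run ahead):

```python
import random

def build_chain(n, seed=0):
    """A random permutation used as a linked list: each slot holds the
    index of the next slot, forming one cycle through all n slots."""
    rng = random.Random(seed)
    order = list(range(n))
    rng.shuffle(order)
    nxt = [0] * n
    for a, b in zip(order, order[1:] + order[:1]):
        nxt[a] = b
    return nxt

def chase(nxt):
    """Dependent loads: load k+1 can't start until load k finishes.
    This is the pattern that hurts when latency is the bottleneck."""
    i, steps = 0, 0
    for _ in range(len(nxt)):
        i = nxt[i]
        steps += 1
    return i, steps

def stream(data):
    """Independent sequential loads: the hardware prefetcher can run
    ahead, so throughput is limited by bandwidth, not latency."""
    return sum(data)
```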
  14. Viata Arcane

    Viata
    Joined:
    Nov 11, 2014
    Messages:
    6,344
    Location:
    Water Play Catarinense
    I have an old notebook with a shitty GPU (and CPU) and I have zero problems with gaming because I don't play shit modern games. :mixedemotions:
     
    • Salute x 1
    • Old x 1
