
Decline Why are so many games CPU-bound?

cretin

Arcane
Douchebag!
Joined
Apr 20, 2019
Messages
1,497
Witcher 3 is actually a perfect example of a CPU bound game.

Right. I can run W3 on ultra at 1080p at the frame cap of 60 fps, but occasionally some scenes, especially dense forest in early morning or at night, will drop the game down to 50 or even as low as 45 fps. I have a GTX 1660 Ti with 6GB of VRAM, 16GB of RAM, and an i7-9750H (6 cores @ 2.6GHz). The weakest link here is obviously the CPU, but the question is, why does it fucking matter?
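For scale (rough numbers, just 1000 ms divided by the frame rate): 60 fps means a frame budget of about 16.7 ms, while 45 fps is about 22.2 ms, so whatever those forest scenes add is eating roughly 5-6 extra ms per frame somewhere in the pipeline.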
 

mk0

Learned
Joined
Jun 28, 2020
Messages
113


>complains about CPU performance
>has a gaming laptop

:happytrollboy:
 
Vatnik Wumao
Joined
Jan 29, 2019
Messages
15,516
Location
Niggeria
I'd been using the same quad-core CPU for ages, until my entire motherboard burned out. Changing graphics cards is what allowed me to keep gaming. The bottleneck is the 3D card, not the CPU.
 

Citizen

Guest
Some shit-coded games still use just a single CPU core in 2020, but apart from those, most games are GPU-heavy while staying pretty light on CPU usage. IDK what games you are playing - Aurora 4x?
 

Twiglard

Poland Stronk
Patron
Staff Member
Joined
Aug 6, 2014
Messages
7,509
Location
Poland
Strap Yourselves In Codex Year of the Donut
If the GPU is what makes my computer a SUPPAAAAHHCOMPUTAAAH, why do CUTTING EDGE GRAPHICS games almost always tax the everliving fuck out of my CPU and barely make my GPU break a sweat? Why offload everything onto the weaker slave?

The CPU is a bro that can do pretty sophisticated stuff. The GPU is ten thousand deeply retarded illiterate people. Each is good for different purposes, and sometimes the latter is what you need. If you wanted a GPU that could be used as a CPU, it could no longer afford thousands of threads.
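To make that concrete, here's a rough C++ sketch (hypothetical names, not from any real engine) of the two shapes of work: branchy, stateful per-NPC logic that suits the "bro", versus the same dumb operation stamped across millions of pixels, which is what the ten thousand illiterates are for.

```cpp
#include <cstdint>
#include <vector>

// CPU-shaped work: data-dependent branching and per-entity state.
// (Hypothetical NPC update, purely illustrative.)
struct Npc { float hp; float aggro; int state; };

void update_npc(Npc& n) {
    if (n.hp <= 0.0f)        n.state = 0;  // dead
    else if (n.aggro > 0.5f) n.state = 2;  // attack
    else                     n.state = 1;  // wander
}

// GPU-shaped work: one trivial operation applied uniformly to a huge
// array, with no branching and no shared state (here: halve every
// channel of every packed RGBA pixel).
void darken(std::vector<uint32_t>& pixels) {
    for (uint32_t& p : pixels)
        p = (p >> 1) & 0x7F7F7F7Fu;
}
```

The second loop maps trivially onto thousands of dumb threads; the first one doesn't, which is why it stays on the CPU.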
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
Right. I can run W3 on ultra at 1080p at the frame cap of 60 fps, but occasionally some scenes, especially dense forest in early morning or at night, will drop the game down to 50 or even as low as 45 fps. I have a GTX 1660 Ti with 6GB of VRAM, 16GB of RAM, and an i7-9750H (6 cores @ 2.6GHz). The weakest link here is obviously the CPU, but the question is, why does it fucking matter?

Why wouldn't it matter? The GPU does dominate PC performance most of the time, but some people take that to mean the CPU doesn't matter. Of course it matters, especially in open-world games like Witcher 3 that have a lot of items and moving parts working at once. Those are also the kinds of games that benefit most from multi-core nowadays, like Hitman and Assassin's Creed.

The 1660 Ti isn't some amazing card either, btw. It gets the job done, but it's not built for the ultra settings you're using. While that is a low CPU clock, I wouldn't be surprised if some of the high-end ultra effects are hitting the GPU a little hard here and there in complex areas. The 2060 is more of a card built for 1080p ultra settings.
 

CyberWhale

Arcane
Glory to Ukraine
Joined
Mar 26, 2013
Messages
6,734
Location
Fortress of Solitude
1660Ti is good enough, people simply need to stop using autistic Ultra settings and use custom ones instead. Volumetric fog/clouds/lights and reflections can be outrageously taxing without showing easily-noticeable improvements (especially in motion and when actually playing the game).
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
1660Ti is good enough, people simply need to stop using autistic Ultra settings and use custom ones instead. Volumetric fog/clouds/lights and reflections can be outrageously taxing without showing easily-noticeable improvements (especially in motion and when actually playing the game).

Oh, it's certainly good enough to run the game; I was saying exactly what you're saying. It's not made for all-ultra settings.
 
Joined
Jan 7, 2012
Messages
15,254
It's fairly difficult to scale CPU usage in a game up or down. What the CPU is doing actually matters, like tracking the state of the game world and AI decision making. You can't fuck with those things too much or games will break: characters will stop functioning or turn into retards running into walls, physics will break and things will fall through the floor, and so on. For GPUs it's the opposite: you can render at 640x480 or 3840x2160, and the same goes for any other graphical effect.

Theoretically speaking, most games don't need a GPU at all to function; they just need it to show what they're doing on the monitor to you, the dumb meat bag who has to look at a monitor to understand what's going on. Functionally, everything the GPU does is cosmetic, so it's easy to change graphical quality settings without changing how the game actually functions. This means games end up designed around a specific bare-bones CPU level (typically whatever the consoles can utilize) and don't scale well above that.
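A minimal sketch of that split, with made-up function names (nothing from an actual engine): the simulation tick is a fixed cost set by the game's design, while the render call takes knobs you can turn down without changing how the game behaves.

```cpp
// Hypothetical game loop, purely illustrative of the point above.
struct World { /* entities, AI state, physics bodies ... */ };

// Fixed CPU cost per tick: AI, physics, world state. Simplifying this
// changes how the game *behaves*, so it can't be scaled down freely.
void simulate(World& w, float dt) { (void)w; (void)dt; }

// Cosmetic output: drop resolution or quality and the game still plays
// identically, it just looks worse.
void render(const World& w, int width, int height, int quality) {
    (void)w; (void)width; (void)height; (void)quality;
}

int main() {
    World world;
    const float dt = 1.0f / 60.0f;           // fixed simulation timestep
    int width = 1920, height = 1080;         // user-tunable
    int quality = 2;                         // user-tunable
    for (int frame = 0; frame < 1000; ++frame) {
        simulate(world, dt);                    // cost fixed by game design
        render(world, width, height, quality);  // cost scales with settings
    }
}
```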

1660Ti is good enough, people simply need to stop using autistic Ultra settings and use custom ones instead. Volumetric fog/clouds/lights and reflections can be outrageously taxing without showing easily-noticeable improvements (especially in motion and when actually playing the game).
I'm firmly convinced that lots of Ultra settings literally do nothing nowadays and are strictly there to give the people who paid $1000 for a GPU something to burn cycles on.

Truth be told, some newer games do seem to perform better with more cores, so the CPU can indeed become a bottleneck if you have a 4-core processor.
Switching to 6 at minimum, or preferably 8, will decrease the chance of having issues, especially in the future.

4-5-year-old mid-range cards are completely sufficient because the original 8th-generation consoles are still the baseline for developing games; the PRO/X models are simply there to push 4K.
Because of the latter, I think many people playing at 1080p won't need an upgrade, especially if they stick with mid settings and no ray tracing.



The biggest difference seems to be between 4 and 6 cores (not surprising, since the consoles, despite having 8 cores, keep at least one reserved for the OS), but even 8 seems to be beneficial in Ubisoft tower-exploring games.
Yes, I know, the video doesn't state the architecture or frequency, but still.


Keep in mind that the video was run with SMT off. I'm not sure why they did that, since it would normally be on. It means that rather than having e.g. 4 cores/8 threads you just have 4 cores/4 threads. I bet with SMT on you could shift all of those results at least one spot to the right, if not two.

Despite this, they had to pair a >$1000 GPU with a $500 CPU and then cut out a third of the cores just to start noticing a difference in the benchmarks.
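If you want to sanity-check what your own machine exposes after fiddling with SMT, standard C++ will report the logical thread count (a trivial sketch, nothing game-specific): a 4-core/8-thread part reports 8 with SMT on and 4 with it off.

```cpp
#include <iostream>
#include <thread>

int main() {
    // Reports logical threads as seen by the OS: 8 on a 4c/8t CPU with
    // SMT on, 4 with SMT off. May return 0 if it can't be determined.
    std::cout << std::thread::hardware_concurrency() << " logical threads\n";
}
```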
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
I'm firmly convinced that lots of Ultra settings literally do nothing nowadays and are strictly there to give the people who paid $1000 for a GPU something to burn cycles on.

I think they usually make a visual difference for things like shadows and whatnot. I'm playing AC: Origins right now and shadows definitely pop in closer to the camera with Very High selected instead of Ultra. The problem with ultra settings, IMO, is that they're never optimized for at all. They're small upgrades for huge performance penalties.
 

cretin

Arcane
Douchebag!
Joined
Apr 20, 2019
Messages
1,497
I'm firmly convinced that lots of Ultra settings literally do nothing nowadays and are strictly there to give the people who paid $1000 for a GPU something to burn cycles on.

I think they usually make a visual difference for things like shadows and whatnot. I'm playing AC: Origins right now and shadows definitely pop in closer to the camera with Very High selected instead of Ultra. The problem with ultra settings, IMO, is that they're never optimized for at all. They're small upgrades for huge performance penalties.

I turned the shadows in Witcher 3 down from ultra to high and left everything else maxed out. Couldn't tell the visual difference for the life of me, but gained at minimum ~10 fps.
 

Viata

Arcane
Joined
Nov 11, 2014
Messages
9,893
Location
Water Play Catarinense
I have an old notebook with a shitty GPU (and CPU), and I have zero problems with gaming because I don't play shit modern games. :mixedemotions:
 
