What is even the point of 4k textures?

JarlFrank · I like Thief THIS much · Patron
Does anyone even notice the difference? Is it just masturbation over how good your PC is so you can brag about it? "Oh yeaaaah man look at how awesome my rig is it can run this game with 4K HD textures oaaaaaah!!"

Cause I don't see a difference, and I don't see the point. I actually hate overblown texture sizes as they just increase the install size of a game. Your game would usually be 20 gigs in size, but you added 4k textures and now it's 100 gigs? Fuck right off with that shit. A barely perceptible increase in texture quality is NOT worth 80 gigs of space!
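
For a rough sense of where those gigabytes go, here's a back-of-the-envelope sketch. It assumes uncompressed RGBA8 textures with full mip chains; real games use block compression such as BC7, which cuts sizes by roughly 4x, but the ratio between resolutions stays the same:

    # Back-of-the-envelope: uncompressed RGBA8 texture sizes with full mip chains.
    # Real games use block compression (roughly 1 byte per texel), so divide by ~4;
    # the 16x jump per texture from 1K to 4K is what blows up install sizes.
    def texture_bytes(side, bytes_per_texel=4):
        total = 0
        while side >= 1:
            total += side * side * bytes_per_texel
            side //= 2
        return total

    for side in (1024, 2048, 4096):
        print(f"{side}x{side}: ~{texture_bytes(side) / 2**20:.0f} MB per texture")
    # 1024x1024: ~5 MB, 2048x2048: ~21 MB, 4096x4096: ~85 MB --
    # every doubling of texture resolution roughly quadruples the footprint.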

Here's a comparison screenshot from two HD texture mods for Fallout 4. One is standard HD textures, and the other is super ultra 4k textures. Can you spot the difference?

[comparison screenshot: Fallout 4 standard HD vs. 4K texture mods]


I can't spot a difference. And even if I could, the difference would only be noticeable if you spend a lot of time looking at the textures very closely. Most of the time I don't look at textures very closely, I look at the environment around me, I look into the distance to see faraway dungeons I wanna explore, I frantically strafe and bunnyhop while shooting at enemies. During normal gameplay, I never come across a situation where I'd stop and marvel at how high res the textures are. In fact, texture resolution is one of the least important graphical elements of a game - stuff like LoD is much more important, for example.
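
As a side note on why the extra detail vanishes during play: the renderer already downscales via mipmapping whenever a texture covers fewer screen pixels than it has texels. A minimal sketch with made-up texel densities - real hardware derives this from per-pixel UV derivatives:

    import math

    # Minimal sketch of mip selection: the GPU samples the mip level where roughly
    # one texel maps to one screen pixel, so a distant or small object never shows
    # its full 4096x4096 texture in the first place. Illustrative numbers only.
    def effective_resolution(texture_size, texels_per_pixel_along_axis):
        mip = max(0, math.floor(math.log2(texels_per_pixel_along_axis)))
        return mip, texture_size >> mip

    for density in (1, 2, 4, 8):   # texels covered by one screen pixel, per axis
        mip, res = effective_resolution(4096, density)
        print(f"{density} texel(s) per pixel -> mip {mip}, effective {res}x{res}")
    # At 4 texels per pixel the renderer is already sampling an effective
    # 1024x1024 copy of that 4096x4096 texture; the rest is filtered away.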

At some point, you just get diminishing returns when it comes to texture sizes.
HD-quality textures are already peak. They look great and smooth, and even if you squint and put your face 1 cm from the screen you won't be able to count the individual pixels. Increasing the texture quality even further doesn't do anything - the change isn't noticeable to the human eye, and if it is, it's only barely so. Definitely not to a degree that makes the increase in hard drive space and RAM requirements worth it.

Have some more comparison shots and tell me if you would actually notice the difference during gameplay. This is an HD texture mod for Borderlands 2, a fast-paced looter shooter where you spend most of your time running and shooting. The game's pace rarely allows you to stop and smell the flowers:
[comparison screenshot: Borderlands 2 HD texture mod]

If I look closely here, I do notice a minor difference.
But would I notice that difference during gameplay? No, I'd be too busy running and jumping and shooting to notice.

Here's a comparison shot from a Far Cry Primal HD texture mod:
[comparison screenshot: Far Cry Primal HD texture mod]


WHAT IS THE DIFFERENCE??? I DON'T NOTICE A FUCKING DIFFERENCE!!

Kingdom Come Deliverance:

[comparison screenshot: Kingdom Come Deliverance HD texture mod]


NO SERIOUSLY THIS IS LITERALLY THE SAME PICTURE THERE IS NO DIFFERENCE!!

[another comparison screenshot]


NO THEY DON'T!! THEY FUCKING DON'T, BOTH SCREENSHOTS LOOK EXACTLY THE SAME, AAAAAAAAAAAAAAAAA
 
Joined Mar 18, 2009
I can see a tiny bit of difference when looking closely, but not enough to justify the additional tens of gigs of SSD space. I guess if you have a ton of free space that you don't need for anything else you might as well install them, otherwise - meh. You certainly won't notice it during gameplay, only if you stand there and stare at textures like a doofus.
 

LESS T_T · Arcane
It's not like I'm advocating 4K textures or anything (I'd rather that memory and processing power be used on more important things), but you're comparing 4K textures in 720p images. I think 4K textures are supposedly meant for 4K monitors. I don't know, because I don't have one. Also, it's probably not much extra work, because artists work on those textures at super high resolution anyway.

So as long as they're provided as an optional installation like KCD did, I don't care.
 

Burning Bridges
Ok, you have understood absolutely nothing about 3D graphics, have you?

To answer your goddamn question first: in general, every doubling of texture resolution lets you get roughly twice as close to an object before it gets pixelated or blurry.

Second, quality settings are not there to give you, the noob, a massive hardon by turning everything to ultra. The whole point of quality settings is to find the setting where you get the best picture at the lowest computational cost.

If you enable 4K textures and see no difference, just turn them back off. Who knows the reason - either the game engine or your resolution is shit, or there are too many goddamn shaders. But don't make a silly thread on rpgcodex where you try to generalize this problem.
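
A quick sanity check on the "twice as close" rule of thumb - a minimal sketch assuming a pinhole camera, a 90-degree horizontal FOV, a 1920-pixel-wide output and a 1 m wide wall facing the camera; all of those numbers are illustrative assumptions:

    import math

    # Distance below which a texture starts to magnify (one texel spans more than
    # one screen pixel) for a flat 1 m wide wall facing the camera.
    # Pixels the wall covers at distance d: screen_px * wall_m / (2 * d * tan(fov/2));
    # blur begins when that exceeds the texture width, so solve for d.
    def blur_distance(texture_width, wall_m=1.0, screen_px=1920, fov_deg=90.0):
        return (screen_px * wall_m) / (2 * texture_width * math.tan(math.radians(fov_deg) / 2))

    for tex in (1024, 2048, 4096):
        print(f"{tex}px texture: starts to blur closer than ~{blur_distance(tex):.2f} m")
    # 1024: ~0.94 m, 2048: ~0.47 m, 4096: ~0.23 m -- each doubling of texture
    # resolution does roughly halve the distance at which magnification blur kicks in.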
 

passerby · Arcane
If the resolution of the texture is higher than the resolution of the part of the rendered picture it occupies, then obviously there won't be any difference. If it's already perfectly sharp, there is no point in increasing the resolution further.

Of the screenshots you've posted, the KCD one is slightly sharper, and in the F4 one the ground is slightly sharper and more detailed, while the gun takes up too small a part of the frame for the texture to make a difference. Still, the difference is too subtle to be noticeable during gameplay, unless you stop playing and intensely stare at stuff close up.
 

Moaning_Clock · SmokeSomeFrogs · Developer
"We barely innovate in games so we need better graphics." AAA in a nutshell.

I can't see any difference except for Kingdom Come Deliverance but even there barely. The beard a tad sharper?

Texture Res almost always only a problem with older games but if it looks fine, who cares. On the other hand, maybe I see a difference in a couple of years when most people are on 4K-8K monitors.
 

JarlFrank · I like Thief THIS much · Patron
Burning Bridges said:
Ok, you have understood absolutely nothing about 3D graphics, have you? [...] If you enable 4K textures and see no difference, just turn them back off.

So a little bit of sharpness is worth tens of gigabytes of space for you?

I understand a lot about 3D graphics. I understand that this looks awesome even with low resolution:
[low-resolution screenshot]


HD wankery is overrated.
 

JarlFrank · I like Thief THIS much · Patron
My point is that normal HD is already super detailed. Anything beyond that is pointless graphics whoring to the detriment of the game - like having to put up with gargantuan install sizes for a minor increase in detail that you're not going to notice during normal gameplay anyway, unless for some reason you spend a lot of time staring at textures.

I still play 20 year old games with textures that don't go past 128x128, and they look perfectly fucking fine. They also don't take up hundreds of gigs of hard disk space.
 
Joined Sep 25, 2018
Well, higher-res textures may be needed for higher-res displays, but honestly, I think most of the graphical enhancements of the last few years are overkill that's barely noticeable during actual gameplay (sure, it may look pretty in a static screenshot, but who cares?). IMO the difference in quality between HD and 4K resolution is barely noticeable on regular 27-32 inch displays, and it certainly doesn't warrant the massive performance loss (realistically, 4K runs at 30 fps on anything but a super high-end machine, so what's the point of going back to a subpar 30 fps experience in exchange for a minor quality improvement?).

What I'm trying to say is that I'll take performance, smoother frame times, and no input lag over useless graphical enhancements any day.
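
For reference, the frame-time arithmetic behind the "smoother frame time" point - just the standard conversion from frame rate to milliseconds per frame:

    # Frame budget in milliseconds for common frame rates.
    for fps in (30, 60, 120, 144):
        print(f"{fps:4d} fps -> {1000 / fps:.1f} ms per frame")
    # 30 fps leaves 33.3 ms per frame vs. 16.7 ms at 60 fps, so a dropped frame
    # at 30 fps produces a hitch roughly twice as long as the same drop at 60 fps.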
 

Verylittlefishes · Sacro Bosco · Patron
The only point is to make your PC lag so they can sell you a new one. Also, modern devs are REALLY lazy with optimization - some 2014 games still show bad performance on 2020 hardware.

The only game I can recall where the visuals change dramatically on ultra is The Talos Principle, but it's a fucking puzzle game.
 
Joined Sep 25, 2018
RTX is obviously incline because in the future it may lead to real-time physically accurate rendering (if you've ever played with 3D software, you know how cool that can look). But the thing is, current-year RTX FUCKING SUCKS. It's more of a proof-of-concept implementation that barely makes a difference: it adds some reflections instead of completely simulating all the light in a scene (which is how 3D software raytracing renderers work, but they can spend minutes rendering a single frame to get good results), uses cheap tricks like only applying to select areas of the scene, uses an AI denoiser that fucks up the details, renders reflections at lower resolution, and requires you to blur the shit out of your game with DLSS to get close to a playable framerate.

Maybe in 10 years, when video cards are 10 times more powerful, we'll see actual real-time raytracing in games, but right now it's trash.
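
On why the denoiser is hard to avoid at real-time budgets - a rough sketch of how Monte Carlo noise scales with samples per pixel; the sample counts and frame budgets are illustrative assumptions, not benchmarks:

    import math

    # Monte Carlo path-tracing noise falls off as 1/sqrt(samples per pixel), so
    # halving the noise costs 4x the samples. A 60 fps budget (~16 ms per frame)
    # leaves games with on the order of 1-2 spp plus a denoiser; offline renderers
    # spend minutes per frame on hundreds or thousands of spp and converge without one.
    def relative_noise(spp):
        return 1.0 / math.sqrt(spp)

    for spp in (1, 2, 16, 256, 4096):
        print(f"{spp:5d} spp -> relative noise ~{relative_noise(spp):.3f}")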
 

Lyric Suite · Converting to Islam
You can't see 4K textures because you need the pixel density of a 4K screen to show any difference.

Not that it matters, because modern games all look like blurry garbage to me, even on ultra. It's very rare that a game manages to make all the post-processing stuff glue together in a way where the picture doesn't look like some vaseline-smeared shit.
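
Rough pixel-count arithmetic behind that point, using the standard display resolutions as assumptions - a single 4K texture holds more texels than an entire screen has pixels unless you're on a 4K display:

    # Texel budget of one 4K texture vs. whole-screen pixel counts.
    texture_texels = 4096 * 4096
    screens = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
    for name, pixels in screens.items():
        ratio = texture_texels / pixels
        print(f"{name}: {pixels / 1e6:.1f} MP screen, {ratio:.1f} texels per pixel "
              "even if one texture filled the entire screen")
    # 1080p: ~8.1, 1440p: ~4.6, 4K: ~2.0 -- on a 1080p display, a texture that
    # fills the whole screen still has ~8x more texels than there are pixels to show them.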
 

Lyric Suite · Converting to Islam
(quoting the post above)
RTX is obviously incline because in the future it may lead to real-time physically accurate rendering [...] Maybe in 10 years, when video cards are 10 times more powerful, we'll see actual real-time raytracing in games, but right now it's trash.

RTX is decline because they didn't solve the problem of its massive performance hit, and what's going to happen from now on is that devs won't bother using conventional techniques to give us the same effects, 'cause all they have to do is slap RTX effects on and fuck you if you don't have the hardware to run it.

Control is a prime example. There is NO reason for the game to look so bare with RTX off, other than the fact that they knew the game was going to be used as a showcase for RTX effects, so they didn't bother giving us a complete RTX-off alternative. Look at a game like Metro Exodus and you'll be hard pressed to see how RTX is worth it when the picture still looks "complete" even without it. Sure, RTX may look a little better, but not enough to justify the massive performance hit.

Well, I have a hunch that as time goes by, games are going to take the Control route instead, giving you half-assed, incomplete visuals and forcing you to rely on RTX to get a "finished" look. This will destroy high-frame-rate gaming, but it's not like anybody is going to give a shit, as 30 FPS is going to be the new normal once RTX moves to consoles.
 
