Starfield Pre-Release Thread [GAME RELEASED, GO TO NEW THREAD]

Joined
Feb 19, 2021
Messages
560



:timcain:
 
Joined
Aug 5, 2009
Messages
3,749
Location
Moo?
Was there any benefit to fully building a settlement in FO4? Or is it just pure autism?
Pure autism. In the Fallout 4 documentary they said it was a last-minute addition to the game, which explains why it felt so tacked on. They said the mode was created by a guy at some Game Jam event.

They did the same thing with companions in Fallout 3, and it borked the logic of the endgame sacrifice. Wonder what will end up being the last-minute addition this time?
 

Stavrophore

Most trustworthy slavic man
Patron
Vatnik
Joined
Aug 17, 2016
Messages
15,135
Location
don't identify with EU-NPC land
Strap Yourselves In
For those retards who think I'm wrong about texture size vs. texture resolution:

Standard Texture Sizes


Most graphics hardware requires that your texture images always be a size that is a power of two in each dimension. That means you can use any of the following choices for a texture size: 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, or so on (but unless you have a really high-end card, you’ll probably need to stop there).

The textures don’t usually have to be square: they don’t have to have the same size in both dimensions. But each dimension does usually have to be a power of two. So 64 × 128 is all right, for instance, or 512 × 32, or 256 × 256. But you can’t make a texture image that is 200 × 200 pixels, since 200 isn’t a power of two.

By default, Panda3D will automatically rescale any texture image down to the nearest smaller power of two when you read it from disk, so you usually don’t have to think about this–but your application will load faster if you scale your textures properly in the first place.
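A quick sketch of that sizing rule and the default round-down behaviour (the helper names here are just illustrative, not part of the Panda3D API):

```python
def is_power_of_two(n):
    # True for 1, 2, 4, 8, ...; classic bit trick
    return n > 0 and (n & (n - 1)) == 0

def round_down_pow2(n):
    # Largest power of two <= n, i.e. what the default rescale-on-load does
    return 1 << (n.bit_length() - 1)

print(is_power_of_two(256))   # True:  256 x 256 is a valid texture size
print(is_power_of_two(200))   # False: 200 x 200 is not
print(round_down_pow2(200))   # 128 -> a 200 x 200 image gets scaled down to 128 x 128
```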

If you would like Panda3D to rescale your images up to the next larger power of two instead of down to the next smaller power of two, use:
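Going by the linked Panda3D manual, the setting in question is the textures-power-2 config variable; a minimal sketch (assuming the standard PRC mechanism) looks like this:

```python
from panda3d.core import loadPrcFileData

# Ask Panda3D to scale non-power-of-two textures UP to the next power of two
# instead of down (see the linked manual page for the textures-power-2 variable)
loadPrcFileData("", "textures-power-2 up")
```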

https://docs.panda3d.org/1.10/python/programming/texturing/choosing-a-texture-size

https://gamedev.stackexchange.com/questions/64106/best-practices-of-texture-size

Texture size is the resolution of the texture, and it comes in powers of 2.

Generally speaking you are right: some textures also have an alpha channel, which can double the size they occupy on the hard drive, and then you add mip maps, which increase the size further. But yes, higher resolution textures take up more megabytes.
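A rough sketch of that size math, assuming an uncompressed RGBA texture and a full mip chain (which adds roughly a third on top):

```python
def texture_megabytes(width, height, bytes_per_pixel=4, mipmaps=True):
    # Uncompressed footprint; the mip chain adds smaller and smaller levels on top
    total, w, h = width * height * bytes_per_pixel, width, height
    if mipmaps:
        while w > 1 or h > 1:
            w, h = max(w // 2, 1), max(h // 2, 1)
            total += w * h * bytes_per_pixel
    return total / (1024 * 1024)

print(round(texture_megabytes(512, 512), 2))    # ~1.33 MB
print(round(texture_megabytes(2048, 2048), 2))  # ~21.33 MB -> higher resolution, more megabytes
```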
 

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,322
Location
In the ether
Strap Yourselves In Codex Year of the Donut
Texture resolution and screen resolution are completely independent. Texture resolution is controlled by your ingame texture settings, not the resolution you play at.
Just google "does texture resolution depend on screen resolution" and you'll find hundreds of threads that will explain it. I linked the DLSS docs; please show where they talk about changing the texture resolution.

You are not a learning animal. I said that they aren't linked, except in the case of the actual generation of the screen resolution that is then upscaled. You are now creating a strawman to argue with.

If you can actually point to where I said that texture resolution is the same as screen resolution, let me know. Otherwise shut the fuck up and sit down. I'm tired of retards like you who run your mouths without actually listening.

Do you even know how graphics cards fucking work? Of course not, because you're a fucking retard and can't argue with what I actually said, ya stupid git.
I said that they aren't linked, except in the case of the actual generation of the screen resolution that is then upscaled.
Then link some docs, cause I haven't read anywhere about FSR/DLSS doing that.

How does frame generation work?

It takes the objects in the scene, applies the materials, i.e. textures of a specified size, and outputs it all through the GPU. FSR/DLSS use lower resolution textures to improve performance by slashing the image resolution. It slashes the texture resolution as a byproduct, then upscales it.

FSR, like all upscaling solutions, lowers the render resolution of the game to significantly improve performance then upscales the lower resolution input back to your target resolution, using its cutting-edge algorithm to improve the super resolution output to up to near-native resolution image quality (of note, FSR does require developer integration into a game to work).

FSR QUALITY MODE  | SCALE FACTOR        | INPUT RESOLUTION FOR 1440P FSR | INPUT RESOLUTION FOR 4K FSR
“Ultra Quality”   | 1.3X per dimension  | 1970 x 1108                    | 2954 x 1662
“Quality”         | 1.5X per dimension  | 1706 x 960                     | 2560 x 1440
“Balanced”        | 1.7X per dimension  | 1506 x 847                     | 2259 x 1270
“Performance”     | 2.0X per dimension  | 1280 x 720                     | 1920 x 1080
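A quick sanity check of that table, assuming the input resolution is just the target resolution divided by the scale factor per dimension (AMD's published numbers round a pixel differently in a couple of spots):

```python
# Recompute the FSR input resolutions from the per-dimension scale factors
targets = {"1440p": (2560, 1440), "4K": (3840, 2160)}
modes = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

for mode, factor in modes.items():
    row = ", ".join(f"{name}: {round(w / factor)} x {round(h / factor)}"
                    for name, (w, h) in targets.items())
    print(f"{mode} ({factor}x): {row}")
# Performance mode at 1440p works out to 1280 x 720: a quarter of the pixels
# get rendered, then upscaled back to the target resolution.
```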

So what I said is correct and what you said is correct. To make a 1280 x 720 image you need to use lower resolution textures on the objects. That's why everything is jagged and needs to be smoothed. Now you can go on the retard list.
 

Stavrophore

Most trustworthy slavic man
Patron
Vatnik
Joined
Aug 17, 2016
Messages
15,135
Location
don't identify with EU-NPC land
Strap Yourselves In
I keep telling people that modern developers are so retarded that they have to rely upon early 2000s texture sizes to get decent performance by forcing upscaling. Think about that for a second. They can't get the performance from a modern GPU unless they use texture sizes from the early 2000s when early 3D cards were much weaker but still had superior performance.
You mean resolution my dude, not texture sizes.
No, FSR and DLSS both require texture sizes that are roughly what we had back in the 2000s, which are then upscaled to the proper resolution. In other words, the textures are like 240x240 and upscaled to 1080p and higher. Combine that with the use of fake frames that they need to get actual performance.

Texture sizes are based upon the resolution of the texture by the way. ;)

You are wrong here, textures are not smaller when using DLSS or FSR. The game just renders at a lower resolution, but the textures stay the same; that's a property of the mesh/model and its UV mapping, which stay the same. It wouldn't make the slightest sense to feed DLSS/FSR low-res textures, especially since the geometry of the game stays the same.

Have you ever 3D modelled something for a game? Done texture work? You would know that what you are saying makes no sense.
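For what it's worth, a toy sketch of how both posts fit together: the texture assets don't change under DLSS/FSR, but rendering at a lower resolution makes the GPU sample coarser mip levels of the same texture (simplified trilinear LOD math, not any vendor's actual code):

```python
import math

def mip_level(texture_texels, object_screen_pixels):
    # Simplified LOD selection: log2 of how many texels map onto one screen pixel
    return max(0.0, math.log2(texture_texels / object_screen_pixels))

# A 2048-texel-wide texture on an object ~512 px wide at native resolution:
print(mip_level(2048, 512))   # 2.0 -> samples the 512-wide mip level
# Same object when the render resolution is halved per dimension (e.g. FSR Performance):
print(mip_level(2048, 256))   # 3.0 -> samples the 256-wide mip; the texture file itself is unchanged
```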
 

Takamori

Learned
Joined
Apr 17, 2020
Messages
928
After reading a little, I decided to watch the mines explode from a distance. Enjoy, mine canaries, hope your flight is enjoyable :hero:
 

Senntinel

Novice
Joined
Sep 19, 2014
Messages
11

How does frame generation work?

It takes the objects in the scene, applies the materials, i.e. textures of a specified size, and outputs it all through the GPU. FSR/DLSS use lower resolution textures to improve performance by slashing the image resolution. It slashes the texture resolution as a byproduct, then upscales it.

FSR, like all upscaling solutions, lowers the render resolution of the game to significantly improve performance then upscales the lower resolution input back to your target resolution, using its cutting-edge algorithm to improve the super resolution output to up to near-native resolution image quality (of note, FSR does require developer integration into a game to work).

FSR QUALITY MODE  | SCALE FACTOR        | INPUT RESOLUTION FOR 1440P FSR | INPUT RESOLUTION FOR 4K FSR
“Ultra Quality”   | 1.3X per dimension  | 1970 x 1108                    | 2954 x 1662
“Quality”         | 1.5X per dimension  | 1706 x 960                     | 2560 x 1440
“Balanced”        | 1.7X per dimension  | 1506 x 847                     | 2259 x 1270
“Performance”     | 2.0X per dimension  | 1280 x 720                     | 1920 x 1080

So what I said is correct and what you said is correct. To make a 1280 x 720 image you need to use lower resolution textures on the objects. That's why everything is jagged and needs to be smoothed. Now you can go on the retard list.
I give up, he can't be saved.....
Maybe some guys with the dev tag can enlighten him.
 
Last edited:

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,322
Location
In the ether
Strap Yourselves In Codex Year of the Donut
I keep telling people that modern developers are so retarded that they have to rely upon early 2000s texture sizes to get decent performance by forcing upscaling. Think about that for a second. They can't get the performance from a modern GPU unless they use texture sizes from the early 2000s when early 3D cards were much weaker but still had superior performance.
You mean resolution my dude, not texture sizes.
No, FSR and DLSS both require texture sizes that are roughly what we had back in the 2000s, which are then upscaled to the proper resolution. In other words, the textures are like 240x240 and upscaled to 1080p and higher. Combine that with the use of fake frames that they need to get actual performance.

Texture sizes are based upon the resolution of the texture by the way. ;)

You are wrong here, textures are not smaller when using DLSS or FSR. The game just renders at a lower resolution, but the textures stay the same; that's a property of the mesh/model and its UV mapping, which stay the same. It wouldn't make the slightest sense to feed DLSS/FSR low-res textures, especially since the geometry of the game stays the same.

Have you ever 3D modelled something for a game? Done texture work? You would know that what you are saying makes no sense.

Incorrect, the game renders the scene at the specified resolution using textures designed for that resolution, and it is then upscaled. That's what I've been saying.

The entire thing has been that way since the beginning of graphics and there is no fucking way that they're going to change how it's done for rendering a frame. It's nearly 70 years' worth of processes that are built into everything that is used today. Most games that were 1280x720, i.e. early 2000s, only had texture resolutions of 512x512 if you were lucky. Modern games use texture resolutions anywhere between 1,024 and 3,884 to make the frames.

Lower texture resolution =/= more jaggies on the edges of objects. Higher texture resolution =/= less jaggies on the edges of objects.

That's why anti-aliasing was created to deal with the jaggies on low resolution textures and is still used today.
 

Darkwind

Augur
Patron
Joined
Aug 1, 2019
Messages
629
Strap Yourselves In Codex Year of the Donut Codex+ Now Streaming! Enjoy the Revolution! Another revolution around the sun that is.
By the way, I didn't see a single white NPC. Pure Negros. Is this future earth?

All of their pre-release advertisement is 90% negroid and homo/lesbian. Most NPC's appear to be negro or mutt. Yes, that is what Bethesda envisions as the future. If that were true, then I don't think we would be in outer space.

If you are observant of current year trends, I would say that is 100% accurate. We are heading quickly towards:

 

Stavrophore

Most trustworthy slavic man
Patron
Vatnik
Joined
Aug 17, 2016
Messages
15,135
Location
don't identify with EU-NPC land
Strap Yourselves In
I keep telling people that modern developers are so retarded that they have to rely upon early 2000s texture sizes to get decent performance by forcing upscaling. Think about that for a second. They can't get the performance from a modern GPU unless they use texture sizes from the early 2000s when early 3D cards were much weaker but still had superior performance.
You mean resolution my dude, not texture sizes.
No, FSR and DLSS both require texture sizes that are roughly what we had back in the 2000s, which are then upscaled to the proper resolution. In other words, the textures are like 240x240 and upscaled to 1080p and higher. Combine that with the use of fake frames that they need to get actual performance.

Texture sizes are based upon the resolution of the texture by the way. ;)

You are wrong here, textures are not smaller when using DLSS or FSR. The game just renders at a lower resolution, but the textures stay the same; that's a property of the mesh/model and its UV mapping, which stay the same. It wouldn't make the slightest sense to feed DLSS/FSR low-res textures, especially since the geometry of the game stays the same.

Have you ever 3D modelled something for a game? Done texture work? You would know that what you are saying makes no sense.

Incorrect, the game renders the scene at the specified resolution using textures designed for that resolution, and it is then upscaled. That's what I've been saying.

The entire thing has been that way since the beginning of graphics and there is no fucking way that they're going to change how it's done for rendering a frame. It's nearly 70 years' worth of processes that are built into everything that is used today. Most games that were 1280x720, i.e. early 2000s, only had texture resolutions of 512x512 if you were lucky. Modern games use texture resolutions anywhere between 1,024 and 3,884 to make the frames.

Lower texture resolution =/= more jaggies on the edges of objects. Higher texture resolution =/= less jaggies on the edges of objects.

That's why anti-aliasing was created to deal with the jaggies on low resolution textures and is still used today.

I don't know how you tied antialiasing to texture resolution, but hats off. MSAA in older games worked by rendering the edges of geometry at a higher resolution, hence you had far fewer jaggies. It was computationally expensive because of that; textures aren't.
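A rough illustration of why the multisampling was the expensive part (assuming a simple colour + depth framebuffer; the numbers are ballpark, not from any particular GPU):

```python
def framebuffer_megabytes(width, height, samples=1, bytes_per_sample=8):
    # ~4 bytes colour + 4 bytes depth per sample, times the MSAA sample count
    return width * height * samples * bytes_per_sample / (1024 * 1024)

print(round(framebuffer_megabytes(1920, 1080, samples=1), 1))  # ~15.8 MB without MSAA
print(round(framebuffer_megabytes(1920, 1080, samples=4), 1))  # ~63.3 MB with 4x MSAA
# The multisampled buffer (plus the extra bandwidth and resolve pass) scales with
# the sample count, while the textures in memory stay exactly the same size.
```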

Anyway, why do I even bother when I know you can't take the L.
 

Jenkem

その目、だれの目?
Patron
Vatnik
Joined
Nov 30, 2016
Messages
9,127
Location
An oasis of love and friendship.
Make the Codex Great Again! Steve gets a Kidney but I don't even get a tag. I helped put crap in Monomyth
Christ how bad must the game be to get a 7 from IGN, even Diablo 4 got a 9!

100%. A 7/10 from IGN is a 5/10 for non-shill & non-normies. Given the Bethesda name and IGN's industry ties you know this is a turd from that alone.

IGN are a bunch of snoys, so we know why they are giving it a 7. Quite telling how all the non-American IGNs gave it 9s but only IGN USA gave it a 7, while cyberpoz and balders gayt 3 got 10s...
Outer Worlds was rated higher too... hmm, almost like there's a pattern here.
must not be woke enough
 

mkultra

Augur
Joined
Feb 27, 2012
Messages
493
I keep telling people that modern developers are so retarded that they have to rely upon early 2000s texture sizes to get decent performance by forcing upscaling. Think about that for a second. They can't get the performance from a modern GPU unless they use texture sizes from the early 2000s when early 3D cards were much weaker but still had superior performance.
You mean resolution my dude, not texture sizes.
No, FSR and DLSS both require texture sizes that are roughly what we had back in the 2000s, which are then upscaled to the proper resolution. In other words, the textures are like 240x240 and upscaled to 1080p and higher. Combine that with the use of fake frames that they need to get actual performance.

Texture sizes are based upon the resolution of the texture by the way. ;)

You are wrong here, textures are not smaller when using DLSS or FSR. The game just renders at a lower resolution, but the textures stay the same; that's a property of the mesh/model and its UV mapping, which stay the same. It wouldn't make the slightest sense to feed DLSS/FSR low-res textures, especially since the geometry of the game stays the same.

Have you ever 3D modelled something for a game? Done texture work? You would know that what you are saying makes no sense.

Incorrect, the game renders the scene at the specified resolution using textures designed for that resolution, and it is then upscaled. That's what I've been saying.

The entire thing has been that way since the beginning of graphics and there is no fucking way that they're going to change how it's done for rendering a frame. It's nearly 70 years' worth of processes that are built into everything that is used today. Most games that were 1280x720, i.e. early 2000s, only had texture resolutions of 512x512 if you were lucky. Modern games use texture resolutions anywhere between 1,024 and 3,884 to make the frames.

Yeah, if lucky perhaps; more like 128x128 and 256x256. System Shock 2 (1999) has a lot of 128x128 textures, for example, and it's not rare to see them in KOTOR (2003) either. I've done many texture mods for both.

512x512 is not especially low and it's used to this day in games. It totally depends on what 3D object it is and what kind of UV-mapping it has (tiling or not tiling). You can absolutely use 128x128 textures in a modern game, e.g. for a coin texture.
It's kind of useless to talk about unless someone has a full understanding of UV-mapping and 3D modelling.
 

Stavrophore

Most trustworthy slavic man
Patron
Vatnik
Joined
Aug 17, 2016
Messages
15,135
Location
don't identify with EU-NPC land
Strap Yourselves In
Texture resolutions of modern games are increasingly higher because:
1. Bigger storage
2. More complex geometry/models
3. Higher resolution displays
4. It just looks nicer when you literally put your face right up against it ingame

Depending on the object size and UV mapping you use different sized textures. A rock mesh will need a bigger texture unless it's tiled, while a small thing like a needle will need a very small one. This is solely because of how much screen space the object can take up when you look at it from typical ingame distances.
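A back-of-the-envelope version of that rule of thumb, assuming you aim for roughly one texel per screen pixel at the closest typical viewing distance (the helpers are purely illustrative):

```python
def next_pow2(n):
    # Smallest power of two >= n
    p = 1
    while p < n:
        p *= 2
    return p

def suggested_texture_size(max_screen_pixels):
    # Aim for ~1 texel per on-screen pixel at the closest typical viewing distance
    return next_pow2(max_screen_pixels)

print(suggested_texture_size(900))  # a rock filling ~900 px across -> 1024 texture
print(suggested_texture_size(40))   # a needle covering ~40 px      -> 64 texture
```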

Idk why I even explain things that should be obvious to every person interested in gaming or computer graphics, and I expect most people here are interested in these topics.
 

Jenkem

その目、だれの目?
Patron
Vatnik
Joined
Nov 30, 2016
Messages
9,127
Location
An oasis of love and friendship.
Make the Codex Great Again! Steve gets a Kidney but I don't even get a tag. I helped put crap in Monomyth
Christ how bad must the game be to get a 7 from IGN, even Diablo 4 got a 9!
The guy that did the IGN review gave Prey 5/10.

He also gave Alien: Isolation a low score and then admitted after publishing the review that he didn't realize you could SPRINT in the game while he was complaining about "not being able to escape the xenomorph" or some shit.
 
