CD Projekt's CYBERPUNK 2077 Release Thread - The Next Gen Update

I didn't feel like waiting till 2023 for the next expansion, so I figured I'd start a second playthrough and go through some of the side content slowly. Started as a Nomad, ran over the sheriff and the guards with my car. Even though I hadn't even met Jackie at that point, he randomly spawned in my car and began spraying the locals. For some reason, the guards didn't open fire and just walked around yelling as I crashed into them one by one. Thought I'd found a pretty cool exploit when I picked up the rare loot they dropped, only to realize I'd have to be level 30 to use any of it.



Managed to reproduce it for posterity. Also, the driving mechanics have improved considerably as you can see.

Half a billion dollars spent for this.
 

Bad Sector

How many samples per pixel to match photogrammetry

Photogrammetry is irrelevant here; you are confusing two separate things. Photogrammetry is used to scan models, but the models can be (and most often are) lit by the engine.

or texture space shading

Texture space shading is orthogonal to deferred shading. You can have one, the other or both.

at >200meter?

Distance is irrelevant, as deferred shading computations are done in screen space, so as long as you have the correct values in the G-buffer you can perform shading on them (see the sketch at the end of this post).

That just happens to be available on demand?

Why on demand?

And why would that ever work on consumer cards?

What wouldn't? That stuff is already available and working on consumer cards.
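
A minimal, purely illustrative CPU-side sketch of that idea (the G-buffer layout and names here are invented, not taken from any real engine): the lighting loop only ever reads per-pixel G-buffer values, so how far away the original geometry was never enters into it.

```cpp
#include <vector>

// Hypothetical, simplified G-buffer texel -- what the geometry pass wrote out.
struct GBufferTexel {
    float albedo[3];
    float normal[3];   // world-space, normalized
};

// Minimal deferred lighting: a Lambert term from one directional light.
// Note that no object distance appears anywhere -- the shading loop only
// ever reads the per-pixel values the geometry pass left in the G-buffer.
std::vector<float> deferredLightingPass(const std::vector<GBufferTexel>& gbuffer,
                                        const float lightDir[3],   // normalized, towards the light
                                        const float lightColor[3])
{
    std::vector<float> outColor;               // RGB triplets, one per pixel
    outColor.reserve(gbuffer.size() * 3);
    for (const GBufferTexel& t : gbuffer) {
        float ndotl = t.normal[0] * lightDir[0] +
                      t.normal[1] * lightDir[1] +
                      t.normal[2] * lightDir[2];
        if (ndotl < 0.0f) ndotl = 0.0f;        // surface facing away from the light
        for (int c = 0; c < 3; ++c)
            outColor.push_back(t.albedo[c] * lightColor[c] * ndotl);
    }
    return outColor;
}
```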
 

tritosine2k

How many samples per pixel to match photogrammetry



What wouldn't? That stuff is already available and working on consumer cards.



Why is this a feature if it "works"? Photogrammetry doesn't shimmer either, and megatexture + streaming enables photogrammetry of, for example, a complete racetrack.


A Serbian guy made this in 5 days on a bicycle with a ghetto GoPro setup.
 

Bad Sector

Why is this a feature if it "works"? Photogrammetry doesn't shimmer either, and megatexture + streaming enables photogrammetry of, for example, a complete racetrack.

Dude, you are jumping from point to point without making any coherent connections between them, and then you add in unrelated stuff. Stop for a moment, read the entire discussion from the moment I replied (and what I replied to) up to this point, and try to realize that what you are writing makes little sense.

A Serbian guy made this in 5 days on a bicycle with a ghetto GoPro setup.

Sure, but this doesn't have much to do with anything discussed so far, except perhaps something that is in your head that you haven't bothered to put into clear and coherent writing.

My entire point with these replies was that I think it is very unlikely for UE6 specifically to go towards network streaming of baked virtual geometry, since pretty much everything in the industry nowadays moves more and more towards realtime processing: from the developers' own goals (working in realtime is faster even when making the game), to the hardware itself (HW-accelerated raytracing, etc.), to pretty much all the research focus around games (see how the presentations at GDC, etc. are increasingly about realtime approaches). Yes, you can do it, and yes, it is technically possible to scan 3D scenes, precalculate volumes/points/voxels/whatever with lighting, and stream data that requires very little processing power GPU-wise. This is nothing new; people did it even back in the 90s - it is how the QSplat algorithm for streaming and rendering point clouds was invented - but my point wasn't whether it is technically feasible, it was about what I think UE6 will do in the future, based not only on UE's but also the entire industry trajectory so far.
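
A rough, purely illustrative sketch of that kind of "bake it and stream it" pipeline (the chunk format and names below are invented, and have nothing to do with UE's or QSplat's actual internals): lighting is baked into the points offline, so the runtime only decides which chunks to fetch and splats whatever arrives.

```cpp
#include <cmath>
#include <string>
#include <vector>

// Invented payload format: a chunk of pre-lit point samples whose color was
// baked offline, so the client never runs any lighting code on them.
struct PreLitPoint { float pos[3]; unsigned char rgb[3]; };
struct ChunkInfo   { float center[3]; float radius; std::string url; };

// Decide which chunks to request this frame, based only on camera distance.
// The remaining runtime cost is fetching the data and splatting the points.
std::vector<std::string> chunksToStream(const std::vector<ChunkInfo>& chunks,
                                        const float camPos[3],
                                        float streamDistance)
{
    std::vector<std::string> wanted;
    for (const ChunkInfo& c : chunks) {
        float dx = c.center[0] - camPos[0];
        float dy = c.center[1] - camPos[1];
        float dz = c.center[2] - camPos[2];
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz) - c.radius;
        if (dist < streamDistance)
            wanted.push_back(c.url);           // hand these to the network layer
    }
    return wanted;
}
```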
 

tritosine2k

entire industry trajectory so far.

https://twitter.com/UnrealEngine/status/1521485488024166407

they put this out today, it's full of remote and streaming references

https://www.theregister.com/2022/05/03/intel_siru_innovations/?td=keepreading-btm
Intel acquires graphics tech biz founded by ex-AMD, Qualcomm engineers
The Twitter account for Intel's graphics team had a slightly less buzzword-filled description of the areas it expects help from Siru: mobility-as-a-Service, advanced driver assistance systems, gaming, and hyperscale datacenters, which are run by large companies like Meta and Amazon.

doesn't seem like they are hell bent on supplying you with the discrete graphics "fix". And that's no use anyway for the average consumer because this "temporal reconstruction" stuff was hardly ever a solution.
 

Bad Sector

they put this out today, it's full of remote and streaming references

There is no "remote" reference anywhere in the tweet or the linked page and the only reference to streaming is in a linked PDF about Google Stadia-like streaming and the existing (for UE4 even) plugin to stream data assets - from what i can tell - based on world position. Network streaming of data assets is old news for UE games.

doesn't seem like they are hell bent on supplying you with the discrete graphics "fix". And that's no use anyway for the average consumer because this "temporal reconstruction" stuff was hardly ever a solution.

What does that have to do with anything? What the hell does "temporal reconstruction" even have to do with anything in this discussion?

Dude, I give up, it is technically impossible to communicate with you.
 

Bad Sector

at this point I rate the probability that tritosine is a chat-bot made by Carmack at above 40%

I think AI-generated text makes more sense, e.g. this is from here (bold text is a bit of text tritosine2k wrote):

chatbot said:
RTX doesn't hit any sensible framerate at sensible resolution in unreal5 "thanks to" dynamic lighting and that's with cascaded shadows stuff so it's already sub-optimal even below 48 fps and not as smooth distance gradiation as RAGE. Not to mention temporal "reconstruction" .

I'd love to be proven wrong but there are games using AA (f.i. in Skyrim the sky appears to be rendered in AA even though no AA is applied)

So I agree but I'm saying that I doubt the game developers are trying to emulate that for a reason because it looks like shit. I can't tell if the shadow distance is caused by some strange lighting mode but I can assure you it's not what game developers are trying to do.

But I agree it looks shit. Because it's not what game developers are trying to do. Games are trying to render all the details, which results in a different lighting mode that looks like crap if you render all the details at a higher resolution.

So the next step is to try and emulate more details with different lighting modes. It's just a problem that most game developers are not able to come up with a more "advanced" lighting model that looks even better at lower resolutions.

or

How many samples per pixel to match photogrammetry or texture space shading at >200meter? That just happens to be available on demand? And why would that ever work on consumer cards? Why not just take them as is, given that the average consumer probably won't bother to change the brightness and contrast?

If you're asking about the textures on the objects themselves, then if you want a "perfect" texture, you are going to have to use the photogrammetry, because texture space is just a single representation, while the photogrammetry model represents it as a collection of separate samples. This can sometimes lead to slight variations that are not noticeable. For example, a texture pattern that repeats itself over all of the surfaces of a model, may end up appearing more blurry at some places than others. (this actually makes more sense than anything written so far by tritosine)

I don't think the average consumer has the skills or the tools required to do this. Many of us don't have time to even go through those extra steps, but if you want to do it correctly, you have to.

I can think of 3 levels of details:

1. No details (like the default in some games

(yes, it stops at 1 :-P though I think that is a character limit from the site)
 

tritosine2k

4444itjic.jpg

What does that have to do with anything? What the hell does "temporal reconstruction" even have to do with anything in this discussion?

Apparently, to you, neither shimmering nor temporal smear is a problem, and if that's so, I'm not quite sure what you have to do with the discussion either.
 

lukaszek

To whoever it might interest: I tested the monowire against tech weapon feats and couldn't see much difference.
However, there is this cyberware that heals you when you hit an enemy with a fully charged tech weapon. That one works with the monowire even when it's not at 100%. Paired with the wire being an AoE weapon, it can heal you nicely.
Get the poison feats and the poison wire mod and you will be doing quite nicely. Bleeding feats could come into play against bosses, I guess... not tested.
 

Gargaune

To whoever it might interest: I tested the monowire against tech weapon feats and couldn't see much difference.
However, there is this cyberware that heals you when you hit an enemy with a fully charged tech weapon. That one works with the monowire even when it's not at 100%. Paired with the wire being an AoE weapon, it can heal you nicely.
Get the poison feats and the poison wire mod and you will be doing quite nicely. Bleeding feats could come into play against bosses, I guess... not tested.
Ah, Cyberpunk 2077 build porn, featuring such prestigious perks as:

Killing an enemy using a yellow tech weapon that has the letter "n" in its name gives you a 7.34(6)% crit chance bonus on your next attack performed within 78 seconds on a target with ginger hair, but only if it's a leap year and it's raining outside. If the attack is a critical hit and you're standing on your head while clicking, each projectile deals an additional Xd6 lead poisoning damage, where X is the square root of the total number of dildos in a six metre radius.

Damn shame they took out the swimming perk, that pissed off a lot of min-maxers.
 

Perkel

Since I am rich now and got a 3080, I decided to test the RTX stuff in C77. I finished the game 3 times before on Ultra on my 1080, which could barely hold 50-60 fps at 1080p without RTX.

C77 with raytracing definitely hits quite different at times. Sometimes you see almost no change, but at other times it is a huge change. Even if you turn on Psycho, which adds ray-traced global illumination, there is still probe-based global illumination underneath, and the raytraced GI can't override that.

By far the biggest difference is in the highlights, especially when there is a huge area light source, and in how shadows behave. Here is a good example:

This is ULTRA:

L3q4t3e.jpg


And this is RTX with all bells and whistles:

xrGI3Z3.jpg


The first thing you notice is the complete lack of a shadow under the character; in the raytraced shot, that shadow grounds the character. The second thing is the lack of shadows coming from the clothes. The black shirt is completely shadowless, while in the raytraced version there is a distinct shadow cast by part of the jacket onto that black shirt.

Another scene:

This is the worst-case scenario for standard rasterization: lots of objects lit from multiple sides, with different light levels, and via area lights. In this scene the character is lit from the back, which is correct, but also receives yellow light from that light source in the upper-left corner. The problem here is that the sun is a directional light source with a very limited ability to cast shadows (otherwise it would murder performance), while that orange light is just a point light.

The scene looks weird because there is a distinct lack of correct shadows. The whole thing is weirdly lit and shadowed.

GvIckaJ.jpg


Now comes RT: suddenly the scene makes sense, because the shadows make sense.


G6cdlNO.jpg



Another scene. C77 cars have very intricate interior designs which look great.

cl06uSH.jpg


Then you take the RTX pill and suddenly see that it lacked something: self-shadowing and actual GI from the outside world:

QILUatz.jpg


Then again, there are some scenes in which the raytraced GI seems not to work at all, the built-in probe-based GI takes over, and you get a weirdly lit character.
 

Perkel

Here is another good one. Car interiors seem to be where raytracing "shines" the most.

In this interior everything looks ok, just like in any other game.

8O00DaL.jpg


Then you add RTX and suddenly see how different the original picture without RTX really is. The legs and the floor are shadowed, and the wheel is in the sun, which makes the wheel the centerpiece of the screenshot instead of just a part of it:

As7gD3V.png
 

Perkel

IMHO, when it comes to raytracing, it is not the reflections that are game-changing but how scenes are properly shadowed. That is what gives depth to a scene. The current rasterization methods are pretty good, but you can clearly see where they fail. The human brain doesn't care much about the quality of reflections as long as they roughly resemble what they are supposed to be. On the other hand, it can easily tell something is wrong with the depth of an image when the shadows are wrong.
 

tritosine2k

IMHO, when it comes to raytracing, it is not the reflections that are game-changing but how scenes are properly shadowed. That is what gives depth to a scene. The current rasterization methods are pretty good, but you can clearly see where they fail.
Nothing special and just as doable without RT; Crytek did "per object" stuff like this for cutscenes all the time.

7-Figure7-1.png

Soft irregular shadow mapping: fast, high-quality, and robust soft shadows
 

Perkel

Nothing special and just as doable without RT; Crytek did "per object" stuff like this for cutscenes all the time.

7-Figure7-1.png

Soft irregular shadow mapping: fast, high-quality, and robust soft shadows

Per-object shadows and self-shadowing aren't really anything new; like you said, Crysis did it years ago.

What I am talking about is mostly penumbra and umbra. The shadows in the paper you quoted are made with a point light, which is why they are possible in rasterization. The problem comes when the light source is not a point-like object but has a volume. In rasterization this is called an area light, but I don't really know of any game that managed to get proper "area shadows" for those area lights. They always produce shadows as if from a point-like source, or no shadow at all.

Diagram_of_umbra%2C_penumbra_%26_antumbra.png
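
The geometry behind that diagram is just similar triangles. A rough sketch of the penumbra width an area light produces (illustrative only, with made-up variable names; it is the same estimate PCSS-style soft shadow techniques use):

```cpp
#include <cstdio>

// Penumbra width from similar triangles: an area light of size lightSize,
// a blocker at distance dBlocker from the light, and a receiver at distance
// dReceiver behind it (dReceiver > dBlocker). A point light (lightSize = 0)
// gives zero penumbra, which is why plain shadowmapped shadows stay hard-edged.
float penumbraWidth(float lightSize, float dBlocker, float dReceiver)
{
    return lightSize * (dReceiver - dBlocker) / dBlocker;
}

int main()
{
    // Contact hardening falls out of the formula:
    // receiver just behind the blocker: 0.5 * (2.1 - 2.0) / 2.0 = 0.025 (sharp)
    std::printf("%f\n", penumbraWidth(0.5f, 2.0f, 2.1f));
    // receiver far behind the blocker:  0.5 * (6.0 - 2.0) / 2.0 = 1.0   (soft)
    std::printf("%f\n", penumbraWidth(0.5f, 2.0f, 6.0f));
    return 0;
}
```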
 

tritosine2k

It's done, and it's not a selling point; besides, doing it on the whole screen tanks perf regardless, RT or not:



So it's preferably still a per-object technique, with the scene-wide stuff streamed in / precomputed.
 

Bad Sector

What I am talking about is mostly penumbra and umbra.

It is possible to have that without raytracing (and a bunch of games already have it); e.g. here is a screenshot of my engine from a year ago, when I first added it:

Ueh5rN5.png


However, it has several drawbacks, the main one being that it only works with lights whose area can be expressed as a distance from the light source itself (e.g. sphere-shaped lights, where the light source is a point light and the distance is the sphere's radius). And of course it inherits all the usual drawbacks of shadowmapping in general, like aliasing, peter-panning, shadow acne, etc.
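
For reference, the usual shadowmap trick for that effect is a PCSS-style pass: search the shadowmap for blockers, estimate a penumbra from the light radius, then filter with that radius. A rough CPU-side sketch (purely illustrative; the toy shadowmap layout and names are invented, and this is not the code behind the screenshot above):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Toy shadowmap: a square grid of light-space depths (all assumed positive).
struct ShadowMap {
    int size;                  // width == height, in texels
    std::vector<float> depth;  // size * size depths as seen from the light
    float at(int x, int y) const {
        x = std::clamp(x, 0, size - 1);
        y = std::clamp(y, 0, size - 1);
        return depth[y * size + x];
    }
};

// PCSS-style soft shadow for one receiver sample already projected into light
// space (texel coords sx, sy and light-space depth receiverDepth). lightRadius
// is the light's size in texels -- the "area expressed as a distance from the
// light source" limitation mentioned above. Returns 0 (shadowed) .. 1 (lit).
float softShadow(const ShadowMap& sm, int sx, int sy,
                 float receiverDepth, int lightRadius)
{
    // 1) Blocker search: average depth of texels closer to the light.
    float blockerSum = 0.0f;
    int blockerCount = 0;
    for (int y = -lightRadius; y <= lightRadius; ++y)
        for (int x = -lightRadius; x <= lightRadius; ++x) {
            float d = sm.at(sx + x, sy + y);
            if (d < receiverDepth) { blockerSum += d; ++blockerCount; }
        }
    if (blockerCount == 0)
        return 1.0f;                                  // nothing occludes: fully lit
    float avgBlocker = blockerSum / blockerCount;

    // 2) Penumbra estimate from similar triangles (same formula as the
    //    penumbra/umbra sketch earlier in the thread).
    float penumbra = lightRadius * (receiverDepth - avgBlocker) / avgBlocker;
    int filterRadius = std::max(1, static_cast<int>(penumbra));

    // 3) PCF with that radius: fraction of samples not behind a blocker.
    int lit = 0, total = 0;
    for (int y = -filterRadius; y <= filterRadius; ++y)
        for (int x = -filterRadius; x <= filterRadius; ++x) {
            if (sm.at(sx + x, sy + y) >= receiverDepth) ++lit;
            ++total;
        }
    return static_cast<float>(lit) / static_cast<float>(total);
}
```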

IMO, while GI is nice, raytracing's biggest feature is shadows that actually work without the ghastly hacks shadowmapping requires.
 

Perkel

Yeah. If penumbra and umbra were easy, games would already have them. They are not, so naturally games don't have them.

Also, making one shadow correct is one thing; making all the shadows in a scene correct is a completely different matter.

Raytracing allows for a shitload of dynamic lights with correct shadows. It does have a heavy impact on scene performance, but the impact is even greater when you have normal, non-raytraced lights all trying to cast correct shadows.
 

Sir Crispy

For what it's worth, for every hand-picked screenshot you can post that shows a visual improvement from ray-traced shadows over rasterized ones in CP2077, I can show you a thousand screenshots that look virtually the same between the two (and I acknowledge that you already stipulated that on the last page).

The problem, currently, is that the tradeoff between situational visual improvement and in-game performance is far, far too large. To go from a sustained 120 fps with everything at Ultra other than RTX down to 75-80 fps just by turning RTX on is unacceptable. I don't care how good the scene looks; for such a heavily action-based game, to me, you're defeating the purpose of the game by giving up that much fluidity. There are certain areas of the game, such as the central part of the Japanese area, that can bring RTX down to 50 fps using a 10900K and a 3090 at 1440p!

So your point is well-received, but unless you look at CP2077 as nothing but a screenshot maker, what's the point? You'll likely never really notice the visual improvements other than in rare situations, all the time wondering why there's microstutter and you're getting slightly motion sick playing the actual game.
 

Yosharian

For what it's worth, the RTX in CP2077 looked more or less the same and wasn't worth the FPS cost at all, whereas when I tried it in Dying Light 2 (shit game) the difference was astounding and I couldn't play without it. The game still ran at over 100 FPS though, which is a minimum for me.
 

Bad Sector

Yeah. If penumbra and umbra were easy, games would already have them. They are not, so naturally games don't have them.

But there are games that already have shadows with penumbras everywhere, e.g. GTA V and Deus Ex: Mankind Divided. They aren't as easy to notice as raytracing's pixel perfect soft shadows because of overall shadowmap problems, but you do get that "contact hardening" effect that penumbras provide.
 

Kjaska

IMO the difference between RTX on and off is very significant and worth the performance hit. It's not as if the game is so difficult that you'd need the 120 fps. Besides, the hardware is going to catch up eventually, and you don't have to activate it until it does.

If it were some tech that required even more artists to create assets for the game, inflating production costs even further, I would be against it. But it's the opposite: it potentially automates away a lot of the manual work required for creating scenery.
 
