
CD Projekt's Cyberpunk 2077 Update 2.0 + Phantom Liberty Expansion Thread

Myobi

Liturgist
Joined
Feb 26, 2016
Messages
1,505
Kek, look at them Fallout 76 players still bitching about how much other video games suck :D It's like they finally stumbled upon some fucking standards.
 

Perkel

Arcane
Joined
Mar 28, 2014
Messages
16,268
I keep seeing these clips from this garbage and I can't stop wondering how the fuck they managed to make it run like shit. It takes real skill to make something that looks so 2016 yet demands a $3K desktop PC to run properly.

The problem is that you have the disease called "being blind". The reason it works this way is that it's the best-looking game that didn't compromise its graphics for the sake of consoles, unlike TW3. Moreover, it's false to say the game runs like shit on PC: even at low settings you get some of the best graphics around, and you can still play the game on old hardware.

It used to be normal back when PC games were not bastardized versions of console games and your best GPU would be crying after just 2 years. I still remember Unreal and how good it looked on my Voodoo2, and then 2 years later my Voodoo started to have issues.
 

Haplo

Prophet
Patron
Joined
Sep 14, 2016
Messages
6,561
Pillars of Eternity 2: Deadfire
I keep seeing these clips from this garbage and I can't stop wondering how the fuck they managed to make it run like shit. It takes real skill to make something that looks so 2016 yet demands a $3K desktop PC to run properly.
My computer is 6 years old. Cost me nowhere near 3k. Maybe 1k USD - back then.
Runs Cyberpunk just fine in 1920x1200 resolution and looks beautiful.
Served me well with Witcher 3. Serves me great with Cyberpunk.

As long as you don't need raytracing, 8K resolution or a 144Hz refresh rate, you really don't need expensive gear to enjoy this game.

EDIT: Like I mentioned below, my GFX card, a GTX 970, is in fact 6 years old. The rest is older.
 
Last edited:

Zer0wing

Cipher
Joined
Mar 22, 2017
Messages
2,607
I keep seeing these clips from this garbage and I can't stop wondering how the fuck they managed to make it run like shit. It takes real skill to make something that looks so 2016 yet demands a $3K desktop PC to run properly.
I take it the artists don't trust tessellation shaders and prefer to pump up geometric detail by hand instead of relying on some form of mapping, like parallax and displacement maps. That was the case with Twitcher 2, which runs like shit on 2010-2011 hardware, and it remains true here. Also, the built-in PhysX is shipped as-is and not fine-tuned for maximum performance like in Remedy's Control. And worse, you can't force PhysX to run on the Nvidia card; it's all calculated on the CPU.
Next is reflections. No, not specific techniques, reflections in general: in the Cyberpunk game engine, every surface is considered to be reflecting something, some form of light, either at very low resolution or with some form of interleaved/checkerboard rendering plus a software denoiser. Which bugs out on screen-space reflections and eats as much fps as ray-traced reflections. Either that, or emulating the ray coherency of reflected light is very hard for poles. Speaking of...
Ray tracing apologist fucks have it extra fine up their asses too, because BVH construction is fucked in this game, easily taking up 33% of the framerate for its own needs since it runs on the GPU shader cores.
 

Naraya

Arcane
Joined
Oct 19, 2014
Messages
1,664
Location
Tuono-Tabr
My computer is over 10 years old. Cost me nowhere near 3k. Maybe 1k USD - back then.
Runs Cyberpunk just fine in 1920x1200 resolution and looks beautiful.
Served me well with Witcher 3. Serves me great with Cyberpunk.

As long as you don't need raytracing, 8K resolution or a 144Hz refresh rate, you really don't need expensive gear to enjoy this game.
In general I agree - my own GTX 1060 with a trusty i5-3570k runs Cyberpunk "fine", if by fine you mean FullHD@~30fps.
 

T.Ashpool

Self-Ejected
Joined
Oct 19, 2020
Messages
270
Honestly I enjoyed the game and don't think they "need" a No Man's Sky-style overhaul to redeem it and repair their reputation, at least among PC players who are still their bread and butter. 2077 on consoles is about as bad as Skyrim was on the 360/PS3 and consolecucks still jizz their pants over that game. That being said, a lot of the work they'll have to do for Cyberpunk Online will have to be done for 2077 so I'm expecting the more systemic stuff to be patched in over time.

Also I'm under the impression the 2077 team will be split between Online's production and The Witcher 4's pre-production as currently the online team is quite small.

If they split the Cyberpunk 2077 team between Cyberpunk Online and The Witcher 4, who is going to make the Blood & Wine-style expansions for this game?
The only thing that makes sense to me is some sort of soft relaunch of the game once they re-release it on PSN and put out the next-gen versions. It might even get a new name, like Cyberpunk 2077 Enhanced Edition or something like that.
 

TemplarGR

Dumbfuck!
Dumbfuck Bethestard
Joined
May 30, 2013
Messages
5,815
Location
Cradle of Western Civilization
My computer is over 10 years old. Cost me nowhere near 3k. Maybe 1k USD - back then.
Runs Cyberpunk just fine in 1920x1200 resolution and looks beautiful.
Served me well with Witcher 3. Serves me great with Cyberpunk.

As long as you don't need raytracing, 8K resolution or a 144Hz refresh rate, you really don't need expensive gear to enjoy this game.

A 10-year-old PC costing 1k USD cannot run Cyberpunk 2077 at all today; you are lying. If any moron wants to believe you, he just needs to visit Wikipedia and find out what kind of hardware was available in 2010 or earlier in that price range. Stop lying, you fuck.
 

TemplarGR

Dumbfuck!
Dumbfuck Bethestard
Joined
May 30, 2013
Messages
5,815
Location
Cradle of Western Civilization
In general I agree - my own GTX 1060 with a trusty i5-3570k runs Cyberpunk "fine", if by fine you mean FullHD@~30fps.

You agree with his lies? Your GTX 1060 came out in 2016 IIRC and cost more than $500 back then, and your CPU+mobo+RAM would have cost around the same at the time, perhaps somewhat less. This moron you quoted claimed that a 10-year-old PC that cost 1k can run CP2077 at 1200p just fine. Which is a blatant lie; hell, GPU hardware from that time period doesn't even support DX12 at all.
 

Haplo

Prophet
Patron
Joined
Sep 14, 2016
Messages
6,561
Pillars of Eternity 2: Deadfire
My computer is over 10 years old. Cost me nowhere near 3k. Maybe 1k USD - back then.
Runs Cyberpunk just fine in 1920x1200 resolution and looks beautiful.
Served me well with Witcher 3. Serves me great with Cyberpunk.

As long as you don't need raytracing, 8K resolution or a 144Hz refresh rate, you really don't need expensive gear to enjoy this game.

A 10-year-old PC costing 1k USD cannot run Cyberpunk 2077 at all today; you are lying. If any moron wants to believe you, he just needs to visit Wikipedia and find out what kind of hardware was available in 2010 or earlier in that price range. Stop lying, you fuck.

Huh, you're right. I upgraded my GFX 6 years ago: a GeForce GTX 970, serves me well. About 300 USD. The rest of my computer IS much older.
Anyway, no problem running Cyberpunk on "moderate" settings for below 1k USD.
 
Last edited:

Haplo

Prophet
Patron
Joined
Sep 14, 2016
Messages
6,561
Pillars of Eternity 2: Deadfire
My computer is over 10 years old. Cost me nowhere near 3k. Maybe 1k USD - back then.
Runs Cyberpunk just fine in 1920x1200 resolution and looks beautiful.
Served me well with Witcher 3. Serves me great with Cyberpunk.

As long as you don't need raytracing, 8K resolution or a 144Hz refresh rate, you really don't need expensive gear to enjoy this game.
In general I agree - my own GTX 1060 with a trusty i5-3570k runs Cyberpunk "fine", if by fine you mean FullHD@~30fps.

Yep, I don't need 60+ fps in my games.
 

Robber Baron

Arbiter
Joined
Jun 15, 2020
Messages
1,013
Yep, I don't need 60+ fps in my games.

I don't even need 30 fps in my games


 

Gargaune

Arcane
Joined
Mar 12, 2020
Messages
3,634
My computer is over 10 years old. Cost me nowhere near 3k. Maybe 1k USD - back then.
Runs Cyberpunk just fine in 1920x1200 resolution and looks beautiful.
Served me well with Witcher 3. Serves me great with Cyberpunk.

As long as you don't need raytracing, 8K resolution or a 144Hz refresh rate, you really don't need expensive gear to enjoy this game.

A 10-year-old PC costing 1k USD cannot run Cyberpunk 2077 at all today; you are lying. If any moron wants to believe you, he just needs to visit Wikipedia and find out what kind of hardware was available in 2010 or earlier in that price range. Stop lying, you fuck.

Huh, you're right. I upgraded my GFX 6 years ago: a GeForce GTX 970, serves me well. About 300 USD. The rest of my computer IS much older.
Anyway, no problem running Cyberpunk on "moderate" settings for below 1k USD.
Huh... funny that, my old rig had just over ten years under its belt when I retired it in April of last year. Only in-life upgrades were swapping a second HDD for an SSD and a graphics card upgrade midway through, 2700k/8GB/970GTX in the end. I'm almost tempted to unpack it to see how it handles Cyberpunk, but there's no way I'm gonna bother with the cabling just for that.

Anyway, the general consensus seems to be that Cyberpunk "Full HD" experience looks great and runs well on the high-end gaming rigs from up to four years ago, and even the mid-range can still cut it. On newer platforms, raytracing is quite spectacular but it unsurprisingly comes at a considerable framerate cost. At the other end, your mileage may vary on older hardware, but that's quite regular in the world of PC graphics.
 

Lyric Suite

Converting to Islam
Joined
Mar 23, 2006
Messages
58,285
Anybody who accepts anything below 60 fps deserves to be shot in the head. I've been playing at 60 fps for 20 years and i've recently moved up to 144hz where even 60 fps look like shit to me now, there is no way in hell i'll get used to 30 fps just because modern devs AND modern hardware manifactures have turned into incompetent mongoloids.
 

Gargaune

Arcane
Joined
Mar 12, 2020
Messages
3,634
Anybody who accepts anything below 60 fps deserves to be shot in the head.
:negative:

I've been playing at 60 fps for 20 years and i've recently moved up to 144hz where even 60 fps look like shit to me now
I genuinely don't get this tune, is it an acquired taste, some sort of "you had to be there" moment? For decades, my target has been "above 40" and I can't really tell much of a difference beyond that, I'm just as comfortable at 50FPS as I am at 100. In fact, that's Cyberpunk right here, I was running Medium raytracing and realised I could spare the headroom for Ultra and still sit solidly in the 40-50 range even in busy scenes, so I did it. Is it something that I just can't see on my 10 year-old screen, or does it only come into play in competitive online twitch-shooters?

manifactures
Manufacturers.
 

GrainWetski

Arcane
Joined
Oct 17, 2012
Messages
5,366
I genuinely don't get this tune, is it an acquired taste, some sort of "you had to be there" moment? For decades, my target has been "above 40" and I can't really tell much of a difference beyond that, I'm just as comfortable at 50FPS as I am at 100. In fact, that's Cyberpunk right here, I was running Medium raytracing and realised I could spare the headroom for Ultra and still sit solidly in the 40-50 range even in busy scenes, so I did it. Is it something that I just can't see on my 10 year-old screen, or does it only come into play in competitive online twitch-shooters?
I could easily tell the difference between 60 and 100+ in CS 20+ years ago. You could actually 'feel it' while aiming. Literally anything below locked 100 would make it feel like you were in water or something.

Of course, now half the retards are using gamepads for FPS so they can't actually aim or hit anything anyway, so for them it doesn't matter since the game plays itself.
 

|NOVVAK|

Novice
Joined
Dec 12, 2020
Messages
28
I've talked with Gabe and he agreed to offer you an 85% discount on Steam (but only until the 5th of January). GOG agreed as well.

Now talk with Putin & Co., 'cause they restricted mobile payments for Steam. I can't buy anything with my mobile phone; I need to use a card, which I want to avoid.
This shit happened 4 days ago.
I've talked with Putin and he said that for buying games on western websites you are now being put on the list of possible western spies. To repent you need to complete CP2077 on a base PS4 and Xbox One while chanting "praise ATOM RPG!".
 

Yosharian

Arcane
Joined
May 28, 2018
Messages
10,446
Location
Grand Chien
Dunno if you guys have seen this yet, but basically there is an entire romance for male V and Judy that's fully voiced and acted, with a sex scene and dialogue that makes complete sense. This is not fake news; this is actual finished content that got removed from the game. I'll let everyone draw their own conclusions as to why.



20 minutes of dialogue/scenes
 

DeepOcean

Arcane
Joined
Nov 8, 2012
Messages
7,404
I keep seeing these clips from this garbage and I can't stop wondering how the fuck they managed to make it run like shit. It takes real skill to make something that looks so 2016 yet demands a $3K desktop PC to run properly.
Buy a Ryzen 3200G, overclock it and run it with two sticks of memory for dual channel, and lo and behold, you have an original PS4 on your hands. Man, CDPR rushed this release, but in reality they are in shit they can't easily escape from; they will really need to turn the potato game mode on to run on that kind of hardware. Some people say they managed to make Witcher 3 run on the Switch (my calculator has more processing power than the Switch), so maybe they will eventually achieve playable performance (playable performance for console peasants being 20 fps).
 

T.Ashpool

Self-Ejected
Joined
Oct 19, 2020
Messages
270
Dunno if you guys have seen this yet, but basically there is an entire romance for male V and Judy that's fully voiced and acted, with a sex scene and dialogue that makes complete sense. This is not fake news; this is actual finished content that got removed from the game. I'll let everyone draw their own conclusions as to why.



20 minutes of dialogue/scenes


Unbelievable, man. Why would they do this? The only reason I can think of is that making Judy bi would have meant more options for straight men than for gays/lesbians, and you can't have that in 2020.
Or this is just one additional last-second cut, but they won't be able to add the option via DLC now, because then you'd have a Kotaku article talking about gay erasure or something. Now we are left with one choice for the vast majority of their audience who roleplay as straight male V (Panam romance or no romance at all), but the ResetEra types hate CDPR and want to cancel them anyway. I've honestly never seen a company shoot themselves in the foot like this before. There must be some really retarded people making decisions over at CDPR.
 

Lyric Suite

Converting to Islam
Joined
Mar 23, 2006
Messages
58,285
Anybody who accepts anything below 60 fps deserves to be shot in the head.
:negative:

I've been playing at 60 fps for 20 years and i've recently moved up to 144hz where even 60 fps look like shit to me now
I genuinely don't get this tune, is it an acquired taste, some sort of "you had to be there" moment? For decades, my target has been "above 40" and I can't really tell much of a difference beyond that, I'm just as comfortable at 50FPS as I am at 100. In fact, that's Cyberpunk right here, I was running Medium raytracing and realised I could spare the headroom for Ultra and still sit solidly in the 40-50 range even in busy scenes, so I did it. Is it something that I just can't see on my 10 year-old screen, or does it only come into play in competitive online twitch-shooters?

manifactures
Manufacturers.

You need better eyes then, as the jump from 60 to 144 was quite evident to me, big enough to make me not want to go back if I can help it.

Meanwhile anything approaching sub-40 fps is shiiiiiiiiit.
 

flushfire

Augur
Joined
Jun 10, 2006
Messages
782
I genuinely don't get this tune, is it an acquired taste, some sort of "you had to be there" moment? For decades, my target has been "above 40" and I can't really tell much of a difference beyond that, I'm just as comfortable at 50FPS as I am at 100. In fact, that's Cyberpunk right here, I was running Medium raytracing and realised I could spare the headroom for Ultra and still sit solidly in the 40-50 range even in busy scenes, so I did it. Is it something that I just can't see on my 10 year-old screen, or does it only come into play in competitive online twitch-shooters?
Same experience, but it isn't common to find similar opinions, so I just think it must be me. I honestly cannot tell the difference between 60 and 75Hz, and although I can tell the difference with 144Hz, it's not as life-changing to me as others make it out to be. Maybe I just got so used to 60Hz from being a budget gamer almost all my life. Or maybe it's just the games I play. I actually regretted buying into the hype so much that I sold the 144Hz monitor for one with a nicer panel but a lower refresh rate.
 
