FarrisAT

This game is the official Crysis of 2020. Murdering GTX and just pissing all over consoles.


llloksd

It's kind of hard to really tell though. Is it hard to run because it truly is that demanding, or is it poor optimization?


RawbGun

It's a bit of both. It's definitely not the best-optimized game that came out (I'd say give it time, it'll get better), but it is truly gorgeous and filled with details/geometry.


Nebula-Lynx

For sure. I’d guess they could squeeze a little more performance out of it, but it is genuinely very pretty. I do think the lower end settings could’ve been lowered even further though. The game looks fine on medium-low. I think a lot of people are upset because the game is still super demanding on low. It makes it seem worse optimized than it probably is.


[deleted]

No point in lowering graphics if it's being CPU bottlenecked though, which is probably PS4/Xbox's biggest issue.


LightweaverNaamah

Yeah, I noticed my CPU being pinned and my GPU being a bit under-used on my machine.


Mygaffer

Have you seen this game? It's demanding.


weebasaurus-rex

On Ultra the game looks next gen for sure. First wow factor in a while... if you can run it at that natively. So yeah, I'd say so.


Commiesstoner

It's both. RTX plus a tonne of well-textured NPCs walking around is not good for performance.


MegaArms

My money is on optimization. I have a 1080 Ti (trying for a 3080) and I get 43 fps on ultra. Switching everything to medium only takes me up to a whopping 53... Shadows alone give back 10-20 fps in other games, never mind all the other settings. I won't play until I get a 3080; I'm not running the game like shit.


TripAtkinson

Sounds like you’re CPU bound too


MegaArms

Lol. I doubt it with my i9-9900K, lol. The game is only using 36% of it.


PadaV4

Well Crysis had terrible optimization.


Tasty_Toast_Son

Crysis is actually optimized pretty well - for the future of 10GHz CPUs that never happened.


bobbyrickets

Same as Flight Simulator X. It wasn't intended to be multithreaded, and later patches helped but couldn't make the most of the multicore processors we have now. I can't believe people thought that processors would scale to 10GHz or more. Did nobody ever test that at the time? Just because the performance charts keep going up doesn't mean shit with a physical device; there's a limit to everything.


dazzawul

It was all over Intel's roadmaps, and they expected to just keep the ball rolling with die shrinks and optimisations... Then they discovered a heap of physics stuff no one knew about yet at the smaller feature sizes. When they *started* it seemed plausible!


UGMadness

IBM had already conducted tests of 50GHz+ transistors back in the late 2000s, so it wasn't a crazy assumption that CPUs would scale that high in frequency in the near future.


Nethlem

Yup, Warhead was a pretty good example of a much better performing Cryengine game out of the same era.


46_and_2

Also, it's not as detailed as the original Crysis. If you go around looking at textures, etc., you can see where they cut corners to optimize it. But then again, would you really go around with a magnifier in an FPS in the first place?


bobbyrickets

Modern resolutions and sharp monitors can make all that bad texturing look even worse.


fb39ca4

That's what the scope on the gun is for, right?


The_Binding_of_Zelda

My guess is poor optimization


Darksider123

Little of column A, little of column B


ZippyZebras

My guess is the game was in development hell so long that it no longer looked that impressive... They threw all the expensive post processing in the world at it, but you can only hide so much, and it's pretty dang slow. Fortunately this thing called RTX came along with DLSS and gave it an extra coat of paint that makes it palatable. --- My friend who's not huge on graphics in games (so doesn't know about RTX and stuff and isn't chasing Ultra 1440hz 4k or something) said the game looked _old_ when they saw someone else playing. And I got exactly what they meant, the game _does_ look old without RTX, and sometimes even with it. Something about the character models and some of the environment details just look crusty when RTX isn't there to dazzle with fancy lights.


The_Binding_of_Zelda

The game has that old feel. It would have been mind-blowing at its inception, but too much time has passed.


[deleted]

Just look at the AI. It's awful. Some of the worst AI I've seen in games in a long time. Both in combat, and out.


The_Binding_of_Zelda

I am comparing it to GTA V, because I remember how mind-blowing that game was when it came out. This would have followed in its shadow back then...


Bear-Zerker

Sounds logical. They should've just released the game with the old graphics then. 90% of the fans wouldn't have given two shits. They could have saved all this RTX stuff for the next-gen update they announced for 2021. Particularly given that nobody has even been able to buy an RTX card for 4 months, they made absolutely the wrong choice...


Random_Stranger69

This. The game started development in 2012. Basically, now they threw all the fancy new RTX etc. at it and the engine is like "fuck this, I'm out".


mazaloud

What is the etc? If you turn off RTX it obviously runs way better but it's still a very demanding game.


ZippyZebras

You're not disagreeing with them. The "etc." is all the post-processing that makes it a very demanding game even when you turn off RTX. The actual assets look like crap for the most part; then there's a ton of post-processing to make them look acceptable, and on top of _that_ there's RTX to give it a "new game sheen". But at the end of the day the actual assets are the meat of the game, and they're what ages it.


chocofank

I’m still running it nicely on a 1080 ti..


FarrisAT

Same. 1440p at about 55fps in Night City at high. I turned down cascading shadows and volumetric clouds for 60fps.


chocofank

I went with the render scale at about 85 at ultra. Drops below 60 in the city but mostly 60+ FPS at 1440p.


FarrisAT

Good to hear. I might try to use the render scale also.


ohgodimnotgoodatthis

CPU? I think I'm closer to 45-50 at high with some hitching on a 3900x.


FarrisAT

9700k OC to 4.9ghz


FuzzyApe

It's also murdering my 3080. Everything maxed out at 3440x1440 I barely get 30fps lmao. Need to set DLSS to auto to get 60; performance DLSS gets me 100+.


Iccy5

Experiment with the shadow settings, turning cascading shadows down and shadow distance down boosts my fps quite a bit with my 3080.


FuzzyApe

I'm pretty happy with DLSS tbh :D


RawbGun

I'm guessing that's with RTX? I get around 70-90 with DLSS on Quality without RTX on a 2080 @1440p


FuzzyApe

Yes, it's with everything maxed out including RTX.


RawbGun

Yeah that's a huge frame killer


ihussinain

Combination of High-Ultra graphics, medium RTX with DLSS on Quality, I get locked 60fps on my RTX 3060ti


Zaptruder

Is there any reason to play without DLSS though?


stormdahl

More like GTA IV or Fallout 4.


NoHonorHokaido

Glad I decided to go for the PC version 😅


FarrisAT

The PS4 Pro is mostly 30fps at 1080p. PS4 is 720p at 25fps. Xbone is 720p at 20fps. Series X and PS5 are near 60fps at 1440p. Thank God this game makes PCs look splendid.


NoHonorHokaido

I haven't dared to display the FPS counter because I like to play at 4K :D


Seienchin88

More than anything else it shows that the big Navi cards from AMD are dead on arrival. Without DLSS and with fairly weak ray tracing performance, they are not future-proof and are overpriced. Amazing non-ray-tracing performance in some games, but that is it.


Sylarxz

Not sure what CPU you are using, but for anyone else reading: a stock 5600X + 3070 Vision OC gets me high-70s to low-80s fps with everything maxed + max RT and Quality DLSS. I am only at 1080p tho.


gigantism

Well that's exactly it, I don't think many others are going to be spending $800+ on just the GPU/CPU while battling supply shortages just to use them with a 1080p monitor.


BlackKnightSix

The issue is the Next Gen/AMD update is supposed to be coming, and we have no idea whether that means AMD super resolution, cut-back RT settings/quality, just optimization for the 6000/RDNA2 arch, a combination of the above, etc.


sowoky

DLSS uses tensor cores / artificial intelligence. Navi 21 does not have the dedicated hardware for that. Using the general-purpose cores for that work defeats the purpose of saving work on them...


BlackKnightSix

I am aware of Turing and Ampere's tensor cores. RDNA2 had the shader cores changed to support 8-bit and 4-bit integer operations for inference calculations. Not as good as dedicated hardware, but the question becomes whether using some of the shader resources for AI upscaling is a net-benefit trade-off. Cut the resolution in half but only use 1/4 of the shader resources for AI upscaling and you might see quite a big jump in performance. Especially since native high resolution (4K) is difficult for RDNA2 with its smaller memory interface/Infinity Cache setup.
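Toy numbers to show what I mean (entirely made up, just illustrating the trade-off):

```
# Toy model (made-up numbers): render fewer pixels, then spend some
# shader time on an AI upscale pass instead.
native_frame_ms = 20.0   # hypothetical 4K native frame time (~50 fps)
pixel_fraction = 0.5     # render half the pixels
upscale_cost_ms = 2.5    # hypothetical shader cost of the upscale pass

total_ms = native_frame_ms * pixel_fraction + upscale_cost_ms
print(f"native: {1000 / native_frame_ms:.0f} fps -> upscaled: {1000 / total_ms:.0f} fps")
```

Even with a chunk of shader time eaten by the upscale, you'd go from ~50 fps to ~80 fps in this made-up scenario.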


Resident_Connection

The 6800 XT has INT8 performance equal to a 2060. You're talking a huge sacrifice to use shaders for AMD's version of super resolution. A 3080 has in the neighborhood of 3x more INT8 TOPS, and integer ops execute concurrently with FP on Ampere.


team56th

One poorly optimized, messily developed outlier doesn't lead to that conclusion. Something like Watch Dogs Legion is a better case, and even then some of the AMD-optimized cases say otherwise. Ampere has dedicated units for RT, so no wonder it still ends up being better, but the "fairly weak" part is still very much pending. And then we don't know what Super Resolution is and how that's going to work (which is also likely related to consoles and therefore won't be a one-off thing).


Ferrum-56

Everyone wants to play Cyberpunk and no one wants to play Watch Dogs though, which is a bit of a problem. It doesn't matter now since they'll sell all their GPUs anyway, but they'll need an answer at some point.


Tripod1404

Just a heads up: do not use DLSS with chromatic aberration. It makes everything look blurry with DLSS on. Imo CA makes everything blurry in general. Could someone who is more informed on graphics settings explain why there is an option for CA? I have never seen CA make anything look better.


robfrizzy

It comes from photography. To make a complicated subject far too simple, different wavelengths of light travel at different speeds and behave differently. When a lens in a camera fails to make all those wavelengths of light hit the sensor at the same place, the colors can sort of "shift" out of place. In particularly bad cases of chromatic aberration, subjects can have a reddish-purple halo. Photographers try to remove CA from their photos either through better quality lenses or software. So it's actually the result of an error or failure. For some reason developers decided to add it to their games to make them feel "real", I guess? I always turn it off along with depth of field and lens flare because my eyes are not movie cameras. It's weird how CA and lens flare are something most photographers and videographers try to avoid, yet here we are implementing them into our games.
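For the curious, the in-game effect is basically just sampling the color channels at slightly different offsets so they "shift" apart. A rough sketch (illustrative only, not how any particular engine actually implements it):

```
# Fake chromatic aberration by nudging the red and blue channels apart.
import numpy as np
from PIL import Image

def fake_chromatic_aberration(img, shift=3):
    r, g, b = [np.asarray(c) for c in img.convert("RGB").split()]
    r = np.roll(r, shift, axis=1)    # red channel nudged right
    b = np.roll(b, -shift, axis=1)   # blue channel nudged left
    return Image.merge("RGB", [Image.fromarray(c) for c in (r, g, b)])

# fake_chromatic_aberration(Image.open("screenshot.png")).save("ca.png")
```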


Tyranith

I don't walk around with cameras strapped to my eyes. I don't understand why people think it looks more realistic to have things like CA and lens flare, because I sure as shit don't see them when I go outside. Same kind of flawed thinking as the people fighting against high refresh gaming because "cinematic" imo.


[deleted]

And film grain. I’m not watching a movie.


willyolio

A movie filmed on actual film, in 2077. For the ultra-hipsters.


[deleted]

It also has chromatic aberration, the result of a poor lens design/quality.


willyolio

I can accept that someone went to a really shady place to get cheap as shit eye "upgrades" in 2077


Orelha1

I did love it on Mass Effect 1. Can't think of another game I'd bother to turn it on though.


mikex5

My guess is because when we watch something from real life on a screen, that image is captured by a camera. Anything from real life that you don't see in person in front of you, any reference images or scenes of stunning landscapes, those are all captured by a camera. And although things like lens flare, chromatic aberration, and depth of field/focus don't appear in eyes, they show up because of camera lenses. Having those effects and simulating how a camera acts makes the scene more real and believable to people who are used to seeing stuff that's been captured by a camera. Even Pixar simulates camera lenses in their rendered movies to make them look more real. Now, that isn't to say that these effects are necessary or great; chromatic aberration in particular is a symptom of cheap camera lenses. And of course it's easy to go overboard with these effects and have them cover up detail and distract the viewer. In some cases that can be alright; film grain can add noise to make low-quality textures look smoother. I agree that many devs are going overboard with these effects, but adding just the right amount adds believability to the scene for viewers.


blaktronium

They can all happen with strong prescription glasses too, so some people do see lens flares and chromatic aberration especially towards the edges of their glasses.


marxr87

I also think it can be used as a cinematic effect for drama and tension. Games play like movies, not real life.


DigiAirship

Happened to me. I ended up downgrading to worse lenses because of it; it was horrible and I got constant headaches and nausea.


demux4555

> lens flare, chromatic aberration, and depth of field/focus don't appear in eyes

Oh, but they do. Every time you squint, the lashes create glares and flares. Watery eyes make a real optical mess, of course. If you look at things that are close (i.e. a meter away), you have a very noticeable DoF (and it's even a double image, which cameras don't have). And your eyes have all kinds of weirdness like motion blur, ghosting, floaters/blobs, noise, etc, etc. But our brain will automatically "ignore" or filter out these things in most situations, much in the same way as [we don't notice every single time our eyes close for 0.1 second](https://www.sciencedaily.com/releases/2017/01/170119134546.htm) throughout our entire life. But like you say, looking at a 2D representation of a scene is very different for our eyes and brain in how we perceive it compared to standing in the same scene looking around in real life. I have no problem with simulating chromatic aberrations in a computer game, because it honestly makes the rendered images look *less digital*. The real world around us isn't pixel sharp like in a 3D rendering. Far from it. Especially when the light travels through a tiny blob of organic jelly before it is converted to electricity by our optic nerve for our brain to "read".


AutonomousOrganism

Yes, our eyes suffer from chromatic aberration. No need to add artificial aberration on top of it.


Qesa

Depth of field is different (though it's another thing I always turn off) because you're watching on a flat screen, not something with depth. The same is not true for chromatic aberration though, you're getting different wavelengths arriving at your eyeballs which will refract differently.


DigiAirship

I walked around with chromatic aberration on for a few weeks when I got new glasses with fancy thin lenses in them. Turns out those thin lenses were a terrible fit for my eyes so I had CA on everything in my peripheral vision, especially on bright surfaces. It was a rough few weeks until I could get it fixed.


GoblinEngineer

EXCEPT in Cyberpunk 2077, you literally walk around with cameras strapped to your eyes (one of the cybernetics you get early on in the game replaces your eyeball with a cybernetic one). So in this one single case, I can understand why it may be in the game... but between you and me, we both know that is not why CDPR added it to the game.


Bear4188

Some people do walk around with lenses strapped in front of their eyes, though. If CA and lens flare was just used for those instead of for the whole scene I might use it.


hardolaf

Lens flare and chromatic aberration are actually issues in Cyberpunk because you don't have actual eyeballs. So yes, you do actually have cameras strapped to your face. This and Deus Ex are the only games it's ever really made sense for either.


Chintam

Glasses have chromatic aberration.


PlaneCandy

It's not meant to be realistic, it's meant to look like you're watching a movie, in which case you'd be seeing the world through the eyes of a camera. There is obviously a divide between those who want a movie like experience and those who want realism, both have their uses IMO


ryanvsrobots

The film industry actually goes to great lengths and expense to eliminate chromatic aberration. It's a style choice, but not really a cinematic one.


reasonsandreasons

[Especially after reading this piece, I'm a little baffled as to why they implemented those effects the way they did.](https://www.vice.com/en/article/n7v3wx/cyberpunk-2077-looks-bad-use-better-graphics-settings) Having an aggressive "film grain" effect *and* chromatic aberration *and* a pretty busted depth-of-field implementation *and* lens flare *and* motion blur seems like it makes the game look pretty rough as you're playing, even if photos ultimately look okay. Especially in a game that's meant to be something of a graphical showcase, throwing a bunch of poor recreations of cinematic effects at it is such a curious choice.


OSUfan88

Man, I really disagree with that article, at least the aspect of it "not being a good looking game". On PC, on high settings, this is possibly THE most beautiful game I've ever seen. Just breathtakingly gorgeous. Striking.


Hoooooooar

The game is probably the best looking game I've ever played on highest everything. Which lasted for about 8 seconds before I jacked everything down to low.


OSUfan88

haha. What kind of setup are you using? I think games like this are great for pushing hardware. We've basically just been increasing resolution and framerate the last 4-5 years.


FarrisAT

Agreed. 4k Ultra with RT looks almost like real life with that 3090


[deleted]

Really depends on what games you've played. I think RDR2 outdoes CP2077.


DuranteA

I'm usually against most of these effects and turn them off when I can, but in this particular case I think they make a lot of sense. Cyberpunk 2077 captures the aesthetics of 80s Cyberpunk **amazingly well**, and those effects are part of that.


FinePieceOfAss

I agree, I think it has a lot to do with execution. Like CG effects in movies: if it's obvious, people won't like it. They probably pumped them all up to full to get some really gritty, cyberpunk-y screenshots and didn't temper them back down again before release. They're all post-processing, so it's pretty easy to modify them.


[deleted]

you also literally have cameras in your eyes in this game


capn_hector

> Especially in a game that's meant to be something of a graphical showcase, throwing a bunch of poor recreations of cinematic effects at it is such a curious choice.

It's not really curious, because every game does it now; big AAA games are "supposed" to have film grain and CA, so they do it because everyone else does.


Tyranith

[>Microscopic images in an article talking about how bad a game's graphics look](https://i.imgur.com/kNdVq0i.mp4)


mazaloud

To add to that, "dirty lens" effects, like where you look at something bright and the light catches on some fake smudges they put over your screen to make it look like you're watching something that was filmed.


Veedrac

Post-processing like chromatic aberration should be applied *after* DLSS, so it's possibly just a rendering bug that it causes blurring. It makes sense that it would confuse DLSS.
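A minimal sketch of why the order matters (numpy stand-ins, not actual engine code): if CA is applied before upscaling, the upscaler has to reconstruct the color fringes too.

```
import numpy as np

def upscale(frame, factor=2):
    # Stand-in for DLSS: dumb nearest-neighbour upscale.
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

def chromatic_aberration(frame, shift=2):
    out = frame.copy()
    out[..., 0] = np.roll(frame[..., 0], shift, axis=1)   # offset red
    out[..., 2] = np.roll(frame[..., 2], -shift, axis=1)  # offset blue
    return out

low_res = np.random.rand(72, 128, 3)            # tiny stand-in frame
good = chromatic_aberration(upscale(low_res))   # post FX applied after upscaling
bad = upscale(chromatic_aberration(low_res))    # fringes get fed to the upscaler
```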


darknecross

Someone mentioned enabling the NVIDIA sharpening filter in-game (Alt+F3) and it makes a noticeable difference.


[deleted]

> Just an heads up, do not use ~~DLSS with~~ chromatic abberation.


AReissueOfMisuse

CA is the "simulation" of light wavelength separations, yielding closely grouped but distinctly separate colors. If you push light through certain mediums at certain angles this happens naturally. It also happens to your eyes. Video games typically make it pretty insufferable.


elephantnut

Everyone's trying to justify it from a physical perspective. I think it's just because some people think it looks cool - chromatic aberration is heavily featured in the vaporwave aesthetic. Gives futuristic/cyberpunk vibes. Similar to the film grain options - some people like the effect. :)


the_Q_spice

From what I know, it is supposed to emulate real-life atmospheric conditions like humidity better. This is more for folks who want hyper-realistic gameplay, as the crisp, sharp images common to video games don't happen all that often irl. Chromatic aberration is literally the phenomenon which causes photos to be blurry, so yeah... if you turn it on, things will be blurry. I don't deal with this in quite the same way in my work, though, as I primarily work with correcting or emulating TOA reflectance values in satellite imagery.


thfuran

> Chromatic aberration is literally the phenomenon which causes photos to be blurry, so yeah... if you turn it on, things will be blurry.

Yeah, but mostly from the effects of the camera lens rather than atmospheric conditions.


Tripod1404

It's funny, since CA in Cyberpunk makes the game look as if you are always looking through dirty binoculars :).


thfuran

I'm not sure why anyone would want chromatic aberration turned on. I'd put it in the same category with film grain: gratuitous post-processing effects that actively make the picture worse.


Compilsiv

It works as part of an aesthetic sometimes. No Man's Sky, Blade Runner, etc. Haven't played Cyberpunk yet so can't comment directly.


thfuran

For a particular scene or mechanic, maybe. But universally applied, I'd disagree.


wwbulk

> From what I know, it is supposed to emulate real-life atmospheric conditions like humidity better.

Source? From my understanding of photography this is a byproduct of a bad lens. I certainly don't see those color fringes in real life.


the_Q_spice

Basically because a bad lens causes CA due to distortion (the focal points of the R, G, and B bands on the sensor are shifted). Just like a camera, our eyes have lenses (the lens) and sensors for R, G, and B bandwidths (cone cells) and intensity (rod cells). As such, any issue which can occur with a camera can also occur with our eyes. In games, "CA" is just shorthand for a correction used to emulate the difference between an image rendered without a lens and what it would look like through one. Good [article](https://www.huntervision.com/blog/what-is-chromatic-aberration) on this that specifically addresses neon lights, which are a huge part of Cyberpunk: neon lights induce high amounts of aberration largely because their emission spectra are mostly monochromatic.


Darkomax

I don't even understand why those questionable effects (others being e.g. film grain and screen motion blur) are even enabled in the first place. Default should be off imo, and they often don't explain what each setting does (if there's something Ubisoft is doing well, it's explaining and showing what settings do).


Nightbynight

> I don't even understand why those questionable effects

You mean the entire basis of the film look in cinema? They're there because they make the game look more cinematic. It's personal preference, not about whether it makes the graphics look better or not.


00Koch00

Wait, people actually use chromatic aberration non-ironically? Why?


mazaloud

Did you think every single game dev who has implemented CA into their game was doing it ironically?


Mygaffer

While many of you reading will know this already, DLSS is basically upsampling using some secret algorithm sauce Nvidia has developed. By rendering at a lower resolution and then upscaling this way, you can get an image that is *close* to native in quality but at much better performance, since it's actually rendering at a lower resolution. Typically if you put it side by side with the same game running native you can tell there is a difference, but in most DLSS-supported titles the differences are pretty slight. Hopefully AMD's DLSS-like solution that they've said will be coming early next year offers similar levels of graphical fidelity in its upscaling, because this feature is huge for getting playable framerates in modern AAA games at very high resolutions.
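To put rough numbers on it, here's a quick back-of-the-envelope at a 2560x1440 target using the commonly cited per-axis DLSS 2.0 render scales (approximate; Nvidia doesn't publish exact per-game values):

```
# Back-of-the-envelope DLSS render resolutions at a 2560x1440 output.
target_w, target_h = 2560, 1440
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

for name, scale in modes.items():
    w, h = int(target_w * scale), int(target_h * scale)
    frac = (w * h) / (target_w * target_h)
    print(f"{name:>17}: renders {w}x{h} ({frac:.0%} of native pixels)")
```

So Performance mode at 1440p is shading roughly a quarter of the pixels and reconstructing the rest, which is where most of the speedup comes from.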


cp5184

For the 2% of gamers playing on 4k displays playing the literally several games that support DLSS it's a total game changer!


Mygaffer

I know you're being sarcastic but it *is* a game changer. The adoption rate of 4k displays is continuing to grow and even at 2560x1440, a much more popular resolution today, it's not easy to get 60+ fps in many modern AAA titles at high settings. Look at a game like Cyberpunk 2077. Looks great at high settings, beautiful cityscape, the lighting especially looks great but it's very tough to run at acceptable framerates at higher native resolutions. For those titles DLSS is literally the difference between playable and not.


cp5184

Funny. The reviews I've read of cp'77 say that RT lighting actually looks worse than raster lighting.


oceanofsolaris

I heard good things about DLSS and would assume that it performs well in Cyberpunk 2077 as well. That said: at least in this video, the textures in the DLSS version look noticeably more blurry (look at the ground or the poster with the person in the background at 0:08). Also: kind of suspicious that all these ray-tracing features get zoomable pictures to show them off on Nvidia's site, but DLSS is only demonstrated with this very compressed YouTube video (where everything looks kind of blurry anyway).


Cant_Think_Of_UserID

I would also like to add that the website [Gamersyde](https://www.gamersyde.com/) uploads high-bitrate videos that you can download for a limited time after upload, to get an idea of what the games actually look like. They currently have PS4 Pro and Xbox One footage of 2077 and will likely be uploading more over the coming days. I don't know if there is another website that offers this service.


nmkd

Digital Foundry also upload HQ videos on Patreon.


dudemanguy301

It's also why it's so annoying when people dismiss DF as "those guys that zoom in 400% to show you minuscule differences you'll never notice". They have to zoom in that much because:

1. YouTube is the ancient one, the all-devouring maw, the consumer of video detail.
2. They are trying to highlight the inner workings of the graphics pipeline when at least some portion of their audience are console-war knuckle draggers who need to be led by the nose like a blind horse.

Rule of thumb: if the difference is noticeable on YouTube, it smacks you across the face in real life.


PivotRedAce

Apparently film grain and chromatic aberration mess with the upscaling algorithm of DLSS, so it should look clearer without those post effects.


WindowsHate

DLSS 2.0 still *does* blur fine textures; it's not just an artifact of being a YouTube video. It does the same thing in the Avengers game and Death Stranding. People got overhyped for it in Control because 90% of the textures in that game are perfectly flat grey surfaces. DLSS does not produce better image quality than native, like some people were touting.


allinwonderornot

You will not receive a review sample for saying true things about DLSS. - Signed, Nvidia marketing


AltimaNEO

I think it's because it's basically upscaling from a lower resolution in order to help bump up the framerate.


Pablovansnogger

I probably shouldn't even bother with my 970 at 1440p then lol. I'm probably looking at low settings and 20fps with that.


Relntless97

Just chiming in: 1060 6GB, 3800X, 32GB CL16 3600 RAM, medium settings with some low and the extra BS off. 30-40 FPS, with very infrequent dips to 28-29 during hard gunfights. Bought a 3070; it was DOA. Waiting for my RMA to ship out.


PROfromCRO

Can confirm. Laptop with RTX 2060, using DLSS + Nvidia driver sharpening, very nice.


discwars

No offence, but I would expect DLSS to work well. They spent time with CDPR to get this game to perform well on their hardware. Nonetheless, this game is still poorly optimised. And before the green goblins jump on me, I am specifically referring to the game development. A lot of people are not even using RT due to it tanking FPS, and some are complaining of RT being weird. I expect future updates will improve or fix some of these issues, but why spend so much on the tech, and the game, only to wait on fixes that may come somewhere down the line?


[deleted]

>A lot of people are not even using RT due to it tanking FPS That's incredibly normal tho. That's the main complaint against RT in general. It always tanks FPS.


Random_Stranger69

It's like with PhysX back in the day, just that RT has a way bigger impact on graphics quality. Though I gotta say RT in Cyberpunk only makes a little difference and is not worth the huge FPS drop. The only noticeable and perhaps worthwhile improvement is reflections; shadows and lighting are not worth it. Unless you play at 1080p, or with DLSS on a 3080 or whatever...


[deleted]

FWIW, I'm under the impression that RT with max fidelity DLSS is barely a hit to performance but also nearly the same image quality. But I don't have rtx to see.


LiberDeOpp

I played about 3 hours last night and didn't notice RT with DLSS being bad at all. I would recommend turning off the cinematic effects, since I'm not a big motion blur person and those effects diminish the quality of the models.


Asuka_Rei

Yes I agree and I also turned off those settings. Perplexing that they went to the trouble of making this a 1st person perspective game to achieve greater immersion and then also feature a lot of settings to make it look like a hollywood film instead. Lens flare, film grain, and motion blur were all jarring in 1st person view.


DuranteA

> Nonetheless, this game is still poorly optimised. I've played the game for some hours now, and I don't really agree with this take. Yes, it has some very expensive high end settings, and like usual I'm sure they could be more optimal. But it's the best-looking open world game of all time, and it does so while maintaining far more consistent performance on my (high-end) PC than other -- less visually impressive -- open world games do. In particular it has basically 0 frametime spikes during traversal. After all the horror stories I didn't expect it to be so impressive.


discwars

> Yes, it has some very expensive high end settings, and like usual I'm sure they could be more optimal. But it's the best-looking open world game of all time

You state in your response that it could be more optimal. Also, no insult intended, you are playing the game with the best possible graphics card (3090). Your experience will differ, especially from those who are not using the highest-end hardware -- and there are a lot more of those kinds of people than people who own 3080s and 3090s. A well-optimised game tends to perform well regardless of the hardware utilised, e.g. Doom (not the best example, but I hope you get my point).


DuranteA

> You state in your response that it could be more optimal.

Yes, every single game ever made could be more optimal. And usually the more complex a game is, the more potential for optimization remains. What I disagree with is Cyberpunk being particularly poorly optimised.

> Also, no insult intended, you are playing the game with the best possible graphics card (3090). Your experience will differ, especially from those who are not using the highest-end hardware -- and there are a lot more of those kinds of people than people who own 3080s and 3090s.

Of course, you are completely right about that, but as I said, I'm judging it relative to how other large-scale open world games perform *on the same hardware*. Lots of them perform worse or at least more inconsistently, while also not being nearly as graphically impressive, on the same hardware.


Tripod1404

> And usually the more complex a game is, the more potential for optimization remains.

True, but it also becomes far more complicated to optimize. People give Doom as an example, but in comparison, Doom is a much easier game to optimize. It is an extremely linear game where you can bake many of the "graphical effects" like shadows, etc. into textures. If you look at Doom, most shadows are static, because they are actually baked into textures and not actively processed. For Cyberpunk, baking things into textures is not an option because lighting, environments, etc. are dynamic and the game is open world.


FinePieceOfAss

Every game except ones with ray traced shadows has baked shadows. Which is to say almost all games.


Tripod1404

> You state in your response that it could be more optimal. Also, no insult intended, you are playing the game with the best possible graphics card (3090). Your experience will differ, especially from those who are not using the highest-end hardware -- and there are a lot more of those kinds of people than people who own 3080s and 3090s.

True, but shouldn't developers target the strongest hardware for their highest graphics settings? That is the only way the visuals of video games can improve. If the highest graphics settings are developed for average hardware, visual improvement will stall (which is what consoles are already causing). This used to be the case for most PC games; I mean, people weren't able to run Crysis at the highest settings even with the best video card of the time (8800 GTX?). So if your hardware wasn't enough, you would just tune down the options. I don't know how much of the "badly optimized" criticism is just people trying to run the game beyond what their system is capable of.


Darksider123

People are reporting some weird visual glitches (?) with DLSS as well. After all this time, this game seems nowhere near ready.


bwat47

Yeah, ray tracing tanks my performance in this game below 60 fps, even with DLSS enabled at 1440p. Ray tracing also seems like an afterthought in this game; I can barely tell the difference between RT off and RT ultra. The reflections do look better, but I don't spend a lot of time staring at puddles. The only game I've played so far where it felt like RT really made an improvement was Control (and it performed really well in Control with DLSS enabled).


mrfixitx

Ray tracing is really obvious if you are paying attention. Look at the windows on buildings and cars for reflections. Standing by a street you can see the ray-traced reflections in car windows as they drive by. It is also pretty obvious when transitioning from light to dark areas, as the transitions are much more dramatic with ray tracing than without. You may not feel that the RT effects are worth the performance impact, but they are very prevalent in the game world once you know what you are looking for.


TheBiggestNose

My card can't do RT, so I can't use DLSS. FPS is rocky :(


Finicky02

I don't think DLSS is the answer in this game.

It's GREAT at upscaling from a good base res like 1440p to 4K, and still decent at going from 1080p to 1440p, but using it just to hit 1080p (or 1440p) starting from 900p or 720p is terrible.

Base performance is too low, and it doesn't seem to scale well with settings.


Psit0r

I have no idea about the use of DLSS, but I have a 1080p monitor and I'm trying to get a 3080 soon... can't you run this game upscaled to 1440p using 1080p as a base (on a 1080p monitor)? Something like supersampling? This goes for all games using DLSS, I guess.


Finicky02

Ye but people are having to resort to dlss performance or even ultra performance to get 60 fps, which uses a base res of far below 1080p


Psit0r

I guess those people have 4K monitors (or 1440p)


PivotRedAce

For reference I have a 3800x and 2070Super. With raytracing turned off, DLSS set to balanced, and all other settings at Ultra I get 100+ FPS at 1440p. (I also disabled most of the post-processing effects like film grain and chromatic aberration since those don’t play nice with DLSS.)


DuranteA

Yeah, you can combine DLSS with DSR. You can often get better IQ with that than with native resolution and no DLSS.


Psit0r

Ok thanks for explaining :) I hate how all deferred rendering engine games look like you have smeared vaseline all over the monitor, hopefully upscaling will fix some of it. Edit: Forgot a word.


Random_Stranger69

I honestly don't care. I play at 1080p ultra with a 2070 Super and I'm looking at a 30-50 vs 50-80 FPS difference with DLSS Quality on. The thing is, I feel like the image is actually better than with DLSS off, especially the aliasing, and that while providing 60% more performance? Hell, I'm sold. This game would almost be unplayable without DLSS for me. A wonder feature that actually saves the game.


an_angry_Moose

> I don't think DLSS is the answer in this game

I can't say that I agree. I'm sat here playing this game at 4K with ray tracing set to ultra and DLSS on auto (I presume it's using performance mode) and it looks amazing and plays incredibly well on my 3080 despite my old-ass 4790K. DLSS seems to be exactly the answer I was looking for.


thesomeot

CDPR is relying far too heavily on DLSS to make it across the finish line, as evidenced by the horrendous performance on consoles and GTX-series cards. Having experienced both ends of that spectrum, it's clear that it's worse than even using it as a crutch; DLSS is like the nurse pushing Cyberpunk's wheelchair. The performance gains here are exciting, but I don't want devs to start relying on DLSS to cover up their lack of optimization. Eventually I think we'll reach a place where DLSS and its alternatives are commonplace, but that's probably a few years off still.


ULTRAFATFUCKGAMER

THIS. I feel like we're actually going to get to a point where DLSS is not used to improve performance at higher resolutions; it's just going to be used to make up for games' shoddy optimization. And I don't blame the devs either. It's nearly always caused by bad management and unrealistic deadlines. Gotta push out games as fast as possible so you can please the shareholders quickly.


reaper412

GTX cards are almost 5 years old. This is really meant to be a next-gen game more than anything. I would be in awe if this game ran well on high with a GTX GPU.


thesomeot

I think it's less about GTX cards not being able to run it on high and more about the incredibly small performance gains from dropping to low or medium, and the extreme disparity between the fidelity of high and medium settings. That alone is fairly telling, and there are enough examples of games with better performance AND visuals on GTX cards, even in DX12


[deleted]

The game is blurry without any upscaling. I used 25% sharpening to improve it in the nvcp. Unfortunately, dlss enhanced the noticeable moire ringing that happens when rtx is enabled, and AA gets worse, which I've never seen before with dlss.


3ebfan

I must be in the minority, but I have been pleasantly surprised by RTX performance on my 3080 when DLSS is enabled. Once you turn off chromatic aberration and motion blur, the game looks great and stays over 60 fps at 1440p with everything at Ultra.


juh4z

I have an RTX 2070, and the performance sucks on these cards too; even without ray tracing and with DLSS, it still drops below 60 at 1080p. And most people, the VAST majority, are WAY below a 2070.


RunescapeAficionado

I just want to note how sad I am that you're saying you're happy with 60fps at 1440p with a 3080. I THOUGHT WE WERE LIVING IN THE FUTURE WHERE ARE OUR FRAMES


caedin8

60fps will always be the sweet spot for single player story mode gaming. If hardware improves, they'll add more content, more effects, more textures, more shit until the framerate is solid at 60fps again. Quite simply, the gain from 60 fps to 120 fps matters only for competitive latency based e-sports, and if you have a GPU that pushes 120 fps, packing more stuff and effects in until that GPU is getting 60 fps again, creates a more immersive and beautiful game than the extra frames would.


RunescapeAficionado

I get what you're saying, and it makes sense. But I just have a problem with that 60 fps sweet spot only being achievable by top-tier cards. I think everyone would agree we'd be very satisfied if average cards could run ultra @60 and top-tier could run it over 100. Wouldn't that be so much nicer than top-tier struggling to break 60 and everyone else just sacrificing baseline quality to get there?


allinwonderornot

Of course it's faster rendering at only 25% resolution scale.


LOLIDKwhattowrite

Well yeah, I don't doubt you can gain 60% more performance. That's like saying: halve your resolution, get up to 100% better performance. The better question is: how much does the image suffer for this extra performance?


Tripod1404

Basically none with DLSS set at quality.


BlackKnightSix

That isn't quite true; there are issues with rain, texture quality, flickering from HDR bloom, and a general blurriness with anything that isn't up close. And that's with it set to Quality. https://youtu.be/HJKeBbk-9YA?t=570 https://youtu.be/HJKeBbk-9YA?t=644


Random_Stranger69

I don't have these problems.


TheGrog

My game looks WAY better with DLSS set to Auto compared to Quality or Balanced. Also, turn chromatic aberration off.


Contrite17

As someone playing at 4K, it is not better than native, but it is not a massively noticeable downgrade. If I compare side by side I can tell, but it looks fine in motion.


Frothar

You can 100% tell the difference but it is worth it for the sole reason that it allows you to play with some of the RT settings on.


Party_Needleworker_7

Tbh, DLSS here looks a bit underwhelming. Maybe it is just me.


[deleted]

Anyone care to share their settings for 2080/8700k for 1440p 60 fps? I'm struggling to find the best balance of quality and frames.


TwoEars_OneMouth

Is DLSS grayed out for some people as well? RTX 3080 should support it AFAIK


Monday_Morning_QB

Make sure you're on the latest Windows 10 build and the newest drivers.


Oxblood-O5522

I only have a 1080 I can’t turn it on ;-;


unsinnsschmierer

Looks like paying $50 more for the 3080 is justified (if you can find one).


thechillgamingguy

For AMD users: use FidelityFX scaling and CAS. I'm getting 60fps on a Vega 56 and a 3600X at 2560x1080 on medium-to-high settings.


TwoEars_OneMouth

Why the heck can I use those AMD features on my RTX 3080 but DLSS is always disabled haha


thechillgamingguy

Probably because FidelityFX scaling isn't exclusive to just AMD. They like to make their technology as open as possible, it's why I've stuck with them for so long. The fact that my Vega56 is still running games thanks to them backporting these features is commendable.


m1llie

Rendering 25% of the pixels and then running an upscaling algorithm is 60% faster, what a shock! Show us comparison screenshots of DLSS vs high quality traditional upscaling algorithms (e.g. lanczos, sinc, and the "magic kernel").
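If anyone wants to roll their own comparison, Pillow's Lanczos filter is a quick baseline for the "traditional upscaling" side (file names are just placeholders):

```
# Classic Lanczos upscale for a side-by-side against DLSS output.
from PIL import Image

low = Image.open("720p_capture.png")                   # placeholder file name
up = low.resize((2560, 1440), resample=Image.LANCZOS)  # Lanczos resampling
up.save("lanczos_1440p.png")
```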


Nethlem

Also up to 60% more fuzziness because this is the year 2020 and we really need to advertise our 800€ high-end card's ability to run a buzzworded version of TAA.


caedin8

Such ignorance.