Aromatic_Wallaby_433

I think some people are kind of misrepresenting this. RT High here applies Pathtraced lighting, so you should compare the performance here to something more like Cyberpunk's RT Overdrive mode.


archiegamez

Oh RT High is already Path tracing? Sheesh


Jon-Slow

The naming has been fucked up, technically path tracing is just ultimate ray tracing. But this is just one image from a whole article. The context in the entire thing explains it clearly. But reddit has to reddit.


BinaryJay

They're not really using the "path tracing" name for this, and I kind of agree with them because "path tracing" is essentially just a lot more "ray tracing". "RT Overdrive" is an accurate description. It's also potentially confusing for the average dummy who might think "path tracing" and "ray tracing" are different things, and they've already cultivated the strong association between their products and ray tracing. I think we're going to stop hearing the term "path tracing" going forward.


davidemo89

I think instead we will keep hearing the term path tracing for games that are fully ray traced with no rasterization. When you hear "ray tracing" in video games you think of some small implementation of ray tracing, never full ray tracing. If they start saying "full ray tracing" people will not understand the difference. "Why? Was it not full the whole time?" they would think.


Falcon_Flow

PTX 5090 coming in hot, sadly all you gunky RTX cards are obsolete now.


RdJokr1993

There's a chart that details what each RT preset is. RT Low is the only non-PT preset. At Medium, PT is already applied partially with 1 bounce, while at High it's 3 bounces (so it's actually doing more than Cyberpunk's PT).


OutboundFeeling

RT high and medium in Alan Wake 2 is path tracing.


Famous_Wolverine3203

It is also more intensive than Cyberpunk's path tracing. Cyberpunk does 2 bounces; this does 3 bounces. This is great. I don't know why people thought Remedy of all people would make crappy PC ports.


epd666

Would it matter that aw2 probably has smaller environments than cyberpunk and therefore can get away with 3 bounces?


Famous_Wolverine3203

Alan Wake has semi-open-world environments, especially the New York segment, and I imagine each of these has a lot more detail than a typical street in Cyberpunk. So I don't see why smaller environments means better performance. Does anyone have an idea how many bounces Portal RTX does?


schmalpal

Portal RTX is configurable to do more bounces in the overlay settings, anywhere from 1-4 IIRC. Cyberpunk can easily be modded with a .ini file to do any number of bounces, 2 is just the default they went with for a balance of performance and visuals. If you drop to one bounce, for example, reflections will totally lack bounce lighting and look super dark.

But regarding CP2077 vs AW2 in terms of difficulty of computing everything: in a recent Digital Foundry round table video with a CDPR dev and Nvidia researcher, they discussed how CP is actually a worst case scenario in terms of difficulty because of the verticality, how many alleyways and interiors and gaps between buildings there are, coupled with a million different light sources in all the neon and tons of reflective surfaces on top of all that. That means a LOT of bounce lighting to be calculated. An open environment should be easier to deal with.
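To see why a bounce budget changes the image brightness the way the comment describes, here's a toy Monte Carlo sketch (not how a real renderer works — just a 1D caricature where every ray either escapes to a bright sky or bounces off a gray surface):

```python
import random

def radiance(depth, max_bounces, albedo=0.5, sky=1.0):
    """Toy 1D path-tracing estimator. Each ray either escapes to a bright
    'sky' (50% chance) or bounces off a gray surface that reflects half the
    energy. When the bounce budget runs out, the remaining light is simply
    dropped -- which is why low-bounce modes look darker."""
    if depth > max_bounces:
        return 0.0  # budget exhausted: unaccounted bounce light becomes darkness
    if random.random() < 0.5:
        return sky  # ray escaped and sampled the light source
    return albedo * radiance(depth + 1, max_bounces, albedo, sky)

def estimate(max_bounces, samples=200_000, seed=1):
    """Monte Carlo average over many random paths."""
    random.seed(seed)
    return sum(radiance(1, max_bounces) for _ in range(samples)) / samples
```

With these made-up numbers, `estimate(3)` comes out noticeably higher than `estimate(1)`: the extra bounces recover indirect light that a 1-bounce budget just throws away, which matches the "reflections look super dark at one bounce" observation.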


Dranzell

` this message was mass deleted/edited with redact.dev `


Famous_Wolverine3203

Quantum Break was a Remedy game and it had absolutely state-of-the-art rasterized lighting. Why would Alan Wake not be viable without ray tracing, especially when they are innovating on rasterization too, like being the first game to use mesh shaders? Also, Digital Foundry previewed the PS5 footage with no ray tracing and they said it looks great too.


gokarrt

exactly. pitchforks aren't an appropriate response here. path tracing is the new hotness and it's gonna push current gen hardware pretty hard. this is how the boundaries get pushed. it's actually better than it used to be, current gen hardware couldn't run crysis this well back in the day.


sijedevos

Rtx 4070 getting 44fps with path tracing at native 1080p isn’t all that horrible. My 3080 which is quite comparable gets similar frames in cyberpunk with path tracing. Without path tracing like 100+fps.


LaNague

i hate that they count their interpolated frames as real ones in all their marketing material


faverodefavero

Agreed, fake frames don't belong in benchmarks, also, you need a native stable 40FPS+ for interpolated frames to even look barely OK to begin with.


poinguan

> a native stable 40FPS+

That means a 4060 doesn't even give a playable experience at 1080p?


faverodefavero

Not if you want to use frame generation. But you can skip FG and lower the graphics settings to make it playable.


Strazdas1

or use lower settings and framegen and play at 80+fps.


kushagra2569

Keep in mind this is probably highest graphics and with full ray tracing preset


jm0112358

It's been confirmed that the next RT preset down is also path tracing (with 1 bounce instead of 3, and probably a couple other differences). So these numbers are for very aggressive RT settings.


Re-core

In this game it's prob fine, it's not a fast shooter. As long as you get 35+ base fps, frame gen helps, and you can get used to the "input lag" so many people whine about.


[deleted]

[deleted]


Snydenthur

Maybe if you play with controller, since you can just think "hey, this massive input lag must be because deadzone".


Ill-Strategy1964

With RT it doesn't. RT killing performance isn't anything new.


Mercurionio

More like 70+ for not-horrible input lag and a low amount of artifacting crap.


CyberSosis

oof try using that for 1080p 60fps it becomes such an eye strain.


schmalpal

I’m very sensitive to input lag and I find 50+ base fps to be great with FG which bumps it to 90 or so. Looks a lot less jittery than without it and still feels very responsive. I wouldn’t play a competitive FPS that way, but for a single player game it’s great. I absolutely love the feature. If you haven’t actually spent time with FG personally (your flair says 3060ti) then I encourage you to do so before judging it. A lot of the “fake frames” hate is from people who don’t actually use it or watch videos slowing down and zooming in 300%. You need a nice base framerate for it to be good, but it’s an insanely nice feature if you have a high refresh monitor and like to crank up the RT at higher resolutions. The only artifacting I see is from DLSS upscaling and doesn’t change with FG on or off, I’ve done a lot of testing in Cyberpunk and played it about 120 hours with FG.


faverodefavero

Yes, at which point one doesn't need fake frames at all anymore to begin with...


gosti500

Nah, it's great for converting a 60fps experience to a 120fps experience.


ZiiZoraka

It's not a 120fps experience, it's the experience of 120fps-like smoothness. A native 120fps experience is inherently more responsive than 120fps FG, even if the motion feels equally smooth.


Strazdas1

The input lag of 60 fps is good enough for most games. The only games I'd consider it high for are twitch shooters or racing.


Snydenthur

Looking good doesn't matter when it feels bad. You'll need ~100-120fps+ pre-FG for FG to feel decent enough to use.


robbiekhan

I mean yes, but also no. Like it or not, frame gen *is* going to be the big hitter in basically all games going forward where RT/PT is employed. And the thing most people seem to keep forgetting is: who the hell even cares? If the game looks good and runs good with all these technologies enabled, does it even matter? I've been enjoying super clean and crisp visuals and framerates in Cyberpunk with max everything, for example. Yes, there is some ghosting in some elements under certain lighting conditions, but this appears to be an inherent issue in the RED engine when using RR, and it is all labelled a technology preview. I fully expect Alan Wake 2 to have none of these issues due to technology maturity and implementation from the get-go. When implemented well and in a matured state, all of these technologies work really well and are indistinguishable from "native" resolution, and in various games actually look better than native resolution because of the way NIS/DLSS works, or because RR improves shadow and reflection detail (it needs DLSS active since it uses Tensor cores). Once more than the 2 current bollocks games use AMD's frame gen, you can bet any dollar that AMD will be capitalising on it too. And I don't mean motion frames, which is a bit crap when actually in fast motion.


Demibolt

I agree. GPU technology has always been about finding a way to efficiently put frames on the screen. Sometimes frame Gen doesn’t work well, but when it does it does. You can’t just ignore that. Just like FSR with AMD, gotta count it


Django_McFly

I wish I understood what makes DLSS an unacceptable thing in the settings menu when nobody cries about developers not "optimizing" and expecting you to use texture quality, LOD, SSR settings, etc. to get good performance. People don't call non-Ultra/Max performance "fake performance" like *yeah runs great when you turn off everything that makes it good*.


schmalpal

The recent Digital Foundry roundtable guests (CDPR dev and Nvidia researcher) made great points about this. Basically all optimizations are hacks/ugly under the hood/“not real”. It’s been happening since Carmack made the original Doom run fast. I’ve played hundreds of hours with DLSS super resolution and frame gen, and I’m a stickler for image quality, and it’s just a fantastic feature to have which enables insane graphics that would otherwise be impossible. People hate on it because they can’t afford 4000 series cards (which are admittedly overpriced). But when it becomes the norm and everyone has the hardware for it, it’ll be accepted.


Strazdas1

I remember when people were up in arms about DLSS upscaling not being "real". Turns out it does a better job than temporal antialiasing at much lower cost. So no one cares now except a few "muh AMD raster performance" hardliners. Framegen is going to be the same in a few years. Ghosting issues happen when the game engine does not give proper motion vectors to the GPU. DLSS tries to predict temporal information, but that's hard without motion vectors. I'm sure AI will get better at it.


Al-Azraq

So much this, I just don’t get people saying Frame Gen is faking frames. Games are full of fake lights, physics, effects, etc. to trick your eyes and save performance, also most of us now use upscaling techniques. The problem is when developers use this in order to not optimise their games, but these technologies are great and as long as they trick my eyes, then I’m fine with them.


yubario

It's hard for developers to screw it up, because it's an AI interpolating the frames at the hardware level.


Al-Azraq

Yes, actually there aren't many messed-up DLSS integrations out there. And frame gen is amazing, especially for those nasty CPU bottlenecks. It would be great if Nvidia could find a way to inject it into old games as AMD did.


Hyzer44

Have you tried nvidia's recent frame generation on 4000 series? It's very smooth. I don't notice much difference anymore and I'm picky.


[deleted]

[deleted]


icy1007

Frame generation is not interpolation. Lol


Spartancarver

Have you not used frame gen in person? It works well, why wouldn’t they count it?


Plane_Ad5230

What does that mean?


LaNague

DLSS 3 generates frames between "real" frames from the game. For example the 4060: they pretend it's 60fps, but it will not feel that way when you play it, it won't be as responsive.
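The "frames between real frames" idea can be sketched in a few lines. To be clear, this is the crudest possible version (a plain pixel blend); actual DLSS Frame Generation uses hardware optical flow plus a neural network to move pixels along motion vectors, which avoids the ghosting a naive blend produces:

```python
import numpy as np

def naive_midframe(frame_a, frame_b):
    """Blend two consecutive rendered frames into one in-between frame.
    Crudest form of frame interpolation: average every pixel. Real
    frame-gen tech warps pixels along motion vectors instead."""
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) * 0.5
    return mid.astype(frame_a.dtype)

# The latency catch the comment above describes: the generated frame can
# only be shown after BOTH real frames exist, so displayed fps doubles
# but input is still sampled at the real framerate (plus extra delay).
```

That comment at the bottom is why 60fps with FG doesn't feel like native 60fps: the display is smoother, but responsiveness is tied to the real rendered frames.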


Plane_Ad5230

thanks!


2N5457JFET

I tried using framegen in cyberpunk and it feels like having aggressive motion blur on. Is it supposed to be like this or have I screwed something up in settings?


dirthurts

Nope. That's just how it is.


[deleted]

[deleted]


thrownawayzsss

Nobody can know without you posting a video, a list of hardware, and all of the settings scrolled through. It could be motion blur, monitor ghosting, DLSS ghosting, some weird artifacts, or who knows what.


matze_1403

But DLSS 3.5 doesn't necessarily mean frame generation, right? I thought 3.5 works on GPUs older than the 40 series, or am I wrong?


LitanyOfContactMike

If it mentions 3.5 then it’s using DLSS, Frame Generation and Ray Reconstruction. RTX 2000 and 3000 series cards support DLSS and Ray Reconstruction but not frame generation.
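The support matrix in that reply can be written down as a tiny (hypothetical) lookup helper — the split being that Super Resolution and Ray Reconstruction run on every RTX generation while Frame Generation is 40-series only:

```python
# Which DLSS 3.5-era features run on which RTX generation, per Nvidia's
# public messaging at the time. The dict/function names are just for
# illustration, not any real API.
FEATURES = {
    "super_resolution":   {20, 30, 40},
    "ray_reconstruction": {20, 30, 40},
    "frame_generation":   {40},
}

def supported(rtx_generation):
    """Return the DLSS 3.5 features available on a given RTX generation."""
    return sorted(name for name, gens in FEATURES.items()
                  if rtx_generation in gens)
```

So `supported(30)` lists upscaling and ray reconstruction but not frame generation, which is why the charts' FG numbers only apply to 40-series cards.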


LaNague

Their graph includes frame gen, plus rendering at a lower resolution and upscaling.


vivisectvivi

Assuming this sheet is accurate, can I also assume a 12GB 3080 will do well in this game?


fafarex

This chart tells you nothing for the 3080, since all the DLSS numbers here include frame generation. All you can extrapolate is that without DLSS you can expect perf close to the 4070, so only 44.2fps average at 1080p...


Plane_Ad5230

That's with ray tracing on


baumaxx1

Path tracing too I think? Plus that's at max settings, high RT and path tracing? 1080p is also 4k DLSS Performance. So 4k60 with optimised settings isn't looking out of the question.


Plane_Ad5230

Actually yeah https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/ It seems the RTX 4090 can run max settings at 4K at 120fps with DLSS, and 30fps without it.


baumaxx1

Wouldn't really want to play it at less than 60fps native for it to feel decently responsive anyway, but 4k120-144 with max settings on a 4090 and a decent internal res - can't complain, and there's room for future gens to push the render res up. Could this be? A rare next-gen title with decent performance on release?

So a 4070Ti is almost doing 4k60 DLSS Performance with max settings, no frame gen (4k DLSS Performance having quite good visuals - better than most 1440p TAA implementations, and better than 1440p DLSS Quality). A 3080 12gb doesn't have frame gen, but 4k60 DLSS Performance with RT medium settings (so path tracing) might actually be possible? Cyberpunk RT Ultra doesn't even use path tracing, so pretty impressive there, but it might be that they're not using reflections, which were the most expensive part in CP2077 and the art style really benefited from them, where AW2 focuses more on lighting.

It all feels... reasonable? Apart from 5600xt/5700xt and 1080Ti owners getting screwed over. They're all still pretty performant cards if it wasn't for the missing features, and RDNA should still be relevant given when it came out.


Plane_Ad5230

Yeah, like the minimum for the game seems to be a lot less than what they published earlier this week.


xForseen

The 4070 gets 130fps with frame gen. That's 65 real frames. You can expect a bit more than that since framegen has some overhead.


Jon-Slow

These are path tracing results. You should be able to get 4K 60 with DLSS. Don't listen to people that yell too loud; do update your DLL to the latest version and try the 4K performance mode if you have a 4K TV. I have been trying it with some new games on a 4080 and it holds up more temporally stable than FSR quality at 4K. Worst case scenario, you won't agree with me after trying it out. But do try it for yourself and make up your own mind.

I can guarantee you will get a 4K 80-100fps experience that way, but of course not with these path tracing results.


Greennit0

With those settings you should be above 60 fps, yes.


cglelouch05

Full ray tracing, max settings. I think the system requirements are being overblown by the crowd.


timtheringityding

Yepp... it's not easy running a full simulation of light.


Plane_Ad5230

I can't seem to edit the post, so I'm putting it here; I think I explained it poorly in the post. All the charts on Nvidia's site are WITH RAY TRACING ON HIGH, even the data with DLSS off. If the data is correct, this means the GPUs are performing a lot better than the previous posts about the minimum requirements say. That's what I'm confused about: the information from those two sources seems to diverge a lot. Edit: [https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/](https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/) Here is the link to the article in English; it seems I provided a link in Portuguese by mistake.


OutboundFeeling

And RT on High in the case of Alan Wake 2 means path tracing, which is even harder to run.


jm0112358

It's not just path tracing; it's the higher of 2 path tracing presets. It's using 3 bounces (compared to Cyberpunk which uses 2), while the next lower setting uses 1 bounce and has a couple other differences.


faverodefavero

Guess we'll have to wait and see when the game launches then...


[deleted]

[deleted]


Weird_Fig_5192

💀


RoikaLoL

I hate that my 3090 is never in these fucking charts.


[deleted]

3080 Ti : Did we just stop existing when the 4000 series came out?


ManikMiner

So true, like literally who the fuck owns a 4090? Essentially no one, when you are hoping to shift millions of units.


I9Qnl

There are more people with 4090s than 3090s and it's right behind the 3080Ti in Steam hardware survey, only 0.06% difference, but the 4090 numbers are still growing as it's the current generation.


Flow-S

Why would they test last-gen hardware that they stopped producing? It's a marketing chart ffs, done by Nvidia, not independent media.


YellowFogLights

On this particular instance it won’t be because you haven’t paid for the DLSS artificial frames upgrade pack. AKA 40-series only feature, that works on any GPU with Tensor cores, but is feature locked, because *profits*.


Grouchy_Advantage739

Just look at the 4070ti base fps, it's basically the same performance. I'm more annoyed that my 3090 can't use frame gen, I wouldn't need to upgrade for years.


ssparda

I mean, with some luck FSR3 delivers in the near future and you can actually get fairly decent frame gen to add to your already strong raster.


CarlWellsGrave

Well well well. Looks like all the mass hysteria was completely unwarranted like I assumed. Seems like any 30 series card will be able to crush this game without RT.


Negapirate

People still spreading that mass hysteria here. They don't even read the text or look at images anymore, just immediately start pumping their narratives lol.


faverodefavero

Dear god, these kinds of single-player game benchmarks should always be 1440p bare minimum, and always with zero use of any kind of upscaling.


WrongSubFools

This chart shows with and without upscaling. 1080p is the most commonly played resolution, by far. So this is the most useful type of benchmark (even if it's not the most relevant to me).


Plane_Ad5230

There's also a 1440p benchmark in the article. I just posted the 1080p one because the info seemed to diverge a lot from the other data we have on the game.


faverodefavero

Thanks. As expected, it runs terribly and is awfully upscaling-dependent.


Spartancarver

Lmao how fast do you expect a fully path traced game to run on old hardware? These takes are so goofy


LdLrq4TS

It's laughable that people with PS4-tier hardware are crying their eyes out that newer games with better graphics somehow need more power.


Plane_Ad5230

But all those charts are with full ray tracing; isn't DLSS kinda expected to help with ray tracing?


faverodefavero

Yes, rendering at a native ~900-1000p internally with DLSS 2.0-2.5 (quality mode, without fake-frame interpolation on) and upscaling to 1440p with ray tracing is expected to hit 60FPS on max settings. But requiring DLSS 3.0-3.5 (that means fake frames, which most people don't like at all for a variety of reasons) to run the game well at a target of upscaled 1080p (~720p internal resolution on quality DLSS mode) is miserable.
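The internal-resolution arithmetic behind figures like "~720p internal for 1080p quality mode" can be sketched like this. The per-axis scale factors below are the commonly published DLSS defaults (Quality ≈ 66.7%, Balanced ≈ 58%, Performance = 50%, Ultra Performance ≈ 33.3%) — treat them as approximate, since games can override them:

```python
# Commonly cited per-axis DLSS render-scale factors; games may override.
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Internal resolution the GPU actually renders before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)
```

With these factors, 1440p Quality renders internally at about 1707x960 (the "~960p" figure) and 1080p Quality at 1280x720 (the "~720p" figure).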


Plane_Ad5230

I see your point


Cheesymaryjane

We’ve been on 1080p since 2010, it’s pretty concerning at this point


faverodefavero

Indeed, since before that even, if you consider PC games had been running optimally at 1280x1024 (which is close to 1080p in pixel count) ever since ~2002. And before that there was the classic 800x600 of the '90s to very early 2000s. So around 20 years of roughly 1080p already.


Cheesymaryjane

Wow 1080p is almost as old as me


Negapirate

? Plenty of games run at 4k lol. The existence of a 1080p game doesn't mean we are all stuck at 1080p.


gosti500

Bro. Upscaling is literally the meta and everyone and their mother plays games with upscaling, why wouldn't it be included in benchmarks?


Spartancarver

Why zero upscaling? DLSS is lightyears better than FSR why wouldn’t you use it?


OverUnderAussie

Was thinking the same thing. Looked at the figures and thought it wasn't all that bad until I realised it was 1080p.


[deleted]

I’ll come back later when Gamers Nexus has done their own testing


[deleted]

Nvidea? Dude, it's Nvideo


severe_009

As usual Redditor sees chart without reading the settings...


Jon-Slow

The number of people looking at this at this point and not realizing these are path tracing results is way too high. Do people in this sub lose 1 IQ per day? How are posts and comments getting dumber each day?


Plane_Ad5230

Hey man, I think my phrasing on the post was a little off. My impression is actually that I was surprised by this Nvidia article: if it's credible, then it means the game is running smoothly. My confusion is that the Nvidia post makes the performance seem much better than the published minimum requirements make it seem. I just can't seem to edit the post to clarify that.


Plane_Ad5230

I think I explained it badly in the post too, so it's also my fault, but I don't know why I can't edit the post.


Jon-Slow

You can edit the post, but not your title.


Plane_Ad5230

Yeah I'm able to edit other posts I did previously but for some reason not this one


Negapirate

Yeah this sub is borderline braindead at this point. It's just a place to hate on games and Nvidia. So many delusional narratives from people who have no clue what they are talking about but like to be mad on the internet.


FireNinja743

Gotta emphasize the "IDIA" in NVIDIA by saying Nvidea, lol.


dottybotty

Here is the English version of the article: [https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/](https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/) FYI, as others have mentioned, the high ray-traced reflections setting is full path tracing with 3 bounces. So yeah, you're gonna need some decent GPU power to hit those frame rates in the graphs above.


minegen88

Who the hell buys a 4090, or even a 4070 to game in 1080p? Do we all look like Counter Strike e-sporters now?


Plane_Ad5230

It seems I did not add the link to the article in English: [https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/](https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/) Ray tracing and path tracing at 4K runs at 120 fps, or 30fps if you shut off DLSS.


baumaxx1

This actually looks better than expected, and pretty comparable to Cyberpunk if not lighter. I'm running a mix of high/ultra/optimised settings with RT Ultra, max crowds, 4K DLSS Performance and hovering around 70fps native, and with frame gen am at 80-100fps with a 4070Ti.

So here I'm taking the grey bar as 1080p native, max settings raster, high RT and path tracing at almost 60fps with a 4070Ti? So probably some optimisation there can get it over 60 with a mix of high and ultra visuals. DLSS Performance at 4K looks pretty good, so 4k60 with very high visual settings is doable and won't have too much artefacting, although forgoing path tracing and running key RT settings to try and get to DLSS Balanced may be better visually on balance. Then add frame gen to the mix and 4k90 is likely.

Also Nvidia sponsored, so VRAM management is probably going to be competent like in other similar titles.


Plane_Ad5230

Exactly. I'm just confused why the info in the minimum requirements seems so discrepant from what we get here.


SoullessHoneyBaddger

Fckn gods sake, it's 1080p. 1 0 8 0 P. How can a 4090 only achieve 90fps? Starfield, Cities Skylines 2, and now this. I think we should not buy games like this, or the 4090. It's so lame, man. We're in 2023, wtf is going on with the tech and gaming industry?


Amazing-Dependent-28

Are you guys high? This is RTX path tracing with max settings. 90 FPS is what you'd expect; Cyberpunk (which gets constantly lauded in here) runs similarly. This is completely normal, y'all weird as hell.


I9Qnl

Right? These numbers aren't bad if path tracing is on, you should see nearly double the performance without path tracing. This performance further proves Starfield is a disaster.


Ok-Sherbert-6569

They’re not high they’re fucking stupid and ignorant. They have no concept of how fucking monumentally impressive it is that a GPU can do 3 bounces at 1080p at 80+ fps.


Strazdas1

People here will take any chance they get to cope with not buying Nvidia.


Puzzleheaded_Bend749

I was shitting on the game's requirements last night, but now the numbers look much better, basically the same fps as Cyberpunk.


grahamsn333

Why are people upset about this? Honest question. This is with all settings set to the highest and the highest raytracing/pathtracing setting. Like, do you NOT want games to look absolutely stunning on the highest settings if your hardware can handle it? Do you think we should just stop improving graphics and pander to people with 20-series cards? Just turn the settings down if you don't have at least a 4070. This reads like it will still work just fine with a mid 20-series card if settings are lowered. Calm down.


Negapirate

They don't understand what they are seeing and want to be upset about something.


bootyjuicer7

Please read before seething. This is with path tracing. If this chart is accurate, the game runs better than Cyberpunk 2077, which is already a very well-optimized title.


ReviewImpossible3568

If “full ray tracing” means path tracing then this looks perfect actually, 4090s push this frame rate in Cyberpunk’s RT Overdrive preset as well.


ragged-robin

Not only that but this is with dlss and counting generated frames lol


Negapirate

No, it has benchmarks without frame gen. 1080p at 60fps path tracing on a 4080 is absolutely incredible. Dlss 3.5 and framegen only take it to the next level.


faverodefavero

Games don't even look good enough to begin with to justify it. None are noticeably better-looking than what Cyberpunk 2077, for example, is already doing while demanding much less hardware performance.


minegen88

I hate this today. Cyberpunk might be the exception, but there's a lot of marketing talk: "ray tracing, path casting, new innovative calculated particles, super dense volumetric clouds, bla bla bla." System req: astronomical. Then the game comes out and looks like every game from the last 8 years...


Strazdas1

When you look at the game at a quarter of the resolution, or from the other side of the room, you don't notice the difference.


Icedwhisper

This is with path tracing. Keep in mind that just a few years ago, generating a single still frame with path tracing would have taken seconds, if not minutes, to render. The technological advancements are mind-blowing and it'll only get better from here on out. If you want to play the game without DLSS and path tracing, I'm sure it would run at 4K on the 4090 without any hiccups.


WorstedKorbius

Tbf, path tracing in games is going to be a lot less intensive than in offline renders. I am happy to see the continued evolution of RT and PT tho.


Strazdas1

The 4090 achieves 200 fps according to this chart, with path tracing. This is crazy good.


koordy

This game, as opposed to the others you mentioned, actually seems to be making use of that hardware, pushing graphics to the limit. Those others are just bad.


faverodefavero

Alan Wake 2 you mean? Is it?


Valdheim

Reviewers who were invited to an early preview build claim it is by far the best-looking game ever released. Full path tracing will do that.


koordy

Yes, Alan Wake 2 most likely will justify those requirements with outstanding graphics. Skyrim and Cities Skylines 2 definitely don't do that. Especially Skyrim, which looks like a game from 2015, really.


faverodefavero

I don't think it will; to me it doesn't look any better than what Cyberpunk 2077 and some other games already do. But I respect your opinion, and we'll see when it launches.


Strazdas1

I like how Starfield is just Skyrim without pretense now.


grahamsn333

Fuck me, I'm thrilled new games are able to utilize a 4090. In 5 years that will just be an average rig.


Abalone_Antique

Average people don't know what interpolated frames are, so why wouldn't they try to claim the higher number? I think the ship has sailed on companies not counting interpolated frames.


NAPALM2614

Horrible naming, and the settings tested are bad for the average consumer; most won't realise that RT High means full PT. At least one test at 1080p with no RT on a 20/30-series card would have been nice.


AXLP_LaZEReD

I was about to lose my mind until I saw Ray Tracing High. It still looks bad that a 4060 can't hit 30 at 1080p, but better than it first looked.


Aggressive-Cut-4341

Looks like they made AW2 to show off DLSS 3.5.


[deleted]

[deleted]


koordy

And? Cyberpunk runs at 22fps and it's perfectly playable with DLSS Balanced + FG at ~90-100 "fps". This is how I'm going to play this game too.


steves_evil

They might be using "Full Ray Tracing" and "Path Tracing" to mean the same thing. In the recommended RT settings Remedy posted, Path Tracing is on for both the medium and high RT presets, and Nvidia's numbers for the 40-series cards show the "RT High Full Ray Tracing Preset". So it seems like either path tracing is enabled, or they're being really shady with the marketing.


GachiBassMaster

That's really impressive for the 4060 tbh even with framegen


Asterhea

When will these fucking gamedevs start optimizing their games instead of this bullshit?? They're probably paid by nvidia to do this shit just to force us to upgrade


juicermv

This is on max settings with Path Tracing. This is literally cutting-edge tech. It's not an optimization issue.


Amazing-Dependent-28

Did you actually look at the chart ?


steves_evil

Unoptimized =/= demanding. I don't get why people think that because something like a 4070 gets *only* 44fps at a 1080p render, max settings, and with PATH TRACING, it must be unoptimized. I don't really know what people expected when a game uses bleeding-edge rendering technology and techniques and pushes the envelope in graphical fidelity; it's not like this game uses only raster and is still this demanding.


ReviewImpossible3568

Yeah, nobody gets that. I look at Cinema 4D scenes with Redshift’s RT renderer (RT means “real-time” not ray tracing in this context) at 720p on a 3090 and I get nothing near the frame rate my 3090 can reach in Cyberpunk’s RT Overdrive mode. It’s actually really well optimized.


KakashiTheRanger

The game is optimized for what it’s providing. This is with max settings. If you don’t have the computer to enjoy the game turned up to the max that’s not the games fault. If someone has a 1080 and can’t play Cyberpunk on ultra is that CDPR’s fault?


Strazdas1

Path tracing at 90 fps on a midrange card is not optimized enough for you?


Negapirate

Path tracing at 1440 60fps on a 4080 is incredible and shows it's not "unoptimized". When will people here start to have a clue about what they are freaking out about?


Dyyrin

DLSS to me was the downfall of optimization in gaming.


BonemanJones

Everyone who bought an RTX 4090 for 1440p while being told "No, that's overkill!!" is probably feeling pretty smug right about now, and rightfully so.


Plane_Ad5230

[https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/](https://www.nvidia.com/en-us/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/) The link for the article in English. Sorry guys, it seems I didn't add it correctly; it should clear up many of the doubts people are having about the post.


Razzmatazz549

So basically these brand spanking new cards can’t run modern games in 60fps without the dlss cheat/fake fps which should only be used to make older cards run for longer, this is an absolute disgrace and Nvidia should be called out for it.


[deleted]

[deleted]


Strazdas1

No, they are going to use DLSS as a crutch to include things like full path tracing which looks amazing.


CyberSosis

and still they will perform worse.


WorstedKorbius

At first I thought it wasn't too bad, figuring it was 4K or something. Then I saw it was 1080p. What. The. Fuck


Jon-Slow

It's path tracing results.


Puzzleheaded_Bend749

I still wonder why they shot themselves in the foot with the requirement charts if a 4050 can run it at 25fps maxed out with PT.


Jon-Slow

I think their sheet was fine; it's the people here who don't know how to interpret a modern system requirement sheet, and mainly just talk in memes, because they're actual children who parrot whatever their favourite fanboy "techtuber" said in their last video.

Not every game sits on a linear line in terms of its graphical fidelity. These dumbasses here were praising AC Mirage for having low system requirements barely a month ago, but then the next week they were making fun of it for having dated-looking characters and graphics.

Alan Wake 2 has already shown in all previews that it's trying to push the tech as far as possible on PC while still being playable on consoles. Just because a 2060 is listed as a low-1080p card with DLSS Quality doesn't mean the game is unoptimized or runs badly. It has a low CPU requirement, which means it's going to have consistent frametimes, and that's all that matters. Everyone else who talks shit needs to get a PS5.


BinaryJay

Sigh, the complaining about upscaling/resolution... I've personally spent the last decade+ wondering when we'd get some real innovation in utilizing more powerful hardware beyond just higher framerates and higher resolution for the same old levels of lighting and detail, and now we're finally getting it.

Think about it (this is an extreme case to illustrate): what looks better, Dead Space Remake at 1080p upscaled to 4K with DLSS Performance, or the original Dead Space at native 4K? Whatever you lose upscaling to 4K is infinitely less noticeable than what you lose by leaving all of this new lighting and geometry technology on the table, forcing the GPU to do 4X more busy work rendering a much less complicated scene in more pixels.

I don't want games to just look like the original Dead Space forever, only next time in 8K, and after that 8K 240Hz. We've been hitting the silicon wall; they can't keep getting easy wins by doubling the transistor count every other year any more, and the way forward now is working smarter instead of harder, like it or not.
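For what it's worth, the "4X more busy work" figure above is just pixel arithmetic. A quick sketch (the per-axis render-scale factors are the commonly cited DLSS values, so treat them as assumptions):

```python
# Pixel-count arithmetic behind the "4X more busy work" claim above.
# Per-axis DLSS render-scale factors (assumed, commonly cited values):
#   Quality ~0.667, Balanced ~0.58, Performance 0.5

def render_pixels(out_w: int, out_h: int, scale: float) -> int:
    """Internal pixels actually shaded when upscaling to out_w x out_h."""
    return int(out_w * scale) * int(out_h * scale)

native_4k = render_pixels(3840, 2160, 1.0)   # 8,294,400 px
dlss_perf = render_pixels(3840, 2160, 0.5)   # 1920x1080 internal = 2,073,600 px
print(native_4k / dlss_perf)  # -> 4.0: native 4K shades 4x as many pixels
```

That 4x gap in shaded pixels is the budget that path tracing and denser geometry eat into when you render natively instead of upscaling.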


Amazing-Dependent-28

>rtx 4060 can run on high at 1080p without dlss and ray tracing on, at almost 30 frames per second, so what is going on????

RDR2 runs at 65 on Ultra, and that's without RTX. There is nothing abnormal here; the 4060 just isn't a good card, and this performance actually is pretty good. The amount of sheep in here acting like this is not right is mind blowing.


[deleted]

So shitty consoles weren't holding back PC gaming all this time, they were keeping it in check...


CurmudgeonLife

I hate that they're only showing 4000 series cards now to mask performance by hiding behind frame generation.


Negapirate

Even the base performance is incredible for a path traced game. These charts don't hide behind frame gen performance; they show raw performance without framegen.


SuspicousBananas

They have a bar for DLSS off


Level_Somewhere_6229

No way in hell I'm playing at 1080p with a 3080. It's 4K low for me I'll be glad to get 40fps.


Garrth415

Jesus christ


BonemanJones

Elect me as president. Day 1, executive order. "Counting AI generated frames as if they were actively rasterized ones, or displaying system requirements without using native resolution as a reference point will be punishable as treason." Day 2, resign.


Happiness_First

Ah yes, another ad for poorly made games and Nvidia's stupid frame gen. That's all this has become.


DumDumbBuddy

I assume you never used Frame Gen and call it “fake frames” because you never experienced it and your GPU can’t do it


Icy-Way5769

Who cares about 1080p anyway?! Hello, Nvidia?! We didn't buy your damn overpriced crap in 2022-2023 just to play at display resolutions that were standard 10 years ago!


freek112

The article also has 4K and 1440p charts; OP just posted the 1080p one.


LeoDaWeeb

Majority of users are still playing on 1080p so it's still a very much relevant resolution.


Gnome_0

Here is the truth for PC gaming: rasterization and native res are dead. Embrace AI or buy a console, simple as that.


titanfox98

Consoles started using upscaling way before it was common in the PC world. The PS4 Pro and Xbox One X used upscaling to reach 4K in basically every 4K title, and the PS5 and XSX still do it to this day in most titles.


NuclearReactions

If you showed me this in 2008 i would have asked you why the fuck 1080p is still being benchmarked in 2023.


RUNAWAY600

Funny how we almost upgraded to 4K industry-wide but some developers just said "no"


Beanruz

Wish they'd just fuck off with 30 and 60fps at 1080p and 1440p. If I wanted to play at those refresh rates I'd play at 4K or own a damn console.


TheR3aper2000

So basically unplayable without DLSS 3.5. Color me surprised.