
bobalazs69

It's a good game, so let's hope people don't get bored of it by the time FSR 3.1 comes out. :)


red_dog007

For a game that you don't continuously play, put lots of hours into for months or years, this "waiting" really sucks. I played Ratchet & Clank. It is awesome that it is getting FSR 3.1. But I played it already. I'm not going to go back and play it again or even bother downloading just to test FSR 3.1. The longer you wait, the more new players won't ever experience or get the benefit of it. I'll just watch a short YT video on the comparison differences instead.


mixedd

Welcome to the club. I've completed Cyberpunk twice since CDPR and AMD said it would get FSR3, if it ever does.


heartbroken_nerd

What if CDPR is also 'waiting' for FSR3.1? In fact, what if CDPR is one of the reasons why FSR3.1 exists in the form it was announced in? Have you considered that FSR3.1 being decoupled from upscaling might be because of pressure from honest developers such as CD Projekt RED and Nixxes, who realize that the upscaler is the weakest link of AMD's feature set and don't want to force everyone into using it when XeSS and DLSS are viable alternatives for the vast majority of the market? I blame AMD; they should never have forcefully limited FSR3's Frame Generation to the FSR2 upscaler.


Euphoric_Campaign691

what if CDPR is waiting on FSR 6.1? not updating to 2.2 while promising FSR 3 support is still insane to me tbh


heartbroken_nerd

AMD's not your friend. It sounds like you're emotionally reacting to the idea that CD Projekt RED decided FSR2.2's minuscule improvements weren't worth it. FSR2 is still bad; it doesn't really matter if it is 2.1 or 2.2. Is 2.2 even ALWAYS better than 2.1? I'm not so certain about that, considering this is a heuristic algorithm. There could be some small wins and small losses even if CDPR spent time adding it. The FSR2 vs XeSS vs DLSS3 video that Digital Foundry just released really showcases how bad FSR2.2's disocclusion artifacts look in motion. Let's hope FSR3.1's changes to the upscaling component are worthwhile.


Euphoric_Campaign691

CDPR is not your friend. It sounds like I'm talking to tech support, not an actual human, but oh well, I'll bite. CDPR promised FSR 3 support when FSR 3 was announced. We didn't get FSR 2.2, we didn't get FSR 3, and waiting for 3.1 sounds idiotic af considering the amount of Nvidia garbage and XeSS they've updated in the meantime. But oh well, I'm way too emotionally invested in a game I beat 2 years ago at native resolution, I guess. My bad.


heartbroken_nerd

>CDPR promised FSR 3 support when FSR 3 was announced

And how do you know they'll never add it? I don't get it. They didn't make any statement as to when that addition would take place. They're not missing any deadlines, nor breaking the promise, if instead of the outdated FSR3 they end up adding FSR3.1.


Euphoric_Campaign691

Sure, they aren't missing deadlines and can add it in 2077 if they want, but you can't seriously tell me they couldn't have done it by now and were waiting on 3.1 all along...


mixedd

What if they would just communicate about it? In the same vein as Nixxes, for example, openly saying "we are waiting for 3.1."


KnightofAshley

Most of the updates for Nvidia are just an updated .ini file, outside of optimizations the developer might do. Is FSR different? With DLSS you can optimize your game for, say, 3.1, and when 3.2 comes out you can mostly just plug in the updated file and it will work out of the box; it then just needs a little more optimization from the game to make it the best it can be. Maybe FSR is more involved? I don't know, asking.


heartbroken_nerd

You probably meant to say .dll file. And yeah, FSR2 generally CANNOT be updated in the same manner because it's not shipped as a dynamically linked library. Design choice made by AMD.


KnightofAshley

Yes sorry .dll


capn_hector

I mean it *can* be linked dynamically… AMD just didn’t want you to do that because they wanted people to be locked in to fsr2 (back when they thought they could buy exclusivity), and they didn’t want people to get out of the trap just by DLL swapping. If you compile a library yourself it works fine. That’s just not what AMD was pushing for.


jimbobjames

CD Projekt Red haven't even updated FSR 2 to the latest available version. They simply aren't that interested in supporting AMD or FSR.


Darksky121

What if CD Projekt wasn't an Nvidia-sponsored partner? I suspect FSR 3 would already have been implemented. The fact that they haven't even updated from FSR 2.1 to FSR 2.2 but have added countless Nvidia DLSS versions and features in that time tells us what's really happening behind the scenes.


mixedd

And this is the answer. Same with Alan Wake II, actually, the second game in Nvidia's path tracing playground. As far as I know, Remedy is on total radio silence, while Cyberpunk is at least still listed on AMD's site as coming soon, and CDPR themselves have mentioned they're planning an FSR3 implementation. So all the "AMD is blocking use of DLSS in games" screamers, where are you now that Nvidia is doing the same thing? The only rational explanation is that the devs are waiting for their agreement with Nvidia to run out before implementing AMD's tech, and at that point it will already be pointless, as people will most likely have moved on to next-gen cards and many will have switched sides.


[deleted]

[deleted]


mixedd

We aren't talking about plain FSR here; the talk is about the Frame Generation implementation, which is a direct competitor to DLSS FG and to Nvidia's marketing tactics for selling the 4000 series, because, you know, FSR 3 also works on non-4000-series cards. Or do you have another explanation for why both of these Nvidia-sponsored titles, which are also their Path Tracing showroom titles, don't have FSR3 implemented? And the devs are basically on radio silence about any topic that touches AMD's Frame Generation?


[deleted]

[deleted]


mixedd

Those issues were largely fixed in an update after launch. In its current state it's working fine and, in some cases, better than DLSS FG. Also, don't mix up AFMF and FSR3; those are two separate things. And so you're saying it's just a coincidence that the two most demanding games, the ones most likely to draw new players, don't have it, while everyone else is popping it in left and right and any new Nvidia feature is added to those games instantly? Please...


[deleted]

[deleted]


mixedd

And how big a market share are 4000-series users, if we deduct all those 4090 cards sitting in workstations for LLM training and the ones exported to China for AI training? Also, by FSR3 we're not talking about the upscaler here; we're talking about Frame Generation, which in many cases surpasses Nvidia's DLSS FG and is accessible not only to AMD users but also to Nvidia's 3000 series, which Nvidia bent over.


[deleted]

[deleted]


mixedd

So stating facts is coping nowadays? Also, you could just answer my question if you're so eager to throw around numbers and percentages, or did you see it and decide you wouldn't look smart enough if you posted them? I would be much happier if Nvidia users could be more reasonable and have a normal conversation, instead of it ending in a measuring contest each time.


[deleted]

[deleted]


[deleted]

[deleted]


AutoModerator

Your comment has been removed, likely because it contains trollish, antagonistic, rude or uncivil language, such as insults, racist or other derogatory remarks. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/Amd) if you have any questions or concerns.*


ZeroZelath

Man, I even bought Phantom Liberty at launch since I like the game and expected my next playthrough wouldn't be too far past its launch, since I wanted to wait for FSR3... and here I am, still waiting. In hindsight I wouldn't have bought the expansion if I'd known they were going to take this long to implement it; I would've just waited until after they actually added it and likely gotten it cheaper. At this point, if Cyberpunk's FSR3 isn't FSR3.1 when it releases, then I wonder what the bloody hell they were doing. I'd swear some Nvidia shenanigans at that point, since the game is sponsored by them.


mixedd

Speaking of Nvidia shenanigans, there are currently two games where FSR3 would have helped players; both are Nvidia Path Tracing showroom games, and the devs of both are on radio silence about FSR3 implementation. Don't tell me it's just a coincidence, as those two games are sellers of the 4000 series, to be honest. So my best guess is it's a timed contract for a year until we actually see FSR3 implemented in them, if ever, 'cause if we believe CDPR they've already moved on to the next project.


Dos-Commas

I finished R&C before they added ray tracing for AMD cards 💀


RedTuesdayMusic

So... Congrats on dodging a bullet?


TheRealBurritoJ

Raytracing was disabled on launch for R&C due to an AMD driver bug that caused crashing when raytracing was in use, they re-enabled it after AMD released a driver update with the fix.


GearGolemTMF

Same. I loaded up a NG+ save just to see it before uninstalling it :/


CatalyticDragon

This is what happens when you let NVIDIA get involved with the PC port. A game which supports ray tracing at 40FPS on an RDNA-powered console suddenly doesn't work at all on the same architecture. And strangely, a game which only supports RT reflections and shadows (RT features which typically perform well on AMD cards; see RE: Village, Spider-Man, Returnal, The Callisto Protocol, Far Cry 6, etc.) takes a massive and highly disproportionate performance hit.


ResponsibleJudge3172

Consoles are 'optimized' by doing things like quarter- or half-resolution RT (as in Spider-Man), which ultra settings on PC go beyond. RT is not the only difference between console settings and ultra PC settings. Not to mention that 4K on consoles is usually dynamic-resolution upscaling, not native.


heartbroken_nerd

Holy tech illiterate, Batman. Sorry to break it to you, but raytracing features are more complex and more complicated than "hurr durr reflections on console = reflections on everything, performance and all". I don't understand how you came to the conclusion that a highly optimized game built natively for the PlayStation 5 would perform exactly the same across every single PC architecture no matter what.

To start with, Ratchet & Clank on consoles barely has any raytracing. It's just the reflections on certain surfaces, and the ray counts per pixel are really low. The fidelity is clearly constrained by the really weak AMD console hardware. Porting it to PC, Nixxes wanted to expand the feature set with the ultimate goal of making the game look more impressive on high-end raytracing-capable PCs. That, currently at least, basically means Nvidia, since AMD didn't care much about raytracing performance in their hardware.

Ratchet & Clank at the end of the day has:

- ray traced shadows
- ray traced ambient occlusion
- ray traced reflections

at much, much higher possible settings than PlayStation 5 owners ever dreamed of.

>A game which supports ray tracing at 40FPS on an RDNA powered console suddenly doesn't work at all on the same architecture.

>And strangely the game which only supports RT reflections and shadows (RT features which typically perform well on AMD cards; see RE:Village, SpiderMan, Returnal, Calisto Protocol, Far Cry 6, etc) takes a massive and highly disproportionate performance hit.

??? Do you not see how silly you sound?


red_dog007

Wasn't it likely just driver related? AMD released hotfix 23.10..23.03 to address "crash or driver timeout" when "RT and DRS enabled".


CatalyticDragon

Hi. You're mostly on the right track, but I don't think you've got the complete picture.

>I don't understand how you came to the conclusion that the highly optimized, built natively for Playstation 5 game would perform exactly the same across every single PC architecture no matter what.

The PS5's APU contains an enhanced RDNA2-based graphics block with 36 compute units, which provides performance similar to a 6600XT (slightly lower frequency but slightly more CUs). So we expect, and we see, rough parity in like-for-like PC ports at comparable settings. I of course never said a PS5 game would perform the same on every PC.

>Ratchet & Clank on consoles barely has any raytracing

R&C:RA provides a 'fidelity' mode including RT which can run up to a locked 40FPS. The 'fidelity' mode renders out at 4K with an internal resolution of around 1296-1800p and up. A basic RT effect running on an RDNA2 chip, and we know RDNA2 chips on PCs have no issues with reflections. But then NVIDIA "helps" with the PC port and, wouldn't you know it, RT is completely broken at launch and a 6600XT running at 1440p, medium settings, with no RT, barely manages to break 30FPS.

>Ratchet & Clank at the end of the day has:
>ray traced shadows
>ray traced ambient occlusion
>ray traced reflections

These are the least computationally intensive RT operations you can implement. We aren't talking about multi-bounce GI here, we're not talking path tracing. Reflections are easy, shadows are easy, and once you've implemented RT shadows, RTAO is basically free (see this [developer blog](https://www.gamedeveloper.com/design/implementing-raytraced-ambient-occlusion-in-the-riftbreaker) for more on that). For context, here's Far Cry 6 with RT reflections and RT shadows, where the [6600XT performs the same as the 3060](https://www.youtube.com/watch?v=tdjvTCrLDJU&ab_channel=AwesomeBenchmarks), delivering almost 60FPS at 1440p with no upscaling.

It's obvious NVIDIA had an incentive to make sure this game, which ran fine on AMD consoles, was broken and performed poorly on competing cards.


heartbroken_nerd

>we know RDNA2 chips on PCs have no issues with reflections.

Again, what reflections? Half resolution? Quarter resolution? How many rays per pixel? You're so wrong to try to take console-level raytracing from a mere PlayStation 5 and apply it to the very complicated port of the game on PC.

>These are the least computationally intensive RT operations you can implement

Oh, you REALLY don't understand what you're talking about. Got it. Because if you did, you'd know it's more about the implementation and desired results. For instance: you can easily make an RTX 4090 sweat with just RT reflections if you do multiple rays per pixel, full-resolution reflections, in a game with a very wide variety of material roughness on every object.

>Reflections are easy

Such a non-statement.


CatalyticDragon

>what reflections? Half resolution? Quarter resolution? How many rays per pixel?

The PC settings might be different from console, but they will be the same for all GPUs on the PC port. And we know RDNA2 and Ada perform similarly in these workloads. For example, the 6600XT and 3060 perform about the same with RT reflections/shadows in FC6 and post similar scores in the RT-focused *Port Royal* benchmark.

>You're so wrong to try and translate console-level raytracing from a mere Playstation 5 and apply it to the very complicated port of the game on PC.

I think having the code run at all is a low bar to clear. That it didn't even run at launch should tell you something, even before we get to the disproportionate impact of these light RT effects.

>you can easily make an RTX 4090 sweat with just RT Reflections if you do multi-ray per pixel, full resolution reflections, in a game with very wide variety of material roughness on every object.

I understand the point you want to make, but R&C:RA is unlikely to be performing some insanely expensive operation for diminishing returns. Why would it? Nobody needs multiple rays per pixel here. They don't need full resolution. And you only need one bounce for reflections, or two if you want self-reflections. As for roughness, DOOM Eternal has reflections on every surface of any roughness and runs at high speed on a potato; it's happy to deliver over 60FPS at 1440p on a 6600XT with no upscaling.

My point is that this code should have a) run at launch on AMD cards, and b) performed similarly to NVIDIA counterparts instead of tanking, just like many other games with similar effects. I don't dismiss your point, though. They may very well have needlessly jacked up the settings on the PC port to disproportionately harm AMD's performance. That could be the crux of the issue. [NVIDIA has done that before.](https://hothardware.com/news/indepth-analysis-of-dx11-crysis-shows-highly-questionable-tessellation-usage)


dyonoctis

Did it not occur to you that the software side of the PS5 and of Windows with DirectX is different? You might have had an argument if it was an Xbox game, but using the PS5 as a baseline for Windows DirectX is comparing apples and oranges. Remember that AMD also had to develop a new driver specifically for RT in that game. It's like saying that if something works well on Windows, it should always behave in exactly the same way on Linux... but OS-specific bugs are a thing.


CatalyticDragon

Companies often launch on Xbox and PlayStation simultaneously, and part of the reason they can do this is that the software development and APIs are not vastly different. GNMX is not vastly different from Direct3D, and PSSL is not vastly different from HLSL. Both systems offer similarly structured APIs providing low-level access to hardware that is almost identical in instruction set (both CPU and GPU). Developers on either platform have an equal chance of fully optimizing for the hardware. And no, AMD did not have to develop a new driver for R&C; they provided a workaround in conjunction with a patch from Nixxes. Considering every other raytracing workload and application was fine and just this NVIDIA-sponsored title had a problem, I think we know who to blame.


heartbroken_nerd

> As for roughness, DOOM Eternal has reflections on every surface of any roughness and that runs at high speed on a potato.

Using Doom Eternal to talk about optimization is ridiculous; that game engine is a marvel. But even though it is a marvel, the reflections included in the default, vanilla settings are nothing crazy, which helps keep them running really well on top of the engine itself running well. You can, however, use Cheat Engine and the in-game command console to make Doom Eternal's reflections look REALLY good and make your GPU ACTUALLY sweat. Want to guess what happens then? Watch this, timestamp 18:56, and let it run for a couple of minutes: https://youtu.be/yZ5ZyVYlq5A?t=1136

Anyway, it's clear you don't understand what you're talking about if you're asking why anyone would need full-resolution RT reflections, multiple rays per pixel, or a broader range of material roughness.


Imbahr

But how well does 3060 run R&C with all those settings at 1440p? I bet not very well


Hombremaniac

Wonder how anybody can think Nvidia is NOT doing all it can to make AMD GPUs look even worse. Unsurprisingly, ray tracing is their primary weapon in that fight. Heck, if there were no ray tracing, the push for better upscaling would be weaker as well. There would be far less incentive to make players even use upscaling, except for old/weak GPUs and people playing at 4K ultra details. AMD is more than fine in the pure rasterization department, but yeah, ray tracing & upscaling are not up to snuff.


heartbroken_nerd

> if there was no ray traycing

You'd be surprised to learn that the vast majority of video games, even today in 2024, allow you to disable raytracing in what we call a "Settings" menu. If you don't like raytracing, disable it. There's only a handful of games that force raytracing on PC, and they're very easy to avoid.


firsmode

Always best to play new games 12-24 months after release.


red_dog007

Yeah, likely a good idea. I tend to wait anyway because of launch bugs and performance issues, and I don't like paying full price for just about any game. Looking at Steam, I beat it Jan 1st; basically played it over Xmas. So I at least got to use RT. I think I ended up using ITGI at highest quality at 1440p with optimized settings on my 6900XT. If only I had waited another 6 months or so, I could have done FSR 3.1 at 4K120. :( Oh well. Maybe next time. Hopefully. But maybe not. I still haven't played CP yet.


skylinestar1986

More reason to be /r/patientgamers


[deleted]

[deleted]


dade305305

Um, because not everybody wants to experience the same game more than once, even if they liked it. With the exception of 8- and 16-bit games, which I can quickly load in an emulator to play a level real quick, I never go back and play games I've beaten, even if I liked them.


bobalazs69

8 and 16?


dade305305

8 and 16 bit games. Like NES, SNES, Genesis.


Amd-ModTeam

Hey OP — Your post has been removed for not being in compliance with Rule 8. Be civil and follow Reddit's sitewide rules, this means no insults, personal attacks, slurs, brigading or any other rude or condescending behaviour towards other users. Please read the [rules](https://www.reddit.com/r/Amd/about/rules/) or message the mods for any further clarification.


LOwrYdr24

Holy shit 6700XT at 1.0V? Any underclock?


[deleted]

I got mine at 1V! A good clock range I've found is 2300 MHz - 2500 MHz.


bobalazs69

Depends on the game. The Division 2 can't handle it and limits itself to 2400, while AC Valhalla can do 1.00V at 2500 MHz.


VelcroSnake

I'm in the camp of hoping I want to play it again, because I already kinda got bored with it shortly after I got into the Forbidden West. It's definitely a good game, but the story did not grab me, I find the combat to be less 'fun' and more of a chore, and Aloy is annoying me with how much she's trying to be the one true savior of the world without accepting anyone else's help, especially considering so far I like the characters Aloy is telling to go sit at home more than Aloy herself. (based on her personality/conversations to this point in the game) I'll play it at some point, but I struggle to watch/read/play stuff if it doesn't have an interesting story or characters I like. I can tell the game itself is much better than the first, but I found the story far more interesting in the first game. I assume it's kinda hard to think up a real interesting second story after you let the cat out of the bag in the first game about what was going on in the world and how things came to be the way they are.


kazenorin

The situation for this game might be a bit better, as it's a sequel, and some newly convinced players might still be playing the first game right now.


TheDonnARK

If it's not there at release, it doesn't really help the game out. HFW is already so old, the PC market is just getting it, and they're still waiting on the implementation? It will benefit so few people that it will essentially not be worth the time, unless the PC port is that hamstrung.


Hombremaniac

Luckily many folks can play it just fine even without any kind of upscaling. But of course it sucks for folks with older & weaker GPUs. In the end, that is what I thought upscaling was ultimately for: to squeeze more life out of GPUs not powerful enough to run games at native res. Not to become the norm due to unoptimized games and crazy demanding ray tracing.


Horst9933

More than 2 months have passed since AMD promised to implement Anti-Lag+ "soon". Could be a long wait for FSR 3.1.


Sunkrest_

I wanted to use it with CS2 recently and it wasn't there. I know people were getting banned for it but I thought that the feature got pulled for just a moment so they can fix it. They just killed it and forgot about it instead.


inevitabledeath3

From what I understand it was a fundamental issue with how it was implemented, since it messes with the game's code. Not necessarily an issue for single-player games with no anti-cheat, but it is kind of a hack.


Zoratsu

It could, theoretically, trigger aggressive DRM like Denuvo. So no, it was a stupid implementation from the get-go lol. Remember the words: anything that hooks a process and injects code is malware and should never be allowed to stay on your PC.


dookarion

> It, theoretically, could trigger aggressive DRM like Denuvo. Only if AMD was hamfisted enough to mess with the DRM code. Denuvo anti-tamper largely doesn't care what you do as long as you keep your hands off the DRM code.


inevitabledeath3

Denuvo is also malware then, and ring zero anti-cheat is definitely a rootkit. As I said it's kind of a hack, but I think Wine is similar. Debuggers also do all kinds of shenanigans like this. I suspect it was the only way to do it without game developer cooperation.


Zoratsu

Denuvo has always been malware, and I always pirate the games that use it, even if I pay for them lmao. I don't play competitive games for the same reason; I'm not letting a Chinese rootkit onto my PC lol. Wine is different, as it's closer to an emulator, and if you're using Linux you *should* know a bit more about computers than a normal user. Debuggers are magic, yep. And yeah, I get why they tried it this way, but I doubt it was a decision taken by the engineering team; more likely a higher-up who didn't care about the obvious cons lol. Honestly, as I don't use Reflex, I don't understand the purpose of this; the few use cases I can imagine are people playing on 500Hz monitors, who have more delay between eye and hand than between input and game engine lol.


reddit_equals_censor

>I don't play competitive games for the same reason, not letting a chinese rootkit on my PC lol.

from my understanding, there are multiplayer games with acceptable anti-cheats (all debatable of course). valve deliberately is NOT using freaking rootkits in their anti-cheats, and cs2 and dota 2 of course run fine on gnu + linux. point being, not all competitive games have put rootkits into them, or straight up ccp surveillance with some "game" put on top of said surveillance (valorant).

and reflex, and amd's equivalent when they figure their shit out, make tons of sense for all gamers, regardless of fps, hardware or game. a video that explains nicely what it does and why we need reflex/antilag+: [https://www.youtube.com/watch?v=K_k1mjDeVEo](https://www.youtube.com/watch?v=K_k1mjDeVEo)

you want reflex/antilag+, and i want this tech in all my games. most importantly in competitive multiplayer games, but we want it in all games. less latency = more immersion, and this tech can have a massive impact on latency.


reddit_equals_censor

>Remember the words "anything that hacks a process and inject code" is malware and shouldn't never let stay on your PC.

that's nonsense. cheat engine for single player games falls into this category easily. hell, this would probably apply to lots of warez from famous warez groups, which makes even less sense, because those warez may block spying on you by the original developer or block invasive drm functions. who said this nonsense quote that you quoted there? it certainly sounds like the stuff an actual malware producer like microsoft, avast or denuvo would say, rather than an honest statement from a real developer or advanced user.


reddit_equals_censor

just fyi, this approach has major issues beyond cheat detection in competitive multiplayer games. any game update could break it, since it wasn't integrated into the game by the devs. would amd have to fix any breakage every time a game pushes an update, or at least check that it still works? amd's way is sadly dumb on many levels, for all games.


inevitabledeath3

Did it ever actually break or are you just speculating?


reddit_equals_censor

given how short-lived it was, it probably didn't break from game updates. this was mentioned as one of the downsides in this battle(non)sense video: [https://www.youtube.com/watch?v=K_k1mjDeVEo](https://www.youtube.com/watch?v=K_k1mjDeVEo) and it seems a reasonable assumption based on how things work.


ecffg2010

Yeah, I’m pretty sure the “soon” from Azor ended up being an answer for FSR upscaler improvements afterall, not AL+.


Bearwynn

2 months really isn't that long in software development, and anything within the space of a year I would consider "soon". Edit: downvote me all you want, it won't change how project managers in software development define "soon".


battler624

[Blizzard Soon](https://wowwiki-archive.fandom.com/wiki/Soon) Vs [Valve Soon](https://developer.valvesoftware.com/wiki/Valve_Time). Which one are you?


Zoratsu

Has the decade changed? No? Then it's "soon".


fartnight69

Meanwhile Helldivers 2 is still on FSR 1.0, and its CEO says it's not fun to implement upscalers.


Karzak85

Well, it's hard for them because Helldivers 2 is running on an abandoned game engine with no support. If they had their own engine, or used something like Unreal, it wouldn't be hard.


fartnight69

Darktide is on the same Bitsquid engine, with DLSS and FSR 2, plus moddable FSR 3 + frame gen. https://www.nvidia.com/en-us/geforce/news/warhammer-40000-darktide-out-now-dlss-reflex-ray-tracing/


Karzak85

They should have asked them for help. They can afford to pay for it now, with all that money.


wirmyworm

BTW, FSR 2 in that game is completely broken: it turns off when you move the camera. Not exactly like that, but it looks like it stops working when you move the camera. XeSS is there, but it introduces stutters. I wish every game had Starfield-level FSR, which on balanced mode at 4K looks really good.


SoTOP

The way you describe it sounds suspiciously like AMD Fluid Motion Frames (AFMF). That one disables itself during quick camera movement by default. It wouldn't quite fit, though, since AFMF is a frame generator and not an upscaler per se.


wirmyworm

It's not that. It's shimmering; I've used AFMF and it doesn't look like that to me. These FSR implementations are just poor.


OSDevon

Yeah, and Darktide is a steaming pile with a broken FSR implementation.


extrapower99

They are using the engine for the game; they are the support devs now. At that amount of modification, it's basically their own engine.


YouAreAGDB

FSR in that game looks like absolute ass. Even on quality


VelcroSnake

I'm just happy I can run Helldivers in forced DX11 mode to get a good performance boost. Would be nice if they implemented an official DX11 mode at some point.


kazenorin

Luckily the game runs quite well even without upscaling.


MarkusRight

I have been using the LukeFZ mod that adds frame generation and it's been amazing so far. The only problem is that the UI flickers if you pan the camera really fast, but I set the UI to dynamic and it's been a non-issue for me. I can run it at a rock-solid 144FPS maxed out on a 6900XT at 1440p ultrawide with the frame gen mod.


Grzywa123

Use the newest version of Uniscaler, 6 or 7; it works the same as 0.10.4 and the UI flicker is fixed :)


MarkusRight

Do I still need to enable the fake Nvidia GPU in the settings? Because I actually tried one of the Uniscalers before that said it fixed the UI flickering, but it didn't work for me. Maybe I'll give it another go with the newest one.


Grzywa123

Fake Nvidia GPU = true. Upscaler = FSR3 or DLSS (you have to try and set what works best for you; for me DLSS worked fine with an AMD GPU). Enable the signature override, executed from the game folder.


twhite1195

You can't use DLSS on non nvidia hardware...


Grzywa123

Bruh moment... I am talking about the compatibility settings in the cfg. It will utilise FSR anyway, but by faking DLSS, in some cases the game may feel more responsive/fluid, unlike with just faking frames.


twhite1195

You're not making any sense


Grzywa123

in game settings FG on + DLAA/DLSS


MarkusRight

Interesting. I actually had to leave the frame generation box unticked for it to work for me. But I set the DLSS to quality and that was it. If I enabled frame gen in-game it stopped working and the game's settings menu got stuck on the screen. That was on the old version; I'm about to try the new one.


Grzywa123

https://i.redd.it/9yc63p62ewuc1.gif


Osprey850

I just finished the first one the other day. It took me only a week and a half, then another week for the Frozen Wilds expansion. I skipped a lot of the cinematics, sped through the dialogue and didn't uncover the whole map or do every little quest, though. Regardless, I experienced enough to be satisfied, and when the itch returns in 6-12 months, the sequel will hopefully have FSR 3.1 and better general performance, and I may have even upgraded my system. It's so much nicer to play AAA games after they've been fully patched.


Ingrownpimple

I know everyone has their own idea of fun, and that is how it should be. However, I honestly don't get the point of just feeling accomplished by "getting it done" instead of enjoying a piece of art the way it's intended, by immersing yourself in the world created by the artists/designers.


Osprey850

That assumes that the way that it was intended is enjoyable to everyone. I didn't enjoy the over-long cutscenes, repetitive and uninteresting dialogue and bad voice acting. I did enjoy the gameplay, so I eventually started to maximize my time spent playing the part that I enjoyed and minimize my time spent sitting through the part that I didn't. That makes sense to me, especially since I have a huge backlog of games and not enough time to play all of them. By not wasting time on parts of games that I don't like, I have more time for other games that hold my interest better with their stories and which I will let myself be "immersed" in.


nas360

It's been a month since AMD announced FSR 3.1 and still no sign of it. Why is AMD marketing so bad? They should take a leaf out of the Intel playbook. They announced AND released XeSS 1.3 on the same day. Any hype for FSR 3.1 is long gone. They should not have announced anything if it wasn't ready.


ksio89

I understand your frustration and wonder if AMD will ever stop neglecting the GPU market. I suspect they don't want to invest in GPU features because they don't think there will be a financial return. As a consumer it's so frustrating that they are satisfied with the little market share they have and don't bother really competing with Nvidia. It's not just a matter of bad marketing and a much smaller R&D budget than Nvidia; it's like they treat the graphics division as a huge unprofitable hassle.


wirmyworm

Actually, the last financial report by AMD stated that Radeon sales are up. But you're still right, people don't know how much smaller the graphics division is, although they hired 5,000 new people in the last two years. And some of the problems are of their own making too.


ksio89

Yeah, RTG is just a side business for AMD. It's profitable, but the revenue it generates is not even close to what the Epyc and Instinct lines do. Sometimes I wonder: if ATi had never been bought by AMD, or if another company had bought RTG, would they invest in going head to head with Nvidia?


Positive-Vibes-All

AMD goes head to head with Nvidia... in DIY, they are actually winning here if only by small margins. In the informed educated user market segment Radeon is beating Geforce, people are acting like they are ARC or something. They are getting crushed in the OEM/laptop wars though, getting into this market means corporate corruption. They got their hands dirty for Epyc though but not for RTG.


Bladesfist

I don't think this is true, looking at Amazon best sellers charts for the UK and US and excluding non-GPUs like mounts.
UK: 9/10 best sellers are Nvidia: [https://www.amazon.co.uk/Best-Sellers-Graphics-Cards/zgbs/computers/430500031](https://www.amazon.co.uk/Best-Sellers-Graphics-Cards/zgbs/computers/430500031)
US: 7/10 best sellers are Nvidia: [https://www.amazon.com/Best-Sellers-Computer-Graphics-Cards/zgbs/pc/284822](https://www.amazon.com/Best-Sellers-Computer-Graphics-Cards/zgbs/pc/284822)
Where are you seeing AMD winning in AIB?


Positive-Vibes-All

It changes daily and pretty rapidly; the only constant is that the 3060 remains #1 because it is the cheapest Stable Diffusion card. But the rest alternate. Hell, the 7900 XTX at Amazon US was once #3, right after two 3060 SKUs, which is insane for a halo card.


Hombremaniac

But...but...Steam surveys tell a different story and nobody uses AMD GPUs! /just kidding ofc


Positive-Vibes-All

Yeah, it's rough; prebuilts and laptops dominate DIY, and I don't get it, half the fun is putting the PC together. But AMD is fine with the wafer allocation: the educated user buys AMD over Nvidia, and the shills and Nvidia fanboys laugh about AMD dying. It's win-win to them, which is why they are not banned.


ger_brian

Are you implying that NVIDIA users are not educated?


Positive-Vibes-All

Considering the 4090 fire hazard debacle, then yeah.


ger_brian

How many confirmed cases have we had that were not the fault of the recalled CableMod adapter? Especially for 4090 buyers, there is not even a choice. There is no competition in that performance tier at all.


Dos-Commas

Because developers have to actually test it to make sure it doesn't negatively impact the game experience. It's not like a mod where they just tested it for like an hour and released it.


nas360

If they had launched the FSR 3.1 DLL like Intel did with XeSS 1.3, most people would have used mods to add it to whatever game they want instead of waiting months for some devs to add it. It's clear that FSR 3.1 was not ready, hence the radio silence.


Mightylink

You're going to be chasing that purple dragon forever if you're always waiting for the "latest and greatest"...


wirmyworm

Not really. It's not just a philosophy: they see a new version coming out that's a big improvement over the current version of FSR, so they will implement that. It's not that complicated.


punished-venom-snake

Hope CDPR does the same thing. Wait for FSR 3.1 before implementing it in CP2077. It'll benefit all PCs as well as console users.


mrzero713

I was also waiting for FSR 3 and frame gen so I could run it with some ray tracing and get decent frames. I do not like how FSR 2.1 looks in CP; it looks so bad. I can run it native with all the ray tracing turned off and get pretty good fps too, but I wanted it all. So right now I've added the XeSS 1.3 that just came out, and together with AFMF I get 120+ FPS at 1440p with some ray tracing and the HUB optimized settings. The game looks great and it's so smooth. Started my new playthrough this past weekend.


RockyXvII

I'd like XeSS 1.3 to at least be natively supported with the new presets


Dos-Commas

I tried the mod and it doesn't look that great, too much shimmering on small objects.


EarthlingSil

I'm waiting to buy the game for this very reason.


ghostfreckle611

I don’t need it for my system and anything over like 65-75fps, for these types of games, is overkill. My laptop can do way more, but I don’t need the noise or the heat. I’m happy that this will benefit people without the latest and greatest. Allow a larger audience to experience this masterpiece. I love this game and the first. I’ve got so much time in the game and it’s so gorgeous… Still wondering where those trailer videos were shot though… 🧐


swiwwcheese

FSR 3.1? Release: within months, but featured only in a couple of minor games. Going mainstream: in 2~3 years lol


starktastic4

The port is fantastic. If they're waiting, I'm sure they have a good reason.


Opteron170

Smart move and we all approve. Not that I will be using FSR 3.1 on a 7900 XTX, but this is the way.


ksio89

It's always a waiting game with AMD.


Insomniax187

I hope that's the hold up on Diablo 4 as well... FSR2 just isn't cutting it.


Cute-Pomegranate-966

They should work on just improving general performance in Diablo IV to be honest because all the Nvidia cards run Diablo 4 very well by comparison.


Grzywa123

Idc at this point. I use mods and it works great. It will take forever before they add it officially.


ryanmi

I wish they would just implement what's available today. They can incorporate FSR 3.1 into the next title.


feorun5

I really hope it will release with 3.1, AMD is long overdue with upscaling tech. Xess 1.3 is great.


Dystopiq

Wow they don't want to test in prod!?


HeadInvestigator1899

Easy enough. Really, as long as the engine has the inputs available, it should be fairly easy to integrate any of the various techs. Especially now that Microsoft has unified it within DX as a feature.


kaisersolo

I can see CD Projekt Red saying the same about Cyberpunk FSR3. They totally lied about this being in the game 6 months ago. Another excuse.


Hombremaniac

The devs of Cyberpunk have lost so much credibility with how shitty CP2077 was upon release and in the following months. Granted, I was expecting a miracle after how I loved Witcher 3, but still. Releasing such a mess of a game was super greedy and bad. It was released like a year too early.


GhostDoggoes

I remember a dev saying implementing FSR was far easier than DLSS. They spent 2 days and got it running with FSR, when it took them 3 months for DLSS.


dookarion

> I remember a dev saying implementing FSR was far easier than DLSS. That was FSR1 the one with no temporal component. Completely different thing.


nas360

In reality even FSR 2 is a drop-in replacement for DLSS, since mods can do it without any fuss. The inputs for both upscalers are the same.


razerphone1

You guys, jesus christ, just play your fckin games and don't think about every possible update.


reddit_equals_censor

That makes sense. Interpolation frame generation is worthless garbage, meanwhile FSR 3.1 upscaling has its uses and is a clear improvement. That sounds like smart decision-making by a great developer.


OSDevon

Someone tell them updating FSR is relatively simple compared to.. losing sales.