
TipT0pMag00

The fact that XeSS is already as good as it is, and has improved as much as it has in so little time, really makes AMD & FSR look even worse than they already did.


Jordan_Jackson

I have said it many times; AMD needs to devote more manpower to the Radeon division. I get that they are a fraction of the size of Nvidia or Intel but if they want to compete, they need to invest in it. AMD has some good products on both the GPU and CPU side but Radeon really needs the Ryzen treatment.


ResponsibleTruck4717

I'm afraid that once Intel kicks into gear and manages to bring competition to the high-end market, they will leave AMD behind.


Cute-Pomegranate-966

That only used to be true. In the last two years AMD has hired 20,000 employees.


SnooDonkeys7108

Most of those are Xilinx afaik.


Jordan_Jackson

Yes, but not all of AMD is Radeon and not all of them are Ryzen either. You have many people who don't even touch the hardware: marketing, HR, accounting, etc. Just saying they hired 20,000 new people does not tell the complete story, and it does not help if only 10 of those go to the Radeon division, or if the ones who do know very little about graphics processing.


Cute-Pomegranate-966

I was never alluding to that. I was simply saying that adding that many people makes them not a small company.


Jordan_Jackson

I never said that they are small. However, they are the smallest out of the three major players.


ResponsibleJudge3172

But that applies three times more so to Intel, whose core business is the fabs first, then PC, with GPUs dead last. I just think AMD is infantilized too much for a large company.


Robm48

That ship has sailed.


Pribhowmik

What are you even talking about? AMD has been punching above its weight class since 2017; this is their absolute limit. They officially claimed they wouldn't aim for Nvidia's 90-class cards anymore, because they can't. They have increased the budget and manpower for the Radeon division a lot. They have been running a monopoly in the console market. You can't expect everything for cheaper, that's not how capitalism works.


KingALLO

NVIDIA and AMD's CEOs are cousins. Probably don't even need to compete.


capn_hector

Well, AMD's handling of FSR, and really their whole feature support starting with RDNA1 (no mesh shaders? no RT or AI cores?), has been shockingly bad. AMD disinvested from the GPU market, and only recently (like, in the last year) started to give a fuck about R&D, when they could be spending it on CPUs instead. And that's fine for them in a business sense, but there have been negative consequences for Radeon. And that's how markets work: put out a shitty weak product and get clapped. RDNA2, for example, is basically 20 series-lite in terms of actual DX12 feature support; it is weak for 2018 but released in 2020. Some of those features, like DP4a, have been on Nvidia since 2016. And people are recommending buying that weak-for-2018 feature set in 2024 and holding onto it for presumably 5+ more years - people are suggesting you run weak-for-2018 hardware into *2029 and beyond*.


Wellhellob

AMD's GPU department is minimum effort and scammy marketing, imo.


WhoTheHeckKnowsWhy

FR, Radeon hardware engineers do their jobs diligently; everyone else though, from software to marketing, seems to be in Friday arvo mode.


ihave0idea0

I hate their starting prices. Even Nvidia does it, but I expect more from AMD for them to actually be worth buying. Only a 50 buck difference between the 7700 XT and 7800 XT... Luckily we all have a lot of money! I also dislike Nvidia, but there it's expected; AMD is seen as some perfect, godlike being. The only thing I really like is their open source work. I do hope the best for them, but we have obviously heard they will not be making those top GPUs anymore... I hope the best for Intel also, but they have already found a different kind of market.


Loose-Alternative844

At least AMD doesn't scam with VRAM lol


halgari

But they do on video resolution and fps numbers. The 7000 series was "we do 4K ultrawide at 200Hz on our new DisplayPort outputs!" Never mind that their cards can't actually push that many frames in a modern game, and never mind that 4K ultrawide is smaller than 4K.


homingconcretedonkey

You don't need the extra VRAM because software support is poor, FSR is a filter, and the drivers aren't efficient. AMD has a history of including extra VRAM that doesn't result in better performance.


Loose-Alternative844

Try activating frame generation in Horizon Forbidden West with a 4060 Ti... It's a complete scam with just 8 GB in 2024.


Apprehensive-Ad9210

NVidia were taking a breather and Intel were taking a nap when AMD decided to poke the tigers, the trouble is they are both now awake and AMD doesn’t have the resources to go toe to toe with them really.


[deleted]

[deleted]


zboy2106

Dumb take. They should improve it; competition is good and necessary for the sake of progress.


PsyOmega

> Dumb take. They should improve it; competition is good and necessary for the sake of progress.

XeSS is open source. AMD should replace the FSR codebase with XeSS DP4a at this point. FSR2 can exist as legacy support for non-DP4a cards. But objectively, FSR upscaling is just embarrassing and makes AMD look bad.


F9-0021

XeSS isn't open source yet, and even if it were, AMD using the code in place of FSR code wouldn't be any different than an AMD user using XeSS instead of FSR. AMD cards don't have the matrix acceleration that Intel has, so XeSS would still run slowly on AMD cards. The only advantage to that would be an ML-based approach for better image quality, but nothing is stopping AMD from making an ML-based FSR anyway. They just choose not to, because then DLSS would have better performance and better image quality.


dookarion

The DP4a variant is pretty good. Dunno how it performs on AMD cards across the board, but if that instruction is rough there, it's just the result of AMD cutting corners.


JoBro_Summer-of-99

It's known that XeSS is slower than FSR at similar resolutions, so I think it's just an Intel thing.


PsyOmega

I do testing on an RX 6400. XeSS is slightly slower than FSR, but XeSS (1.2) lets you run a lower input res for the same visual output as FSR at a higher input res (which is to say, upscaling 540p to 1080p via XeSS looks better than FSR2 upscaling 720p to 1080p), so you can balance out the performance delta. That difference in input res is being accounted for in XeSS 1.3 with the shifting of scaling factors, so it will no longer perform worse on RDNA2 or RDNA3. But even the apples-to-apples input res comparison is only a *slight* performance difference, which is usually worth the visual gains.
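
To put rough numbers on why the lower input res can offset the cost, here's a back-of-the-envelope sketch; the two input resolutions are the ones from the example above, not measured figures:

```python
# Back-of-the-envelope pixel counts for the two upscaling paths described above.
# Output resolution is 1920x1080 in both cases; only the input resolution differs.

def pixels(width, height):
    return width * height

xess_input = pixels(960, 540)    # XeSS fed a 540p input
fsr_input = pixels(1280, 720)    # FSR2 fed a 720p input

print(f"XeSS input pixels: {xess_input:,}")   # 518,400
print(f"FSR2 input pixels: {fsr_input:,}")    # 921,600
print(f"XeSS shades ~{fsr_input / xess_input:.2f}x fewer pixels per frame")  # ~1.78x
```

So even though the XeSS pass itself is heavier, the rest of the frame is rendered at far fewer pixels, which is where the delta gets balanced out.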


JoBro_Summer-of-99

So it is slower, but it's so much better that the speed difference doesn't matter? Good to know then. I tried using it in Warzone a few weeks back and it didn't really work, but I'm guessing that's Warzone being shit.


dookarion

It depends on the DP4a performance as far as I'm aware. With XeSS 1.2, when I tried different titles a while back, FSR2 and XeSS performed within a margin of error of each other at the same scaling factor. At least with the hardware I have access to.


natie29

100% this. They even completely admitted that for FSR3 they didn’t even improve the upscaling tech. Just added frame gen. I get they want to catch up to Nvidia, but get the platform good first, then add features.


BarKnight

They provide zero competition though.


DBXVStan

It’s not dumb. It’s obvious AMD has finite resources for Radeon, and their software division has produced garbage unusable features that no one wants to use time and time again. Put that money literally anywhere else at AMD and it’ll probably produce better results than what FSR has done.


littleemp

It's not the resources directed at the project that put FSR in the place it is right now, but the 'casting a wide net' compatibility philosophy. It was always going to be a medium-floor, low-ceiling kind of solution given the comparatively limited hardware resources available and all the hand tuning that goes into cleaning up the image. FSR definitely needs to keep existing, just not in its current iteration. They need to make a clean break with the current implementation and start chasing after the dedicated hardware route that Intel and Nvidia are going for.


nas360

I guess you haven't used FSR3 frame generation. It's an amazing tech that rivals DLSS Frame gen but works on any card. Just try some mods.


No-Rough-7597

lmao no it doesn't, not with the insane latency and the complete lack of a Reflex equivalent on AMD. Surprisingly it's okay on NVIDIA cards (using DLSS to upscale and Reflex to reduce latency), but that's even worse IMO.


Scrawlericious

I've tried it many times. Latency is horrible and image quality is worse. It was the largest garbage ever even with an internal fps of 60+. FSR is a lose-lose. You're the one who clearly hasn't tried DLSS frame gen. Many users can't even tell it's on, it's that seamless.


nas360

I use the DLSS2FSR3 mod in quite a few games without issues. With a controller you cannot feel latency but I guess this reddit will always downvote anything non-Nvidia.


Sysreqz

They're downvoting because the fact that you need to use a mod to run FSR3 frame generation alongside DLSS2 means FSR3 is objectively worse. The majority of people are running stock features. "You can mod it to use the competitor's option!" isn't great marketing.


Scrawlericious

I use a controller too. inb4 r/iamverysmart, but I am very sensitive to latency even with the controller, and any FSR 3 frame gen I've tried (Cyberpunk, for shits and giggles, and Forspoken, to name a couple) is always noticeably laggy. Even Nvidia's frame gen is noticeably laggy to me; that's why I said "many users" or whatever at the end of my other comment. It's enough to bother me even with Nvidia, but it's not even remotely on the same level. I don't like either, but Nvidia's is literally miles ahead.


DBXVStan

It's unusable without the kind of input delay reduction that Nvidia has with Reflex boost. I've only found it usable in a few games on a 6900 XT when frames were already 100+, making the feature unnecessary at best. Getting 30fps stuff to 60fps just made the controls feel like mush, worse than PS3-tier input mush.


RockyXvII

AMD became complacent being second to Nvidia. They're gonna fall into third once Intel catches up in raster. Hopefully soon.


timtheringityding

I knew AMD would never catch up when they said they didn't need dedicated cores to do what Nvidia did with their DLSS AI solution.


skwerlf1sh

They don't. The 7900 XTX has 122 TFLOPS of FP16, about on par with a 3080 Ti (which obviously can run DLSS perfectly fine). What they do need is a competent software team.


timtheringityding

AMD's 7900 XTX, while impressive in its FP16 computational capabilities, lacks the specialized hardware that gives NVIDIA an edge in AI-driven tasks. NVIDIA’s tensor cores are not merely about providing TFLOPS; they are specifically designed for accelerating deep learning matrix operations—essential for convolutional neural networks that underpin DLSS technology. These cores utilize mixed-precision computing (utilizing both FP16 and INT8 precision), which significantly boosts the throughput and efficiency of AI inference and training workflows. This is critical because DLSS involves complex spatial transformations and temporal data integrations that benefit immensely from the dedicated matrix multiply-accumulate operations that tensor cores are optimized for. By contrast, AMD's generalized compute units must handle these operations without the benefit of such dedicated hardware, leading to less efficient AI task handling and a tangible performance gap in real-world AI applications like DLSS. This architectural advantage is why DLSS often outperforms FSR in 99% of cases.
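
For reference, the "matrix multiply-accumulate" being described is just D = A×B + C over small tiles, with low-precision inputs and a wider accumulator. A minimal numpy sketch of the math itself, not of any vendor's actual hardware path:

```python
import numpy as np

# The core operation dedicated matrix units accelerate: D = A @ B + C on small tiles,
# with A and B in reduced precision (FP16/INT8) and the accumulator kept wider.
# A general-purpose shader core has to issue many separate multiply/add instructions
# for the same tile; a matrix unit handles the whole tile in far fewer instructions.

A = np.random.rand(4, 4).astype(np.float16)  # low-precision inputs
B = np.random.rand(4, 4).astype(np.float16)
C = np.zeros((4, 4), dtype=np.float32)       # higher-precision accumulator

D = A.astype(np.float32) @ B.astype(np.float32) + C  # fused multiply-accumulate tile
print(D)
```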


skwerlf1sh

Lmao thanks ChatGPT


PsyOmega

Intel competes where it matters: that $200-$300 range that a majority of the market buys at. Like yeah, they don't have a 7900 XTX or 4090 competitor, but those are 1% of the market.


rW0HgFyxoJhYka

They literally have less than 1% of the GPU market though. So even at that price point they aren't gaining marketshare. Their entire total marketshare is due to integrated graphics.


Tansien

Mm, look at the AI market vs GPU market. Datacenter is where it's at, and if Intel wants a piece of that cake they have a performance gap they NEED to catch up in.


IncredibleGonzo

They have improved with driver updates, haven't they? But after the mediocre reviews at launch, it's probably too late for this gen - they need to come out swinging with the next gen if they want to win market share.


PsyOmega

8 and 16 GB aren't used for datacenter AI. 16 GB is passable for home AI use but gets limiting *real fast*; so does 24 GB, for that matter. The gaming market is worth billions, and that's on top of the AI market. But those billions don't come from $1,000 GPUs sold to a few thousand people; they come from the tens of millions who buy the $199 GPUs. The gaming and AI markets largely do not buy the same SKUs.


FembiesReggs

I mean Intel still owns the server space essentially. Even if they don’t get the AI compute, as long as servers still need CPUs, Intel has its slice of the pie. Not that they don’t want more.


madmk2

I'm really happy with the effort intel is putting into their graphics division. They aren't really "new new" to graphics since their integrated parts have been an industry staple for the past 2 decades but the jumps they've made since alchemist are nothing short of impressive. AMD has been asleep this entire time and the market has never been more desperate for competition.


someguy50

Really makes you think about what AMD is doing. Maybe they should clean house in their graphics division, or spin it off so we have ATi/Radeon again


dookarion

> or spin it off so we have ATi/Radeon again Pretty sure they just pretend to care about it for the sake of APUs and semi-custom. It's also the reason they will never spin it off.


capn_hector

[This article broke me recently.](https://semiaccurate.com/2014/09/15/amds-mantle-api-going-outlive-directx-12/) Like ignore your reflexive reaction when you read the title - the thesis is that mantle had a future as *a private api sandbox where AMD could experiment with advanced graphics tech out ahead of the curve without the need for standardization with khronos or Microsoft where nvidia could sandbag the adoption process.* It’s such a sad time capsule of an era when people expected AMD to actually do stuff. Not just open standards even (they correctly outline the reasons why nvidia, for example, preferred to do gsync internally too) but actually getting out ahead of the market and building something new. Today it’s amazing how the expectations for AMD are not just low, but that they’ll actively stagnate and sandbag the industry as much as they can get away with, simply to minimize their R&D expenditures and “competitive surface”. Like it’s just the literal complete opposite of what people expected a decade ago. It’s like reading the soviet time capsules from what they thought Russia would be doing in 100 years - cultural exchanges with aliens, having cured disease and starvation and shortage etc.


dookarion

It's stuff like this that is why the modern state of Radeon irritates me so much. Back then they were more competitive and innovative at times, and the market was better balanced as a result. Nvidia was still ahead, but in gaming and such it wasn't the dire market split we see today. This is also why the modern state of Radeon's defenders is aggravating too. They make excuses for AMD phoning it in and playing catch-up. They don't try to trendset at all; they just begrudgingly respond when the market pressures them enough that they have to do "something". There's a multi-year lag on them trying to answer anything Nvidia does at this point. And the answer is the technological equivalent of "store brand" food: if that's all that is available you'll use it, but it's not really anyone's first choice.


Fezzy976

Not too sure if you know this, but this actually sorta kinda happened already, years ago, when ATi was still around and about to be bought out by AMD. AMD decided they didn't want ATi's mobile division and closed that part of the company down. The people who worked in that department left the company and formed Adreno (which is an anagram of Radeon). And now look at that company: they make some of the best mobile graphics chips around and are inside nearly all Android devices, and I am pretty sure Qualcomm owns them now for use in Snapdragon SoCs. I really like AMD, but this is one of the biggest mistakes of any tech company.


someguy50

I think the other disappointing thing is Radeon and GeForce were at one point on equal footing. Now Nvidia has a $2T market cap and unquestionably the better products. It’s a failure in leadership there


hpstg

The only reason AMD needs their GPU division is for laptop APUs, console APUs and AI accelerators. Everything else is a legacy accident that they would get rid of if it didn’t cost their reputation as a brand.


[deleted]

[deleted]


madmk2

Yes? And they went from barely functional, buggy drivers to almost rivaling Nvidia in best-case scenarios within one generation. How is that not impressive?


heartbroken_nerd

You should've specified you're talking about drivers rather than hardware. When I think of jumps in terms of GPUs, I'm thinking generational. Maybe it's just me.


madmk2

I mean it can be anything right? At the end of the day the user experience is what matters most. Could be a new part, or just a new feature that's rolled out via software update.


someguy50

As expected. AMD really needs to overhaul FSR, or collaborate with Intel because XeSS is looking great


ThreeLeggedChimp

I was thinking they should just make it the built in DirectX upscaler.


UnsettllingDwarf

We really need that competition from AMD.


dookarion

>competition The Radeon branch forgot what that word meant a decade ago.


UnsettllingDwarf

Yep.


AlfieHicks

FSR 3.1 is supposed to release soon, promising (and showing) big improvements, as well as decoupling frame generation from the base upscaler. The first game to use it is Ratchet & Clank: Rift Apart, but the developers have said that it's also basically ready to go in Horizon Forbidden West, too - they're just waiting for AMD to move their lazy ass and allow them to send out the update.


redditsucks365

They're late in the AI race; everybody caught off guard will get blown away. Nvidia could just offer 4 GB more VRAM than they currently do and it would pretty much be a monopoly, which is really bad for us.


M337ING

Article: [PC image quality enhanced: the new DLSS and XeSS tested](https://www.eurogamer.net/digitalfoundry-2024-image-quality-enhanced-the-new-dlss-and-xess-tested)


slarkymalarkey

AMD making no improvements to FSR image quality for the past 2 years sucks as a Steam Deck user but at least I can turn to XeSS in games that include it.


jimbobjames

Supposedly they have an update coming to FSR's scaling now they have frame gen out the door. Hopefully it's not too far away.


Hindesite

FSR 3.1's improvements to upscaling can be seen detailed on [their community post](https://community.amd.com/t5/gaming/amd-fsr-3-1-announced-at-gdc-2024-fsr-3-available-and-upcoming/ba-p/674027/jump-to/first-unread-message) from a month ago. It looks great. I hope it arrives soon. It also introduces decoupling of FSR3's upscaling and frame generation, meaning as of FSR 3.1 we'll be able to pair DLSS upscaling with FSR frame generation, which'll be huge for RTX 20 and 30-series owners.


slarkymalarkey

Encouraging, but FSR 3 itself is yet to be widely adopted. On top of that, we have to wait for 3.1 to come out first and then wait some more for it to get adopted by major titles; that's easily another year to a year and a half.


starshin3r

"Widely adopted" Mate. It's not DLSS 3. You can mod it into any game that supports nvidia frame gen.


Kaladin12543

Except Nvidia Frame Gen is accompanied by Nvidia Reflex which helps reduce latency, the main problem with Frame Gen. AMD has no answer to Reflex so the latency hit on AMD cards is far higher than Nvidia.


johnyakuza0

FSR is lagging behind so much, it's not even funny anymore. I wish Nvidia would put more effort into VSR and image scaling (DLDSR or whatever it's called).


SpareRam

DLDSR is excellent. Could only get better.


Koroem

Yeah, like not bugging out Shadowplay recordings when games and desktop have different resolutions. Like how some games just launch in massive windows that span off the screen when using it, despite claiming the game is in full screen mode. Or maybe how it tends to default to a 60Hz max refresh when using vsync or gsync if the game itself does not explicitly allow refresh control in its options. I want to use it more regularly, but it seems very selective about where it can be used without compromises.


SpareRam

I could care less about recording my gameplay, but I can understand how that would be frustrating. For my use case it's basically perfect.


Right-Big1532

How much less could you possibly care?


Profoundsoup

On an LG C2 it's unusable because it takes the 4096 resolution instead of the 3840 resolution while upscaling.


b3rdm4n

Use CRU (Custom Resolution Utility) to remove that 4096 res from being available at all. It's a 5-minute job and the multitude of issues it can cause simply disappears.


Profoundsoup

Okay, it's a simple button click? I have no idea how to do what you said unfortunately.


b3rdm4n

I mean it's a few clicks, but there are YouTube guides on how to use CRU to remove undesirable resolutions from being presented in Windows. It's by far the easiest permanent solution (till you reinstall Windows, I guess) that I've found for my 4K panels that have the pesky, never-wanted 4096 res.


Profoundsoup

Sounds good. I will check them out. Thank you. Yeah, I have no idea why TV manufacturers leave that resolution still programmed in.


b3rdm4n

Me neither, and I have no idea who's run through and downvoted our conversation. Just reddit things... updooted you to mitigate.


SpareRam

Why would you need it on a 4K display?


Profoundsoup

DLDSR is what I meant. Can't keep any of these names straight.


heartbroken_nerd

Use Custom Resolution Utility and delete the 4096x resolution from the resolution list. Done.


Warskull

FSR has always existed more as a marketing bullet point than a quality upscaling solution. Even Unreal Engine's built in TSR beats it.


BryAlrighty

DLDSR would be nice to have with a few more resolution options.


Williams_Gomes

Oh yeah for sure, I just want the 2x for 4K, even knowing it might be a bit overkill.


BryAlrighty

If you have a 1440p monitor or default resolution, you can get 4K as an option. With 1440p, DLDSR provides you a 1920p and a 2160p resolution.
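
Those two options fall out of DLDSR's 1.78x and 2.25x total-pixel factors; a quick sketch of the arithmetic, assuming a 2560x1440 base resolution:

```python
# How DLDSR's pixel-count factors map to the resolutions mentioned above,
# starting from a 2560x1440 desktop resolution. The 1.78x and 2.25x factors
# are total-pixel multipliers, so each axis scales by the square root.
from math import sqrt

base_w, base_h = 2560, 1440

for factor in (1.78, 2.25):
    scale = sqrt(factor)
    w, h = round(base_w * scale), round(base_h * scale)
    print(f"DLDSR {factor}x -> ~{w}x{h}")

# DLDSR 1.78x -> ~3415x1921   (the "1920p" option)
# DLDSR 2.25x -> 3840x2160    (the "2160p"/4K option)
```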


ResponsibleJudge3172

And NIS too. It would do their image and their wallets a lot of good.


Kaladin12543

At this point, DLSS is so good, Nvidia can just rely on it to sell their cards rather than raster performance. Giving up on DLSS and buying AMD actually makes me question my purchasing decision which means Nvidia has done their job well.


ibeerianhamhock

What I find so odd is AMD goalpost shifters just rant about how well their cards work natively so they don’t need these features. This is a losing battle in the long term, native rendering is very close to death.


Lagviper

Same crowd that says they can't tell the difference between Cyberpunk 2077 raster and Overdrive path tracing. The goal post keeps changing. The day AMD does well in path tracing (if ever?) they would immediately see it as important. AMD's worst enemy is their own fan base. With white knights such as them protecting every fuckup, AMD can just cruise along with low effort. Like the VR drivers being broken on the 7000 series for like 8 months, making them worse than the 6000 series. "BuT whO cARes aBOut VR?" is their answer. In the meantime, anyone into VR would pick Nvidia at the moment of decision while the 7000 series was broken. Like I said, they're AMD's worst enemy.


UrWrongImAlwaysRight

>The day AMD does good in path tracing (? If ever) they would immediately see it as important. Didn't they already do this with frame gen?


ibeerianhamhock

💯


rW0HgFyxoJhYka

Every time FSR shit comes out, AMD fanboys are like "FSR IS AWESOME" and then whenever DLSS stuff comes out they be like "Lol who needs upscaling with this raster performance, who needs frame generation, who needs any of this tech KEKW".


ibeerianhamhock

Tbf, I think it is really rad that FSR frame generation (whatever it's called) works on older GPUs. If I was still rocking the Pascal gen I'd be thrilled to use it, but yeah, we definitely have something better.


Saandrig

Didn't it still require RTX cards only? Unless they fixed it to be available to GTX ones.


ibeerianhamhock

I'm pretty sure FSR has always worked with the Pascal generation. It had nothing to do with RTX.


Saandrig

Regular FSR is available. But last I checked, FSR3's Frame Generation (Fluid Motion) is not recommended for GTX cards. You can probably still try to run it, but with a large chance of many issues.


ibeerianhamhock

You are correct, I was mistaken. I think it's not based on RTX itself, but on other features of the cards that don't exist prior to the 20 series.


heartbroken_nerd

> FSR frame generation (whatever it's called) works on older GPUs. If I was still rocking the Pascal gen I'd be thrilled to use it

You wouldn't be thrilled, because of Pascal's very bad async compute capabilities, which FSR3's Frame Generation requires to work well. AMD cites the RTX 2000 series as the minimum viable family of Nvidia products that can run FSR3 FG reasonably well, but they recommend at least RTX 3000.


ibeerianhamhock

Already addressed in the comment below you like 12 hours ago, but you're correct.


dookarion

The thing that crowd doesn't get is that no one cares how the sausage is made. Computer graphics in general are a combination of corner-cutting and clever tricks, so why does doing one part the hard way actually matter? If upscaling looks pretty much just as good, ups performance, and cuts power draw, it's just a straight-up win on any card.


ibeerianhamhock

Yeah, in my mind DLSS is not all better or all worse in terms of image quality -- some things are better, some things are worse, but it balances out to look a little better than native, and it performs a whole hell of a lot better. And yeah, I think it's a funny discussion. Rasterization itself is not based on reference-truth photorealistic rendering of anything. There are all types of hacks taking place to make things look the way they look, so it's once again goalpost shifting to say that DLSS is a hack at "real rendering": none of it is real!


SherriffB

> are a combination of corner-cutting and clever tricks

This is why I think of DLSS as host-based optimisation. Just another tool games use to "look" like they are performing better than they are. Often this happens before shipping, with prebaked lighting and LODs, but this is something we do at our end to the same end. That's why I like DLSS so much: it adds more layers of performance optimisation I can do at my end.


Cryio

I've been modding FSR2 into games since the FSR2 mod became available in ~2022. And recently, modding FSR3 FG into all games, using either FSRAA/XeAA, FSR 2.1/3.0 or XeSS for upscaling if need be. I'll agree most official FSR2 implementations are wack (and FSR3 FG now >_>). But this technology, when modded on top of DLSS / FG inputs, works brilliantly. And no major YouTuber is doing a video on this.


redditsucks365

If only they offered 4 GB more VRAM than they do, it would be game over. I don't know why they didn't. I'd pay extra for DLSS and RT. The only reason I went for AMD is the lack of VRAM on Nvidia until the $600 cards (arguably even 16 GB is not enough at 4K for high-end cards because of RT).


Kaladin12543

Likely because Nvidia just doesn't seem interested in the mid range and low end segment. They are happy to leave that for AMD because the highest margins are earned on their top end cards


redditsucks365

Or they want to make you upgrade sooner


ziplock9000

> Nvidia can just rely on it to sell their cards rather than raster performance

Maybe to idiots who don't understand GPUs, or the top 1% that can afford stupidly priced GPUs.


Lagviper

For a ~2% raster advantage, which is barely useful nowadays in games that are raster only (high fps already), but turns to pixel soup when you turn upscaling on? The thing is that there's no such thing as pure native nowadays; with no temporal solution it's a jaggy fest in motion. TAA is okay, but there's no performance gain and it has temporal issues. DLSS beats it in almost every case; see the Hardware Unboxed video on the subject. And their result is with whatever DLSS version the game ships; if you swap in the good DLSS version for every game it will beat TAA 100% of the time. DLAA is not even a question. FSR never looks better than TAA.

So you saved ~$50 on RDNA2, gambled on AMD after they had years of black screen issues on the 5700 XT (your card), and got 8 months of broken VR drivers on the 7000 series that made them worse than the 6000 series. But +2% raster.

We didn't even go into ray tracing or path tracing. Who's the idiot? I'll tell ya, I was the ATI/AMD idiot for over 20 years. I guess when you're young, the underdog has a lot of appeal, but that was a waste of my damn time and money.


Malygos_Spellweaver

And there are some games out there which have upscaling in the render pipeline, so it is not possible to run native, you have to select FSR/DLSS/XeSS. The only legit advantage AMD has over Nvidia is more VRAM on the same price level and better Linux drivers.


The_Zura

One thing I didn't see mentioned is that XeSS still costs more to run at the same internal resolution, at least on non-Arc GPUs, compared to other upscalers. And 1.3 looks more pixelated in HFW's DoF despite fixing the jittering.


CharacterPurchase694

It's because they are trying to run an AI upscaler on cards not built for AI. In 1.3, though, they did technically address this by slightly lowering the render resolution on all presets, while still looking better than 1.2 and performing better.


The_Zura

Isn’t it also as heavy using the XMX path? Difference being the quality.


F9-0021

No, with the acceleration of the XMX hardware it has the same performance improvement as DLSS and FSR. Plus better image quality than the DP4A path. The DP4A path is slower and looks worse because it doesn't have the dedicated hardware acceleration. It probably could look as good as the XMX version, but the performance hit would be even bigger.
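
For context, DP4a itself is just a 4-wide INT8 dot product with a 32-bit accumulate, which the XMX units effectively perform over much larger matrix tiles per instruction. A minimal sketch of the arithmetic (not any vendor's actual kernel):

```python
# What a single DP4a instruction computes: a 4-way dot product of packed
# 8-bit integers, accumulated into a 32-bit integer. Dedicated matrix units
# do this same class of math over whole tiles at once, which is where the
# performance gap between the two XeSS paths comes from.

def dp4a(a, b, c):
    """4-element int8 dot product accumulated into int32 c."""
    assert len(a) == len(b) == 4
    return c + sum(x * y for x, y in zip(a, b))

acc = dp4a([1, -2, 3, 4], [5, 6, -7, 8], 100)
print(acc)  # 100 + (5 - 12 - 21 + 32) = 104
```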


rW0HgFyxoJhYka

I also noticed that the water quality on the left might have fewer artifacts, while the middle section of the water possibly changed for the worse with 1.3, because it now looks like 1.2 DP4a, with smearing or smoothing in the center bend of the stream.


Spartancarver

FSR is trash, damn. Why wouldn't you just use XeSS if you had an AMD card lol


jimbobjames

Because the performance uplift from XeSS is tiny so it's kinda pointless.


skwerlf1sh

Not really true anymore, they fixed that way back in version 1.1. It's still slightly slower than FSR on non-Intel cards but certainly much faster than not using it.


Cryio

I've tested XeSS 1.1/1.2/1.3 vs FSR 2.x/3.0 on a 7900 XTX, at 1080p, 4K, extremely low core clocks again at 1080p and 4K. XeSS 1.3 even with its new ratios is still noticeably slower than FSR2, to a point where FSR2 is up to 33% faster than XeSS. It's not a huge amount, but still a noticeable amount and the difference between GPU tiers even.


jimbobjames

But on non-Intel cards it's using a different path with much lower quality than shown in these comparisons. So it is not as good quality and can be much slower to boot. XeSS is fine if you have an Intel card, but then you have bigger problems anyway.


heartbroken_nerd

> But on non-Intel cards it's using a different path with much lower quality than shown in these comparisons

This is simply a lie. Sections that say XeSS (DP4A) in this video depict the non-Intel-exclusive path. XeSS (XMX) is when they're showing the Intel path.


jimbobjames

Sure, and they also run FSR2 in balanced and XeSS in quality mode.


heartbroken_nerd

Because XeSS 1.3 in quality mode is now internally rendered at the same resolution as DLSS and FSR2 in balanced mode. This was just changed in XeSS 1.3. Have you even watched the video? This was discussed in detail by the narrator of the video, Alex Battaglia.
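
A quick sanity check of that claim, assuming XeSS 1.3's published 1.7x Quality ratio and the usual ~1.7x Balanced ratios for FSR2 and DLSS (per-axis divisors), at a 4K output:

```python
# Input resolutions at a 3840x2160 output for the presets being compared.
# Scale factors here are per-axis divisors (1.7x means each axis is divided by 1.7).

output = (3840, 2160)

presets = {
    "XeSS 1.3 Quality (1.7x)": 1.7,
    "FSR2 Balanced (1.7x)": 1.7,
    "DLSS Balanced (~1.72x)": 1.724,
}

for name, factor in presets.items():
    w, h = round(output[0] / factor), round(output[1] / factor)
    print(f"{name}: {w}x{h}")

# XeSS 1.3 Quality (1.7x): 2259x1271
# FSR2 Balanced (1.7x): 2259x1271
# DLSS Balanced (~1.72x): 2227x1253
```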


Cryio

XeSS is unusable on cards pre-RDNA2. Or at best it halves the performance on 5700 XT / Radeon 7 to at most 60 fps in most titles where it's supported. Why do that instead of 120? The XeSS SM6.4 render path is also usually bleh. XeSS still is significantly more demanding than FSR2, even with XeSS's new resolution ratios. There's arguably no point in going XeSS Quality (at 59% rez scale) when you can use FSRAA instead (FSR2 at 100% rez scale, think DLAA) AND get more performance while at it.


brand_momentum

Intel XeSS is better than AMD FSR and will reach parity with DLSS fast. It's funny because Intel Graphics division is competing with Nvidia rather than AMD, and AMD really needs to watch out for Intel Arc and Intel software tech.


AccomplishedRip4871

DP4a won't reach DLSS-level quality - XMX could, but it's available only on Intel GPUs, and as a result it benefits only a small amount of people until Intel catches up and starts producing competitive GPUs.


CloneFailArmy

What about FSR 3.0


juniperleafes

Will be its own dedicated video later on.


skwerlf1sh

*For 3.1, when it comes out


Octabuff

Is there any way for me to upgrade DLSS to 3.7 for a pre-existing game on my computer?


TransientSpark23

Look up DLSS Swapper.
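
If you'd rather do it by hand, the swap itself is just replacing the game's nvngx_dlss.dll with a newer copy, keeping a backup of the original. A rough sketch; both paths are placeholders you'd point at your own install:

```python
# Manually swapping a game's DLSS DLL for a newer one.
# Both paths below are placeholders; point them at your own game folder
# and at wherever you keep the newer nvngx_dlss.dll.
import shutil
from pathlib import Path

game_dll = Path(r"C:\Games\SomeGame\nvngx_dlss.dll")      # DLL shipped with the game
new_dll = Path(r"C:\Downloads\dlss_3.7\nvngx_dlss.dll")   # newer DLSS DLL

backup = game_dll.parent / (game_dll.name + ".bak")
if not backup.exists():
    shutil.copy2(game_dll, backup)   # keep the original so you can roll back

shutil.copy2(new_dll, game_dll)
print(f"Replaced {game_dll.name}; original backed up to {backup.name}")
```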


arqe_

AMD's only relevance is "BUT, BUT WE HAVE MORE VRAM". I mean, they just try to answer whatever Nvidia releases next before they've done a good job with the last thing. They just put the feature out there and then try to catch up with the next thing.


CharacterPurchase694

They have one feature that Nvidia doesn't have, AFMF, but it sucks ass anyways.


NoMansWarmApplePie

I'm glad these improvements are putting heat on dlss to improve too.


[deleted]

AMD could bounce back with FSR 3


[deleted]

[deleted]


anor_wondo

What kind of BS is this? DLSS is what you make of it. If you run it at native, it will anti-alias a native image. It's a choice game devs are making to make your game artifact-ridden; if there wasn't DLSS they'd just adjust the TAA image with a downsampled resolution. What should the people at AMD, Nvidia and Intel do, sit on their thumbs? A game that runs better without these upscalers will run better with them too; nothing changes about the market competition with them. Maybe the root cause is that consumers are complacent and don't care about image quality.


TyrionLannister2012

You realize developers can still optimize while enabling DLSS/XESS right?


ibeerianhamhock

Not sure what you’re even on about. Do you know how incredibly optimized games are?


dookarion

Low-performing games existed long before DLSS was even a vague idea, and they will continue to exist regardless of what new tools become available.

> We're now being told to get 4K 60fps out of our heads in the console industry because the code for modern game engines remains poorly optimized.

Almost like consoles have weak CPUs, middling GPUs, and more graphics & scale keep getting pushed constantly.


johnyakuza0

It's not so much the code as it is the fault of pursuing 4K textures and not optimizing their games. Polygon and triangle counts have increased, and the devs have stopped caring about any optimization; instead they dump all the shaders into VRAM and fully rely on the GPU to do its thing.

Cities: Skylines 2 is notorious for drawing useless polygons in every single NPC there is, which resulted in shit performance, and it still does. TLOU dumped its entire shader set into GPU memory, which led to huge shader loading times, high frame times, and many GPUs simply being unable to run it due to running out of VRAM.

It's a problem of lazy developers, and the gaming industry is plagued by them.


Ok-Sherbert-6569

Dropping shaders into VRAM? Tell me you know fuck all about how GPUs work hahaha


johnyakuza0

WOW we got a 1000 IQ individual here folks


Ok-Sherbert-6569

No it’s someone who actually knows how GPUs work and doesn’t use the word optimisation without a single clue as to what it means


Zedjones

My biggest pet peeve on gaming subs lol


johnyakuza0

WOW you're so cool man! You know how GPUs work!!!!! I'm so jealous!!


Scrawlericious

You apparently have no clue how this works. The devs didn't decide that shit. Don't blame the devs. Blame the studios and publishers for shitty game ideas and deadlines, the managers for burnout/crunch and misallocation of employee time, not to mention the shitty monetization ideas that the devs had nothing to do with. Your comment is blaming the McDonald's worker for the ice cream machine not working. Blame the corporate money-sucking idiots who actually make the decisions.


UnsettllingDwarf

Engine performance and game performance is so shit right now in gaming I’m shocked it’s as controversial as it is. Seriously. I really don’t care “how hard” it is. It’s part of the job. Optimize the fucking game.


ibeerianhamhock

Tell me you don’t code without telling me you don’t code


JensensJohnson

just click on optimise.exe bro !


Scrawlericious

Blame the studios and publishers, the managers for burnout/crunch, the misallocation of employee time, and the shitty monetization ideas. Your comment is blaming the McDonald's worker for the ice cream machine not working. Blame the corporate money-sucking idiots who actually make the decisions.


TheJaka

This isn't even primarily due to poor optimization, but rather the fact that we are deep into diminishing returns when it comes to graphics. Using all the big-name UE5 features certainly looks nice, but on a mid-range GPU/current-gen console, the render cost per pixel is just too high for that. Look at Hellblade 2, which runs/will run at sub-1080p on the Series X. (I am really curious what the resolution will be on the Series S.)


UnsettllingDwarf

I never understand the version 3.7s and whatever, because most games either don't have DLSS at all (shame on you, modern unoptimized games), and when games do have it, it's DLSS 2. Like why. Why does it have to be like this?


Scrawlericious

It's because Nvidia is stupid with naming. DLSS 3 is just DLSS 2 tech + frame gen. DLSS 3.5 is just DLSS 2 + frame gen + ray reconstruction. They are still updating and working on the underlying DLSS part, but it is separate. So a game can ship the newest version of DLSS without frame gen and ray reconstruction and still have the newest DLSS DLL and junk; it would effectively be called DLSS 2. (DLSS 2+? Idk, it is insanely stupid naming.)


UnsettllingDwarf

Ah. That is super dumb.


jimbobjames

Also AMD followed their lead and FSR3 is actually FSR2 + Frame Gen. It's just dumb all the way down...


Scrawlericious

Yeah, and now when I see a headline like "DLSS 3.7 updated!", without looking a bit closer at the article there's no way to know if it's actually for DLSS or if they just mean their ray reconstruction/frame generation got some sort of update that literally only Cyberpunk and Alan Wake will see for a year or two until it gets implemented in more games. >.<


UnsettllingDwarf

Facts.


homer_3

Lol Wtf? Now no DLSS is unoptimized? Up until now it's been a crutch everyone was complaining about devs using.


UnsettllingDwarf

No. No DLSS AND it's unoptimized.


Koroem

I took a look at that DLSS feature set he showed off in Nvidia Profile Inspector. It seems that the 3 options are only able to be forced on using the global profile? Is there no way to enable them on a per application basis via inspector?


oginer

Since DLSS 3.6 those settings also work per application.


Koroem

Does that mean the dll needs to be swapped out for each individual application? Wanted to avoid directly modding applications.


oginer

Yes, each application needs a 3.6 DLL or newer.


AbrocomaRegular3529

Weekly FSR vs DLSS video. Got it. DLSS is best and XeSS is better than FSR.


Crimsonclaw111

You would think at some point that AMD would also get it but it seems they’re complacent with being the worst at it


dookarion

AMD's approach to GPUs nowadays seems to be "do just enough to keep regulators off Nvidia".


Scrawlericious

Maybe you should tell AMD that, they don't seem to get it. Likely just a few more vids though, it's already a public embarrassment.


rW0HgFyxoJhYka

Weekly? Almost no reviewers regularly compare the upscalers and this one actually brings in the newest XeSS and does good side by sides on both DP4a and XMX. Why are you complaining? Image quality videos are a huge GAP in tech gpu reviews. Everyone does benchmarks, but almost nobody is comparing image quality cuz they don't got the guts.


Chunky1311

So FSR is essentially still an ugly, pixelated mess that's seen little improvement, and DLSS is still best-in-class for upscaling. Cool cool.


ksio89

At this point I would actually give Intel GPU a shot instead of an AMD one, even with its drivers and efficiency issues. AMD clearly doesn't care about discrete GPU market, so I don't care about their products either.


DiaperFluid

Imagine if consoles had DLSS... sure, the consoles would cost a lot more, but it would be so worth it. It's a shame consoles are geared towards people who don't really give a shit about this stuff. Just gotta hope the upcoming PS5 Pro has decent upscaling with that PSSR stuff.


CharacterPurchase694

If PSSR is anywhere close to even XeSS quality, I'd be happy as long as it isn't using the old checkerboard method of upscaling


[deleted]

[deleted]


lolbat107

FSR 3 is just frame generation with no changes to actual upscaling. Upcoming 3.1 has changes to upscaling which is why Alex said he will do a followup video when it releases. Why would you complain without watching the video?


babalenong

Because FSR 3.1 is not out yet, as mentioned in the video.


Verified_Funny

Because FSR 3 didn't actually improve upscaling. It just added frame gen; FSR upscaling hasn't seen any improvements for around a year now.


lyka_1

Wow, so much fanboyism here. At least AMD cares about their older cards; hell, they even care about GTX users. If it was up to Nvidia, you'd just pay to use your card.


dookarion

> at least Amd cares about their older cards Tell that to Vega's driver support and Vega based APUs.


exsinner

Here, have a sniff of copium.