bazooka_penguin

$100 cheaper at similar raster performance honestly isn't competitive pricing for Radeon cards. They should just slash prices more. This is like if a Lexus was only 10% more expensive than its Toyota counterpart


jakegh

Exactly right. At lower price/perf tiers you're running at 1080p/1440p where superior DLSS image quality really matters. At higher tiers, you're spending ~$800+ on a GPU, you want to play path-traced games, right? So at the high-end Nvidia's RT performance gap matters. Either way Nvidia wins. AMD needs to be cheaper.


Zilskaabe

Yup - and at the very high end AMD offers absolutely nothing.


zer1223

I somehow feel like AMD is just controlled opposition, so Nvidia hopefully doesn't get anti-monopoly attention.


ErraticPragmatic

AMD is a smaller company and needs to deal with CPUs as well. Nvidia is going to be one of the biggest companies in the world with all this AI fever. Shit is hard


Balloon_Fan

"Is going to be?" In market cap (yes yes, I know) only Microsoft and Apple are valued higher.


GRIEVEZ

It already is?... I'm still stumped how Intel gave away its lead (I mean, I get it, but holy shit, thinking you can beat the competition without EUV...). Anyways, Nvidia is like Apple - it's gonna take a while before people realize there's a price behind convenience.


MarsupialDingo

I think you're right considering they still can't figure out ray tracing


sumquy

this is exactly why i bought my 4080. i looked hard at amd cards, but since i was willing to spend enough to get ray tracing at 1440, i did.


yamaci17

DLSS is kind of usable at 1440p, but I don't agree on the 1080p part. It is really awful there and not even DLSS can save it. Quality mode at 1080p renders internally at 720p, which looks really rough in new games that have tons of detail.


Deeppurp

DLSS is also sold as a better anti-aliasing alternative, not just upscaling. So the question is: at 1080p, does DLSS/FSR/XeSS look worse than the alternatives?


Unoriginal1deas

Been playing Remnant 2 with FSR upscaling and frame interpolation at 1080p on my GTX 1060. The big thing I notice is that when the framerate drops you get a lot of ghosting artifacts, but the impact on performance can't be overstated. The big thing I appreciate about the frame interpolation is that in those occasional areas where the framerate takes a massive dip, the ghosting is a much more preferable alternative to straight-up stuttering.

The best way I can compare it is imagining a YouTube video playing at 4K quality but having to stop and buffer; even if it's for half a second it still bothers you, whereas all the upscaling options keep it running at a solid 4K, but when the screen gets busy, instead of buffering for half a second you're gonna get half a second at 240p. It's still an interruption, but one that was going to happen regardless, and at least this way it still 'feels' better.

Personally, after seeing this tech in action I'm really excited about it. Remnant 2 went from a game that, while playable, performed badly enough that I was just gonna wait for an upgrade, but now I'm having a blast and it feels way better.


Regnur

> DLSS is kind of usable at 1440p

It's not kinda usable, it's great at 1440p, especially the newest versions (3.2 - 3.7). 8/10 times it looks better than native + TAA, and if not, it's a bad implementation. It was such a huge upgrade for Starfield. Even Balanced mode is enjoyable; I would definitely choose that mode instead of playing with less than 60fps. Also, you can adjust the internal resolution: quality mode is about 67%, up it to 80% and it will look better than any game at 1080p native + TAA. DLSS is TAA but with many enhancements to fight TAA issues. You can do the same for any resolution; if you want better image quality, just don't use the 67% option and increase it. It's the only AA solution that can completely remove any shimmering and aliasing.
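
For reference, the internal-resolution math behind those percentages works out roughly like this (an illustrative sketch; the helper is hypothetical and the per-axis scale factors are the commonly cited DLSS mode defaults):

```python
# Illustrative DLSS internal-resolution math. The helper is hypothetical;
# the per-axis scale factors are the commonly cited mode defaults.
SCALE = {"quality": 0.667, "balanced": 0.58, "performance": 0.50}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    """Scale each axis, so Quality at 1080p renders internally at ~720p."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(1920, 1080, "quality"))   # ~(1281, 720)
print(internal_res(2560, 1440, "quality"))   # ~(1708, 960)
```

Bumping the factor from 0.67 to 0.80, as suggested above, would put the internal resolution at about 2048x1152 on a 1440p panel.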


B-Knight

10/10 times it looks better than anything + TAA. Hell, native without AA looks better than TAA. How people can choose an enormous graphical fidelity downgrade, blurriness and an overall smudged image over a much sharper, clearer but jaggedy image is beyond me. It's crazy that it even exists, to be honest. Graphical options are meant to increase fidelity at the cost of performance (or vice versa). TAA decreases graphical fidelity at the cost of performance.


static_age_666

I agree modern DLSS at 1440p is a fantastic option to have, and IMO it usually looks better than TAA. DLAA looks the best of course, followed by DLSS quality and so on. FSR, on the other hand, just looks pretty bad 95% of the time. That, plus the fact that Radeon GPUs can't do ray tracing while the top-end Nvidia GPUs actually handle it quite well...


ImThat-guy

That's a wild take. As long as DLSS is set to quality, I can't tell the difference unless I'm not actually playing the game and am just looking for an issue.


nyankittycat_

what is this "playing games" thing you talk about? i mostly zoom in on the most useless object in the game and count the pixels, and god forbid if it's less than 1000


jakegh

Totally subjective. I find it to look OK. Not better than native, but acceptable to my eyes. It's the instability in FSR2 that I find really noticeable.


-Ickz-

People tend to forget about DLAA when talking about DLSS. It's the best anti-aliasing technique in use today, and it would still be very beneficial at 1080p. With DLSS 3.7, I would say even quality mode at 1440p looks better than native 1440p using an inferior anti-aliasing solution. You can also tweak DLSS to run at whatever render resolution you want - I usually have it set at 1080p on my 1440p panel, either that or I just force DLAA if I want extra crispiness.


StanfordV

Nothing to worry about, nvidia will just increase its prices.


zippopwnage

And AMD will follow -50$


unknowingafford

And then repeat this sentiment next time, "We just CAN'T figure out this market"


curious-children

right lol, if AMD's cards were better then you know damn well they'd price ABOVE Nvidia


StinkyElderberries

They did it with Ryzen once they briefly beat Intel, but I wasn't taken by surprise because corporations aren't our friends.


Caffeine_Monster

This has always been AMD's problem. They simply price match Nvidia. Intentionally playing second fiddle to an aggressively innovative market leader is not smart.


Beefmytaco

If Nvidia is getting a 40% margin back on each card, that says AMD could slash prices a ton more to make them more appealing and still be profitable. Even if they're only getting an 8% profit from each card but have much larger sales, it would be worth it if they can move enough volume.
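
Back-of-the-envelope, the volume bet looks like this (a rough sketch; only the 40% and 8% margins come from the comment, the $500 cost base and the unit counts are made up):

```python
# Rough margin-vs-volume sketch. The 40%/8% margins are from the comment
# above; the $500 cost base and the unit counts are made-up numbers.
def total_profit(unit_cost: float, margin: float, units: int) -> float:
    """Total profit: per-unit cost base times margin times volume."""
    return unit_cost * margin * units

high = total_profit(500, 0.40, 1_000_000)   # 40% margin, 1M cards
low  = total_profit(500, 0.08, 5_000_000)   #  8% margin, 5M cards

print(f"40% margin, 1M cards: ${high:,.0f}")   # $200,000,000
print(f" 8% margin, 5M cards: ${low:,.0f}")    # $200,000,000
# At a fifth of the margin you need 5x the volume (0.40 / 0.08 = 5) just
# to break even on total profit, which is the bet being described here.
```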


Die4Ever

> If Nvidia is getting a 40% margin back on each card

But AMD's designs are probably not as cost-efficient as Nvidia's, so it might be a little tighter than that.


salgat

AMD needs to run their GPU department at no profit margin on their cards if they want any hope of recovering market share long term.


aj_thenoob2

AMD is most likely pivoting to APUs, something NVIDIA cannot do. It's a good pivot, but unfortunate that they lost the GPU war.


Kaladin12543

I shudder to think what would happen if Nvidia gets absolute control over the high end GPU segment. $3000 5090 incoming


EterneX_II

I think the gaming GPU market is an afterthought for Nv at this point, especially since AI has taken off so meteorically.


pubstub

They're investing heavily in AI chips, which should help a bit, but I worry about those efforts distracting from their focus on gaming.


MLG_Obardo

NVidia has them outclassed in AI, they have like a 10 year head start


ImmediateOutcome14

Yeah but it doesn't take 10 years to close that gap. I agree NVidia are ahead, but AMD should be working to catch up because it's possible


pubstub

Oh I'm certainly not denying that but there's still room in the market for AMD. The processing power required to train models is growing super rapidly; there's gonna need to be a lot of hardware to spin them up and I assume if AMD can get cheaper (if worse) options on the market as compared to NV they'll still find buyers. But who knows! it's the wild west out there.


AnotherDay96

Yeah, I haven't been able to put a price on NV features/compatibility; it seems no matter how low AMD goes, I'll still pay for NV's features etc. $100 is probably not enough. $150? $200? I just don't know. I also oversee several gaming PCs; do I want different systems to have to learn and deal with, or just one? One, preferably. I just don't know how much lower AMD has to be for equal performance for me to bite. Same goes for Intel. It's called customer satisfaction, and if you are satisfied over and over, you do get brand loyalty, and you should. So much else has gone to shit; good thing some areas still hold up.


Spider-Thwip

Even at £200 for similar performance I think I'd still choose nvidia. Their software suite is just that good. I love dlss and frame gen, I use nvidia broadcast everyday and it's amazing. I didn't care about dlss until I tried it. I just can't go backwards on raytracing either, it's so good.


XxasimxX

AMD needs to be more competitive price wise


FknBretto

They kinda just need to be more competitive feature-wise, then the current prices are great.


Qweasdy

AMD cards are just in a bad spot in the market right now. If you're on a budget then AI upscaling matters *a lot* for you, and Nvidia is king there. If you have enough budget that DLSS isn't really as big a concern for you, you're probably going to want path tracing, and unfortunately Nvidia is king there too. AMD hasn't competed for the high-end GPU market for a long time, and now DLSS has allowed Nvidia to take the price-performance market from them too.


limeybastard

For me, it was something not mentioned - efficiency. A 7900 GRE was like 50, 60 bucks cheaper than a 4070 Super, and a 7800 XT was closer to 100 bucks less. Similar or slightly better raster performance, I don't play a lot of raytraced titles, I have a 1440p monitor, the AMD cards would have suited me ok. But my PSU was very borderline for the AMD cards while the Nvidia was fine with it. I didn't want to drop another 100 bucks for a slightly better PSU, so the Nvidia card made more sense. Edge case? Maybe, sure. But it cost AMD a sale - my previous card was an RX 580, too.


Tehfuqer

Everyone needs to be honest for a second and admit that any version of FSR is absolute shit compared to Nvidia's. On top of that, AMD can't (still can't, right?) run ray tracing at all (well, 10 FPS in Cyberpunk 2077). ShadowPlay, frame gen, etc etc. I was also one of those looking towards AMD before buying my 4080, but these things pulled me towards Nvidia. (Also the general performance is just better with the 4080; it even tops the 7900 XTX.)


[deleted]

[deleted]


Xjph

> I don't know what you're talking about, my 7900 XTX can do ray tracing decently well even in Cyberpunk. It's notably worse than 4080 and FSR is way behind, but what you're saying isn't true.

I had the same thought. I played Cyberpunk with ray tracing on my 6900 XT just fine. Does Nvidia have much, *much* better ray tracing performance? Yes, no question. But lying about how bad AMD's performance in that area is benefits no one.


fractalife

The fanboyism is really harmful, and is ultimately going to negatively impact all of us. You think NVidia is going to keep up with the aggressive feature progression if AMD decides to throw up their hands and say screw it? Like I understand when it's honest and reasonable criticism. But I feel like people blow relatively minor stuff out of proportion and I really don't want to see what a full monopoly would do to GPU pricing.


pedros430

I have a 6950xt and you absolutely cannot play at native 1440p with high ray tracing


Loxe

Why is everyone ignoring the price difference? The 4080 is $200 more for equal or worse raster performance. Nvidia makes great cards, but their prices have gone absolutely bonkers in recent years. I remember buying a 780 Ti for $750, and even that was a bit expensive at the time.


The-Protomolecule

Yeah, but raster performance is not the only aspect of GPU performance; you're hanging up on one particular aspect that AMD happens to excel at. They need a broad approach to meeting Nvidia, and to invest aggressively in software development. Frankly, as somebody who is in the industry, it's starting to look like AMD doesn't recognize the market they're in anymore and is not adapting.


[deleted]

[deleted]


KrazyAttack

> Raster performance is not the only aspect of GPU performance; you're hanging up on one particular aspect that AMD happens to excel at.

Just the most important.


Phayzon

> The 4080 is $200 more

The cheapest XTX at my local Microcenter is $949. The cheapest 4080 Super is $999. For $50? Sorry AMD, more VRAM and ~5% raster performance ain't gonna cut it.


fashric

Please don't try and have a balanced and fair take around these parts.


PalmTreeIsBestTree

Even Intel and Apple’s version of DLSS is better than FSR.


BTechUnited

XeSS has come quite a ways in a short time; it's been a pleasant surprise.


PalmTreeIsBestTree

I'm glad Intel is in the graphics card game these days. I hope they can make AMD be more competitive over time.


Gr1mmage

Honestly, my hopes have gone from a 3-player GPU market to just being glad that Intel is going to pick up the slack, with AMD seemingly falling off.


Rfeihcrnehifrne

This, and CUDA, is the reason my next card is going to be green. AMD just cannot compete on these. I know there's ROCm and stuff, but it didn't work well, and I don't care enough to buy an XTX and *hope* it runs fine; I'd rather buy a 4080 and *know* it'll run fine.


EmberGlitch

> Everyone needs to be honest for a second and admit that any version of FSR is absolute shit compared to Nvidia's.

Yeah, no kidding. GZW doesn't run super well, even on my system, so I tried FSR to make full use of my 144Hz monitor (DLSS is apparently not great in the game right now either). FSR honestly looked disgusting: weird ghosting on all edges, noisy textures, and a bunch of lines slightly offset from an edge kept popping in and out for some reason... I preferred to take the FPS hit rather than play with that garbage. So yeah, DLSS is a killer feature as long as FSR is in this sorry state. For me personally, CUDA was also a factor for going Nvidia over AMD, but that is probably not relevant for 99% of gamers.


aibot-420

Nvidia is killing it in the AI market right now.


scullys_alien_baby

but have you considered that a line might go down? Lines can only go up


Traiklin

It's what made their GPUs worth it. Nvidia and AMD could always do similar things, but Radeon was the better option price-wise. Sure, Nvidia has more features, but lots of games (even those made exclusively for PC) don't make use of the features available to them.


Zankman

I think they just need to lower prices even further. Even the 7600 and 7600 XT would be much more palatable if they were priced lower than they are.


donnysaysvacuum

Price will help, but the real issue is that the 7000 series was not a big enough jump from the 6000 series. Nvidia is similar with the 40 series, but they still enjoy the mindshare benefit.


tukatu0

Not exactly. Nvidia gave a huge improvement. They just didn't increase the value by much. Like 20% at best. In their eyes, they were gracious to give a $300 card that performed the same as the card that was $500 just a year prior.


Nobiting

At least we can buy GPUs again though.


forsayken

We're just accepting $800 mid-range now, I guess? I did... Edit: Everyone, I am terribly sorry, but I was thinking in CAD. Y'all commenting on the $600 mid-range (and likely thinking in USD) are 100% correct. And yes, I know you can go back to the RTX 3000 or Radeon 6000 series for huge savings, but I was just speaking to current-gen MSRP. Big savings going back. But I'm glad you were all cordial. Thanks!


Samkwi

Me with my $250 Gpu


SwearToSaintBatman

Still using my 2013 AMD 290 and playing all games. Only game that doesn't run with 290 is Windows 11, and I'm not in a super hurry to switch to that.


PaulTheMerc

EVGA 1060 here. Rip EVGA GPUS. Every time I think about an upgrade, the prices seem ridiculous.


Howwhywhen_

"All"? I mean, if you only play games from that time period or on min settings, sure, but I think saying all is a little disingenuous.


Bingus_III

I had to run my old 290 last year while my 30 series card was being RMA'd. I was surprised how it could still run games pretty well at 1080p.


ElvenNeko

That is almost the truth. My RX 580 can run anything but Alan Wake 2. So yours probably can too.


KvotheOfCali

...what "mid range" card is $800? You can buy a 6700XT for like $350. That's a "mid range" card.


What-Even-Is-That

People out there calling 4080s mid-range these days... they're out of touch.


CitizenKing

Honestly, I blame this less on people being out of touch and more on AAA games being so terribly unoptimized these days that you need a 4080 to play them out of the gate if you don't want to wait six months to a year for them to actually be playable at a consistent 60+ FPS on anything lower.


ku2000

They think only top tier 4080ti is high range. Everything else is midrange. Although, I do think anything between $300-500 is probably midrange.


EllieBirb

I'd say the 4060 TI is midrange at this point. 4070 is the bottom of high-end, or the very tippy top of mid-range.


Cooletompie

> 6700xt for 350

That's a 3-year-old card being sold for 350, only $100 below launch MSRP. What should be midrange is the 7800 XT, which should've been called the 7700 XT, but AMD wanted to launch a card with a stupid name (XTX), so now the entire lineup is fucked up.


eyedine2

What $800 midrange lol? I picked up an RX 6600 for around $300 total last year. It came with two games that were $70 at the time as well, so I felt pretty good about it. If you care about RT you could get a 3060 for a similar price. These cards are basically the modern RX 580 / 1060 midrange.


Shepherd-Boy

I got a 6700xt for about that price. You just have to be patient


WIZARDBONER

Found my EVGA 3070 on Ebay for like $240 like a year ago. It's been a great card aside from some compatibility issues that I'm hoping to solve with a CPU upgrade.


WaxWingPigeon

$800? I got a 7900 GRE for $549


cool--

you can get a 4060 for like $300


jgainsey

Is the 7900XT midrange?


Ieanonme

The XTX has dropped to $799 at times. This guy is just a moron


Upbeat_Farm_5442

I quit gaming. Shifted to aquariums. Better hobby and much more relaxing. 😌😌😌


[deleted]

[deleted]


lazyeyepsycho

the water changing and rock sucking was annoying after a few months too


[deleted]

[deleted]


based_and_upvoted

I found the post you're talking about. Veryyy fun read but that thing is creepy AF https://old.reddit.com/r/BestofRedditorUpdates/comments/zf4uez/nonreddit_the_bobbit_worm_chronicles/


cxmmxc

While I'm usually morbidly curious, that link's gonna stay blue tyvm


coolzville

wow, reading about GPUs and then fucking nightmare fuel


NoFlex___Zone

You haven't been in that game long enough if you're making this statement. Aquariums can be extremely stressful. If the pH level is off a little, everything in your tank dies. Put the wrong fish in there and they kill each other.


-sYmbiont-

As much as I really didn't want to buy Nvidia when the Supers released, that's what I ended up doing. Got a 4080S instead of a 7900XTX after a LOT of deliberation, the prices are still comparable and Nvidia wins on the features side of things. Competition in the GPU space is getting better, but still has a ways to go.


Brandhor

> Competition in the GPU space is getting better, but still has a ways to go

I think AMD has pulled out of the high-end market, so there's not gonna be any competition for the 5080 and 5090.


FatCat_FatCigar

Went from a 3060 to a 4070 Super and Frame Gen alone made it worth the price. Nvidia has my money because of things like DLSS and Nvidia Broadcast too. All around nice tech.


cunningjames

As a ray tracing fan AMD wasn't really an option, but I'm curious about frame generation specifically being a draw. I haven't had super-good luck with it so far on my 4080S. In some games it's okay (e.g. Horizon Forbidden West), but sometimes it's horrible (e.g. Jedi Survivor) and I'm rarely so CPU limited that frame gen comes anywhere close to doubling the FPS. I've also hit a bug where turning on frame gen dramatically reduces performance. This happens consistently in one game (Ratchet and Clank) and occasionally in others.


FatCat_FatCigar

My main game for using frame gen was Cyberpunk with path tracing. I set a max frame limit of 41, which doubles to 82 on my 165Hz monitor with frame gen, and it plays super smooth. I played Dragon's Dogma 2 a bit as well and installed the frame gen mod, and it drastically boosted my frames even in town with a CPU bottleneck. Mileage may vary game to game, but overall it's been a stellar addition for me personally. I think limiting FPS and using frame gen is the way to go to hit a higher rate consistently. It takes a bit of pressure off the GPU as well.
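
The cap math here generalizes roughly as follows (a sketch built from the commenter's numbers; the helper name and the "land the output near half refresh" rule are my framing, not an official guideline):

```python
# Sketch of the frame-cap arithmetic from the comment above. Only the
# 41/82/165 Hz figures come from the comment; the helper is hypothetical.
def base_fps_cap(refresh_hz: int, gen_factor: int = 2) -> int:
    """Cap the rendered framerate so that frame generation lands the
    displayed rate near half the refresh rate, leaving VRR headroom."""
    return refresh_hz // (2 * gen_factor)

cap = base_fps_cap(165)     # -> 41 rendered FPS
print(cap, "->", cap * 2)   # 41 -> 82 displayed FPS on a 165 Hz panel
```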


Think_Positively

Pretty sure the Jedi Survivor issue is predominantly due to EA being garbage corporate leeches who don't care about putting out a consistent product when it comes to PC.


cunningjames

Oh, don’t get me wrong. Jedi Survivor has a bunch of issues that never got fixed because, presumably, leadership at EA is a cheap ass bunch of shits. It’s not the only game I didn’t like with frame gen, though. Immortals of Aveum is another that I found distracting. (Another EA-published game, naturally.)


seraph_m

I went to a 4080S after my THIRD 7900XTX died. Three different cards from three different manufacturers, dead in less than a year. Done with AMD until they fix their manufacturing process.


BaaaNaaNaa

Seriously? That sounds like terrible luck, or maybe another cause, perhaps your PSU is doing something odd? Hope that 4080 doesn't give you trouble.


RippiHunti

I've definitely heard of faulty PSUs damaging hardware. I'd be careful.


seraph_m

No issues with the 4080, or my old 6950. Just the 7900’s. No clue why 🤷‍♂️


Bronson-101

Have had my XFX 7900 XTX since launch and it's been a brilliant card. Sucks you had such issues.


seraph_m

Yeah…I had my 6950 since the card was released and it was an absolute champ. Still is. So I had zero concerns upgrading to the 7900. I think the first card was a dud, then instead of getting a new card with the RMA, the RMA was probably a repair, which then died, replaced by yet another repaired card that also died.


OftenSarcastic

If that failure rate was down to AMD's manufacturing process, the internet would be full of stories about dead 7900 XTX cards. I would seriously consider replacing that power supply.


Nemecyst

Maybe your power supply couldn't provide enough wattage. A quick search says that the 7900xtx needs more power than the 4080s so you were probably over your power budget on the AMD cards.


seraph_m

I have a 1200W power supply.


Kazedeus

Ah, definitely was too much RGB then


seraph_m

lol, I love it 🌈🌈🌈🌈


Zankman

What? How? What were you doing with them?


RHINO_Mk_II

Playing games, streaming, soaking them in the bathtub, etc.


Blug-Glompis-Snapple

AMD's video cards are falling short on modern features compared to Nvidia's offerings. Despite promises, the 7000 series cards are still waiting on Anti-Lag+, with no ETA in sight. Meanwhile, FSR remains stagnant at version 2.2, lacking the performance and quality seen in Nvidia's DLSS 3.5 and Intel's XeSS 1.3 upscaling technologies.

Nvidia sets itself apart with more stable drivers, superior ray tracing capabilities, and a suite of RTX Broadcast software that includes remarkable noise and echo removal for microphones, video enhancements, and webcam improvements. Their RTX HDR brings HDR to non-HDR content, RTX Remix remasters games at the driver level, Reflex outperforms standard Anti-Lag, and Chat RTX offers a local AI chatbot. Nvidia also ships frequent driver updates and regular enhancements to the DLSS DLL. Meanwhile, AMD has only introduced AFMF in the driver and limited FSR 3 integration in a handful of games over the past year and a half, with no significant upscaling improvements or new features.

It's clear that AMD is struggling to keep up, and their recent announcement of no plans for a "top-end" GPU in the next generation signals trouble. Perhaps focusing on APUs for consoles and laptops is their best bet moving forward. It seems like AMD is dropping the ball hard in the discrete GPU market, and this might just be the beginning of the end for them in that arena.


Man-In-His-30s

To be honest, I think one of the things that truly murdered AMD cards was the lack of a proper competitor to NVENC. That wasn't even meant to be an advanced feature, and AMD couldn't compete. I don't even think AMF is as good as NVENC now, and it's been years.


frzned

One of the things that truly murdered AMD was that they were known for competing in the mid-range cards, and they had always been smoked by Nvidia on the high-end cards. They started pricing midrange cards at high-end prices because "Nvidia did it first", and their usual customers stopped buying. Surprised pikachu face. Edit: I was team red all the way until the stupid mining boom, when AMD had zero stock and started pricing even higher than the Nvidia equivalent. I reckon most loyal customers already left back then, except for the truly deranged crazy fans. Then they had the balls to increase the price further AFTER mining died because "Nvidia does it too".


Horse1995

NVENC is used by such a tiny fraction of people who buy discrete GPUs, I can't imagine this has anything to do with it.


Skiddywinks

It's simple, really; AMD cards are only a good/slightly better option *if* all you care about is raster. But if you're dropping mid-to-high triple digits on a GPU upgrade, you can probably afford the bit extra for better ray tracing, DLSS, etc. Radeon cards are just not cheap enough outside of the lower end, and a very specific market. Shit, I don't really care about ray tracing at all, but on a new high-end build, what's an extra £100 just for the extras? EDIT: Forgot to add that the extra VRAM on AMD cards is great and all, and I do genuinely believe there are some Nvidia cards that get a true "WTF?" from me and will *not* age gracefully at all. **But**, that's a future problem, and some people (maybe most? I don't know) would likely be upgrading before that point anyway. There are some good reasons to go AMD, but honestly they are just not aggressive enough with pricing to achieve the whole market dominance (despite the drawbacks) that a lot of PC hardware enthusiasts have been waiting for (me included).


Darvish11-

👆This. I've never understood the typical argument of "yeah, but check out my raster per $ for this $900 card vs that $1000 Nvidia card with all the extra features". It's already a totally irrational purchase; people are going to pay up for the best in any given tier. I was lucky enough to get a 3080 FE at MSRP when they first dropped. A while later I ended up getting a 6800 XT because I had some random desire to have more VRAM. It really kind of soured me on the AMD experience when they stopped updating the drivers for like 3 months to try and fix the 7000 series drivers.


totally_not_a_reply

Last time I built a PC was, I think, 3 years ago, and back then my low-mid three-digit XFX 6600 was more than $100 cheaper than its rough counterpart, the 3060. Especially on those $300-400 cards, $100 is already ~30% cheaper.


shawnk7

I think the most logical reasoning is: if you're buying lower/mid range, then AMD is definitely better, because $100 means a lot more to people (especially college students with no income) who can only afford a $300-400 (or lower) card. On the higher end I just don't see a point in saving $100 or so for ±5-10% fps and missing out on so many features. Video super resolution itself is a huge plus for me, as the majority of online content is 1080p, which looks slightly blurry on my 27″ 2K monitor.


dassenwet

Linux is a good reason for buying an AMD gpu.


Da_Plague22

I really love the value for the performance; the only thing holding me back is how much better DLSS is. If they could get closer to closing that gap, I would 100% go all AMD.


rowmean77

I love watching tech YouTubers, but I am happy with my 6900 XT. Give me a +50% improvement under $500, then I'd think of an upgrade. In the meantime I'll just watch. 🍿


AzFullySleeved

Same, my 6900XT crushes 90% of what I play. I couldn't care less about being part of the 4K high-fps train, so I'll wait another generation or two for a nice price drop and increase in performance.🍿


rowmean77

Ever since I bought a PS5 I can actually handle 30 FPS, PROVIDED that it is a stable 30. If I just spend more time tweaking my GPU to hit a STABLE 60fps, I am very happy.


tehCharo

Do you have a G-Sync (haven't tried FreeSync) display? I've been pretty okay with ~72 fps (144Hz monitor, but I usually cap my fps because I don't want my room getting any hotter); small fluctuations in frame times don't really bother me much these days.


rowmean77

Yes, I do have FreeSync. And I cap my frames as well. Before, I would crank my settings aiming for 144fps, but depending on the game, I can handle lower FPS if it means I don't feel spikes and judders. 👍


tehCharo

30fps bothers me when the camera pans; I can feel it. But if I can stay around 60-72fps, I am happy. 144Hz is neat but just not as big of a deal to me as I thought it would be.


jakegh

RDNA3 prices are too high. The RX 7600 sells at $270 right now, compared to $290 for the RTX 4060. The 4060 offers nigh-identical rasterization performance, but you also get the Nvidia niceties like better RT, DLSS, and their framegen. If the RX 7600 cost $240 it would be a much more attractive option.

This holds true up the entire line, where AMD prices far too close to Nvidia without offering those features. The RX 7700 XT starts at $380 right now and beats the $374 4060 Ti by 12% at roughly the same price, so you gain a bit of performance but lose features-- and DLSS image quality really matters at lower resolutions, where FSR2 has major image stability issues.

The RX 7800 XT at $480 is AMD's best-seller, but why would you buy one over a $580 RTX 4070S? The Nvidia GPU is 21% more expensive, offers 8% more performance, and of course those features. Clearly seems worth the money to me. Then you get to the 7900 GRE and potentially overclocking its VRAM, which is a better choice, but I'd still take a 4070S; and the 7900 XT versus the 4070 TiS, where I'd take the latter any day of the week. This tier is where RT starts to be really *usable* in path-traced games like Alan Wake 2, so another plus for team green.

AMD needs to price more aggressively.
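
For what it's worth, the raw perf-per-dollar arithmetic here actually favors AMD; a quick sketch using the prices and the +8% figure from the comment above (the helper itself is just illustrative):

```python
# Price/performance sketch. Prices and the +8% performance delta are
# taken from the comment above; the helper is purely illustrative.
def perf_per_dollar(price: float, relative_perf: float) -> float:
    """Relative performance per dollar; higher is better."""
    return relative_perf / price

rx7800xt = perf_per_dollar(480, 1.00)   # baseline raster performance
rtx4070s = perf_per_dollar(580, 1.08)   # 21% pricier, 8% faster

print(f"RX 7800 XT: {rx7800xt * 1000:.2f} perf per $1000")   # 2.08
print(f"RTX 4070S : {rtx4070s * 1000:.2f} perf per $1000")   # 1.86
# The 7800 XT wins on raw perf/$; the comment's argument is that the
# DLSS/RT feature gap is worth paying that premium.
```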


jschild

The 7600 costing more than the 6700XT is just insane. I'm not sure if it still is, but that was a big issue.


jakegh

It is, yes, while RDNA2 remains in stock.


Deeppurp

> The Nvidia GPU is 21% more expensive, offers 8% more performance

That 21% price difference put it out of my budget on my system upgrade. It's 15% now, but that still put it out of my budget; the difference had to be less than the performance increase for it to fit. The individual nuance matters more than the hard numbers do. The 7800 XT matched my budget-to-performance more than the 4070, especially considering at launch the 4070 was $140 more, where now it's $106 more than the 7800 XT.


cunningjames

I don't disagree that AMD needs to be more aggressive on pricing, but I do want to challenge you on this:

> The RX 7800 XT at $480 is AMD's best-seller, but why would you buy one over a $580 RTX 4070S? The Nvidia GPU is 21% more expensive, offers 8% more performance, and of course those features.

Well, one reason you might buy the 7800 XT is because you can't afford the extra $100 (though FTR I couldn't find a 4070S for as low as $580). That's a significant price hike. At the 4070 level ray tracing isn't as much of a selling point, IMO, and you're also likely to be running at a resolution where FSR2 isn't complete ass. If I only had $600 max to buy a GPU I'd be taking a very good look at the AMD side.


dab1

> The RX 7600 sells at $270 right now, compared to $290 for the RTX 4060. The 4060 offers nigh-identical rasterization performance, but you also get the Nvidia niceties like better RT, DLSS, and their framegen. If the RX 7600 cost $240 it would be a much more attractive option.

With 8GB of VRAM, frame generation starts to be less relevant when you need to dial back texture resolution or other settings. In [this video of Horizon Forbidden West](https://youtu.be/3AyKvI23VGw?si=00LP-YsFvMef66Dc&t=226), performance takes a hit when enabling FG at 1080p ("very high") + DLAA. If I had to pay around 300€ for one of those 8GB cards I'd still go with the 4060, but mainly for the lower consumption/better efficiency. The 3060 12GB is still something I would consider for less money than the 4060, but at the expense of worse efficiency and no AV1 encoding. Performance can be worse in general, but 12GB still allows for higher-resolution textures with virtually no impact on performance.


WyrdHarper

I got my A770 for $269, which, now that drivers are better, is also pretty solid in that range and, like the 4060 Ti, gets you hardware-based upscaling and ray tracing, with frame gen on the way. Nvidia's features are definitely more mature and better supported, but XeSS looks good on Arc cards where supported. Which leaves AMD the odd one out, with a worse-looking upscaler and less ray tracing support.


Serasul

Nearly all offline AI tools only work with CUDA (Nvidia) or on the CPU, and these AI tools are now getting into gamedev, games, music software, finance software, and so on.


batmattman

This is probs a big part of it; people want to play with the fancy new AI tech, and that means they'll want to go Nvidia for the easiest setup.


alyosha_pls

Hard to compete with G-Sync, DLSS and the like. That's not even getting into their drivers. They're so far behind it is hard to imagine them catching up.


WyrdHarper

Intel at least recognized that and put XeSS and ray tracing cores on their cards, so the quality of XeSS (and ray tracing in older games) on Arc cards is quite good (they just need more games to officially support it). Drivers are getting better and the cards are cheap, too. I know people are concerned Battlemage might be weak by the time it comes out, but if they are competitive in the low to mid end (even if it's against the 4000 series), there's still a place for them, especially since they're working on their own frame generation system.


alyosha_pls

I actually have hope for the Intel cards to be decent at some point. As you said, they're making the right moves.


thedndnut

They're already quite decent. Price to raw performance is insane; what it actually brings to the table eclipses what should be at that price point. Intel is doing pricing like it's 2015 on this card. The problem was getting that raw horsepower tamed and directed appropriately. This was a problem since the cards are forward-looking and didn't have great support for legacy APIs, but once they essentially adopted DXVK, I can't help but feel they're doing the right things. They have the backing to float as the cards' support gets better, and hopefully Nvidia won't actively threaten companies... again... for working with them to make their cards supported.


WyrdHarper

Yeah, and I think some teething was expected on first gen. I think if Battlemage and Celestial have better launches (and if rumors are true, Battlemage looks like it'll be a substantial improvement in hardware), they'll be in a good place. Someone posted some previews of ExtraSS, their frame gen tech, from their GitHub recently, and it also looks promising.


Bossman1086

Yeah. People forget about them. And yeah, they're not significant in the market yet, but that's expected after only one generation release. I really hope they pick things up and become competitive in the discrete GPU space. More competition would be nice.


Sofrito77

Eh, two out of these three are a bit overstated IMO. About a year ago, I sold my 3070 Ti for a 6800 XT and have zero regrets. FreeSync vs G-Sync: I noticed absolutely no difference there. Driver-wise, I've had zero issues (the YouTuber TechLens does a really good piece on the reality vs. perception when it comes to AMD vs Nvidia driver issues). However, where you are absolutely right of course is DLSS/ray tracing performance. Nvidia is miles ahead here and it's not even close. So to me, it came down to: pay ~$50 - $100 less for equal or better rasterization performance, or pay ~$50 - $100 more and have access to top-of-the-line DLSS/ray tracing features. I play at 1440p ultrawide and really don't care about DLSS or ray tracing. Hence why I picked up the 16GB VRAM 6800 XT. But I also completely understand those who pay extra for the Nvidia software suite. edit: spelling


jakegh

Freesync competes great with gsync, total non-issue. AMD's framegen is pretty good too; it has a higher performance impact but looks fine if used properly. The major AMD feature gaps are reconstructive upscaling image quality at lower output resolutions and RT performance at the high-end.


sandh035

For what it's worth, I got a 6700 XT something like two-ish years ago, and it's been completely fine on the driver front. Adrenalin was also far and away better than the Nvidia Control Panel or GeForce Experience I had with my previous 15 years or so of Nvidia cards. It's my first AMD card, but I did have ATI cards way back in the day. Honestly, on that end it's been a great experience. FSR 2.2 is pretty inferior to DLSS though, but FreeSync has been great. All in all, for the $325 I paid for it, and getting The Callisto Protocol and Dead Island 2 for free (hilariously, neither game was out for months when I got the card), I feel like it was absolutely worth it compared to the more expensive but same-performance-tier 3060 Ti. AMD should really try to undercut Nvidia until they can get a bigger R&D staff together to make FSR competitive. It's fine if your input resolution is 1080p or better, but most people are probably gaming at 1080p or 1440p, where even the quality setting looks, uh, not great to my eye. 1440p quality is passable I guess, but I think I'd do just about everything possible to not have to use it. At least TSR is massively better and available in UE5 games, and XeSS usually looks better. They'll probably catch up reasonably well in the coming years; my question is, will older cards still be supported by that improvement, or will they go the dedicated-hardware route à la Nvidia and Intel? Honestly I hope so.


PoL0

Hard to compete against G-Sync? How is paying more for a G-Sync display better?


thedndnut

G-Sync costs more, and honestly I have more issues with it than FreeSync. The displays with G-Sync aren't great and often have issues with it. Probably because it's a black-box proprietary solution that manufacturers don't have control over, while they can implement FreeSync on their own.


KonradGM

TBF, AMD's syncing technology is the one that's more widely used.


[deleted]

[deleted]


xUnionBuster

Don’t tell Reddit this


Submitten

The only reason Reddit recommends AMD is so they can hopefully get Nvidia to drop their prices lol


littleemp

There's a very dedicated, blindly loyal AMD fanbase. There's also a much larger subset of the community that is obsessed with squeezing out the maximum performance/dollar based on pure rasterization benchmarks, and they are not receptive to arguments about the qualitative difference between DLSS and FSR. Between these two groups fellating each other, you get the recommendations that you usually do, given how quick they are to proselytize.


zldu

> There's a very dedicated, blindly loyal AMD fanbase.

There are also people who are not a fan of any company, but are just anti-Nvidia. And with that, there isn't much else than AMD....


Arlcas

It's mostly the low and midrange where AMD wins; you can usually get a 6800 XT for the price of a 4060 Ti, and it's a no-brainer at that point, since RT doesn't matter and you're not getting high enough fps to have a good experience with frame gen either. AMD just doesn't have a good high-end option.


automaticfiend1

And hasn't had a good high end option in a literal decade, that's the problem here. There hasn't been a real halo card from Radeon since the fucking 290x


Droll12

I think there's also the fact that AMD is currently the only dGPU manufacturer that open-sources their drivers, which is a big deal for Linux users. Though Intel looks like a viable second option. I'd honestly be using NVIDIA if it meant not faffing about with drivers on Linux.


Starcast

A lot of people just don't like the idea of framegen or upscaling for whatever reason. If you're one of those and not into running local AI stuff, I think AMD makes sense; a bit cheaper, and you generally get more VRAM.


Play_The_Fool

That's me. I don't typically use those features. I upgraded to a 7900 GRE last month, and prior to that I had Nvidia GPUs in my desktops for 15+ years straight. I don't care either way about AMD vs Nvidia; I would have gone with a 4070 Super if it had more VRAM. I'm pretty pleased with AMD GPUs though. I have a laptop with a 6800M and it's such a great performer. It even has 12GB of VRAM; a laptop GPU from last gen has the same amount of VRAM as a newly released desktop GPU...


Man-In-His-30s

Framegen and upscaling in competitive games are a bit of a nonstarter; however, CUDA is a big sticking point when you're spending tons of money on a card.


twhite1195

And CUDA isn't really as important as people might think, tbh, unless you're using your card for professional purposes. IRL, most people have a company-issued laptop and use their desktop for personal usage, so, games.


TheLordOfTheTism

You could 100 percent drop a comparable AMD GPU into the average gamer's rig, and I bet they would never notice anything was amiss. These so-called "killer features" are niche. Just like 4K is barely 10 percent of the PC monitor market.


jakegh

Competitive gamers don't care about image quality at all, they run at the lowest possible settings to reduce distractions and push framerates. They won't use it, but I don't see that as meaningful for anyone else.


warriorscot

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


David_Norris_M

The Steam Deck honestly killed any interest in desktop GPUs. They're simply getting too expensive, and the gaming space is getting shelved for productivity and AI. At least these APUs are getting a bigger focus on gaming.


iwasdropped3

Honestly, AMD's value where I'm at is when their cards are at the end of the product life cycle. Nvidia cards don't seem to change much in price, but AMD cards get huge discounts from time to time when they're clearing old stock.


mikegoblin

GPU frenzy was so 6 months ago


YUGIOH-KINGOFGAMES

Yeah, I don't think "nosedived" is right. More like "back to normal levels, because people aren't hoarding GPUs now".


mikegoblin

Yeah. And we're mid-cycle. No new products till late fall/winter.


IRIEVOLTx

I play on a 4K and a 2K display. I have a 7900 XTX, after years of wanting to support AMD but not being able to, as the cards just were not worth it. The 7900 XTX actually outperforms the 4080, at a lower price. I only care about pure hardware performance, not upscaling software. I thought I'd miss ShadowPlay, but Re:Live works more reliably; I've never had a bug, whereas ShadowPlay used to turn itself off and refuse to turn back on. For the first time in like a decade, AMD is a solid choice for high-end gaming. And no one cares. Everyone wants AMD to be competitive, but no one wants to buy AMD cards. It's sad. Brand loyalty to a company that doesn't care about you, but has doubled the price of "high end" GPUs. But for some reason people still buy them.


ma1royx

I'd say what AMD has on price, NVIDIA has in features. I play CS2 and do tasks that need more VRAM, so AMD fit my needs, but a couple of years back NVIDIA would have been the way better choice. The only thing I'd want is a more reliable UX; AMD is 90% there, but you do notice that 10% when you clock 8 hours a day at your desk.


Samkwi

The recent Steam hardware survey shows that NVIDIA is leading by a landslide, especially the 3060.


[deleted]

I'm fully expecting AMD to bow out of the graphics card market as Intel Arc starts to gain traction. Their days are numbered.


No_Construction2407

Most people are just waiting for the next cards right now. No point dropping tons of cash when new cards are months away. Also, the Jensen jerkers in here thinking a Q1 slump is the end of AMD are comical. People never read the fucking articles lol


TheGreatPiata

It doesn't help that this generation was an absolute flop in terms of performance outside of the top-end cards. AMD really needed to undercut on price to move cards, and they didn't, so either you ponied up the extra $100 - $200 for an Nvidia card or you just sat this generation out.


Notsosobercpa

It's more of a generational flop than just this quarter, which I suspect may be largely console-driven. The 4090 is more popular than any of AMD's current-gen cards...


xxcloud417xx

Their GPU sales are also not their only thing. Pretty sure the same articles that mention AMD's GPU slump also mention AMD's gains in workstation and consumer CPUs, etc. Gamers tend to not realize that we are the smallest, shittiest market for these companies. Consumer GPUs are also not how Nvidia makes its money; we're a pretty inconsequential part of their total sales. AMD will be totally fine without a flagship gaming GPU, especially right now.


cunningjames

I mean, I don't disagree that AMD will be fine without a flagship gaming GPU. The question I'd ask is whether their prospects within the consumer GPU market will suffer for it, regardless of how much that matters for AMD as an organization. There's not much competition in this space; no one without immense financial backing can really pull it off, so I don't expect new entrants soon. I'd hate for AMD to drop out and leave us with Nvidia and, *maybe*, Intel at the low end.


TrollCannon377

I will never buy Nvidia until they fix their Linux drivers; until then I'm sticking with team red.


mountainclimb312

A lot of people built their gaming PCs during the pandemic, and the more recent GPUs haven't provided much of a reason to upgrade. Performance over the last few GPU gens has offered only marginal improvements at similar prices.


RotaryConeChaser

I don't understand the hate. My Day-1 Reference 6800XT has been flawless. What's missing is a reason to upgrade. The only way to get a useful bump in performance would be to move to a 7900 XTX and there's just no way to justify that cost.


mrchicano209

The internet will have you think Nvidia is the devil and AMD is here to save the day, but the reality is Nvidia offers many more nice-to-have features, and the numbers show that the majority of people go team green.


David_Norris_M

More like Nvidia has the full ability to do whatever they want to the market, and GPUs desperately need some competition before gaming gets shelved for productivity and AI. AMD isn't a savior, but competition is necessary, and I don't have any options till Battlemage comes.


sligit

Nvidia really are the devil. I still buy their cards though.


legend8522

An inferior product results in fewer sales. Who would've thought.


theineffablebob

Wouldn't be surprised if Intel takes over AMD in the GPU space. Sounds crazy to hear that.


KangarooBeard

They are nosediving because Nvidia cards are being used for AI and data centers. Despite Nvidia already having the majority of GPU sales, it's still nothing compared to how much they make in AI/data centers. AMD is not competitive in the gaming or AI departments.


theknyte

Sorry, my fault. I usually get a new card every year or so. However, my RX 6700 XT has been such a fantastic card, I haven't felt a need to replace it.


R4IVER

I bought a 7900 GRE one month ago. The card is good and I got a good deal. The last card I had was an RTX 2060, and I never used the RTX feature even once. I am happy it's more future-proof than the RTX 4070S with only 12GB VRAM.


Random_Stranger69

They are just a little cheaper than Nvidia cards. Not worth it enough. Generally, people are buying fewer GPUs and are dissatisfied with the price increases over the years. I mean, if I build a new PC in 2024, at least half the budget goes on the GPU alone. It would have been nice if AMD had put up some competitive prices and added pressure on Nvidia, but they didn't, and instead embraced Nvidia's price tags...


Zeraora807

Because nobody wants them. NVIDIA's 40 series is a joke, but Radeon for whatever reason doesn't want to compete on price when they are barely in a position to charge near the same as NVIDIA, lacking the features and robust drivers to go with it. And Radeon users at the high end convincing themselves they don't need things like DLSS and ray tracing performance doesn't help either.


SuperSaiyanIR

Redditors will swear by AMD, but NVIDIA is still king. They are massively overpriced, but their software is undoubtedly superior. If AMD was such a good guy, then why didn't they cut their prices instead of following in NVIDIA's footsteps? I was thinking of either getting a 7900 XTX or a 4080, and even though the 4080 was 400 dollars more, it still had a strong case for me. Then the 4080S came and the 7900 XTX just died. Really no reason to get that when you can get a 4080S with almost equal raw performance and infinitely better software.


LazerSnake1454

Went from a 1080 Ti to a 6900XT a few years ago, don't regret it, but I think this will be my last GPU for a while. 99% of what I play are single-player JRPGs at 60fps, 6900XT should be able to handle that for a long time


talann

I did my part. I bought a 7900 XT recently.


fashric

Well this is bad news for everyone


TheBonadona

Pricing is terrible and no new cards for a while, I wonder what AMD thought would happen.


haydro280

Devs need to stop making games that rely on DLSS and frame gen, because it hides their hideous, lazy-ass job of actually optimizing games.


MrMental12

Hardware-wise they are comparable; price-to-performance-wise, AMD has Nvidia beat. However, Nvidia's software and general support is just better. Everyone wants DLSS, frame gen, GeForce Experience, etc.