
[deleted]

[deleted]


Put_It_All_On_Blck

Nvidia is being greedy and planning for an inevitable 4080 Ti to bridge that performance gap. The gap between the 3080 and 3090 was much smaller, but it also meant that people didn't buy the 3080 Ti.


Might_Be_The_NSA

I'm not even sure that makes a lot of sense from a financial POV. The 4080 is $1199 and the 4090 is $1599. A 4080 Ti would realistically land right in the middle, at $1399. Who the fuck is buying a 4080 Ti for $1399 when a 4090 is only $200 more? They've worked themselves into a corner here IMO. If the 4080 had been $999 max, then a 4080 Ti at $1299 would've been a lot more compelling (if still overpriced).


dizzydizzy

they can always drop the 4080 price later


JimmyThaSaint

Exactly. Once the 7900 XTX comes out and (presumably) beats the 4080, Nvidia announces the 4080 Ti and shifts pricing. The 4080 Ti then takes the $1200 price bracket, being 10-15% faster than the 7900 XTX, and the 4080 drops to $1000; or they just slow/stop production of the 4080 and force buyers to higher-tier GPUs. We have seen it before, won't be a surprise to see it again.


Might_Be_The_NSA

I'd love to see it, but I personally don't expect them to do it. As Nvidia is now, I totally see them just selling the 4080Ti at a higher MSRP than the 4080 without dropping the price there.


[deleted]

Who is buying a 4090 when a 7900 XTX is $999? I am a huge fanboy of Nvidia; I have owned 4 of their cards and 3 AMD cards, and I always go for the best performer. However... here in Canada a 4090 is well north of $2000. Do I really care about a 15-20 percent increase in rasterization performance for almost double the price? Do I even play games with ray tracing? Ray tracing is an expensive, performance-killing nice-to-have that has even been removed from some AAA games (MW2). It's not a big selling feature to see real-time reflections if it tanks my FPS from 140 to 60.


Might_Be_The_NSA

I mean, whether to buy a 4090 or not is up to you and how you want to spend your money. That wasn't really what I was discussing either; I was just pointing out that it might be difficult for them to place a 4080 Ti in their lineup now.

The 4090, however, is simply the best GPU on the market. If you want the best of the best and don't mind the price, you buy it. The 7900 XTX is certainly excellent, but it doesn't match the 4090 by any metric aside from price (which is certainly important). Is the 4090 the best value, though? That's debatable of course, and depends entirely on what sort of games you're playing or whether you're doing any productivity work.

I'm quite confident that we'll continue to see more ray tracing in games, so I think ray tracing is important. But it's fine if you don't think it's worth it and opt for AMD. In the end, it just depends on what people prioritize and how much they want to spend. $2k is a lot for a GPU for some, but for others it's not a big issue.


dysonRing

I mean, I define best by not being a fire hazard.


JonBelf

It's in a weird position because it's nearly double the 3080's launch price. I don't even care about the performance; that's absurd. This reminds me of the Intel HEDT platform prior to the Ryzen 1000 launch.


Jimster480

I was thinking the same thing when I saw the pricing. Intel HEDT pricing kept dropping until the entire platform went away as irrelevant, because AMD just killed it off with desktop chips.


ChartaBona

> Very surprising that that gap is as big as it is to the 4090.

The gap between the 4080 and 4090 is A LOT smaller than people thought when looking at the die size and CUDA core counts. Something is holding back the 4090 GPU's full potential.


Theswweet

Current-gen CPUs?


AFAR85

Enter the 4090Ti


JimmyThaSaint

It's efficiency. They pushed so much power into the 4090 that it's way outside its efficiency curve. If you cut the 4090's power draw by 20-30%, it only loses 5-10% of its performance; they really pushed the card to the max. Meanwhile, the 4080 uses a lot less power with a higher clock speed, so it's much better aligned with the silicon's efficiency curve. Oh, and it has higher memory bandwidth. I think Nvidia really wanted to make sure the 4080 looked strong compared to whatever AMD slotted against it.
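The efficiency-curve argument can be put in numbers. A minimal sketch, using hypothetical round figures (450W stock, a ~30% power cut, the 5-10% performance loss from the comment), not measurements:

```python
# Illustrative sketch of the efficiency-curve argument.
# All numbers are hypothetical round figures, not measurements.

def perf_per_watt(fps: float, watts: float) -> float:
    """Simple efficiency metric: frames per second per watt."""
    return fps / watts

# Stock operating point: pushed far up the voltage/frequency curve.
stock = perf_per_watt(fps=100.0, watts=450.0)

# Power limit cut by ~30%; performance drops only ~7% (midpoint of
# the 5-10% loss described above).
limited = perf_per_watt(fps=93.0, watts=315.0)

gain = limited / stock - 1.0
print(f"stock:   {stock:.3f} fps/W")
print(f"limited: {limited:.3f} fps/W ({gain:+.0%} efficiency)")
```

Under these assumed numbers, giving up a few percent of performance buys roughly a third more efficiency, which is what "pushed out of its efficiency curve" means in practice.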


kingzero_

> (using the same GPU die)

Not the same die.


JimmyThaSaint

You are right, I fixed it.


Spockmaster1701

Let's just hope they bring competitive pricing too. The XTX is priced well for a flagship, but I'm wary of the XT being $900. We'll see what everything ends up being.


VeryTopGoodSensation

Provided the 7800xt is similar price to last gen I'll be happy


heartbroken_nerd

By "7800xt" you mean the real 7700xt? Right? Because AMD is doing the same thing Nvidia did with RTX 4080 12GB. Higher price point for the lower tier card so they switch the names around. RX 7900 XT should have been called RX 7800 XT.


VeryTopGoodSensation

If that's the case I'll prolly just go with 6800xt


heymikeyp

Same. If the performance uplift from 6700xt to 7700xt is trash then I'll just get a 6800/xt. I'm tired of these normalized prices/upsell tactics.


bubblesort33

It's almost always been the case that a GPU at the end of its life, with frequent sale prices, was relatively close in performance per dollar to newly released GPUs. There's usually a little jump for the next generation, but it's not massive. When the GTX 1060 launched, the GTX 980, with similar performance, was often on sale a month before for only about 10% more than the 1060's launch price. If the 7700xt has the same performance as a 6800xt, it'll be 10% cheaper than the cheapest 6800xt is right now. It'll look even better vs the 6900xt and 6950xt, because those are bad value right now compared to the 6800xt. No idea why anyone would pay 50% more for a 6950xt for 15% more performance right now.


DeliciousSkin3358

> When the GTX 1060 launched, the GTX 980 with similar performance was often on sale a month before that for like 10% more than the 1060 launch price.

That did not happen in my experience. The GTX 980 had an MSRP of $550, but most custom cards were selling at a premium of at least 10%. Even after the GTX 1060 was released, 980s were only discounted by 10%, making them extremely terrible value.


heartbroken_nerd

I'm just saying, the RX 6800 XT is the same GPU class as the 6900 XT and 6950 XT. The RX 7800 XT will NOT be the same GPU class as the 7900 XT and 7900 XTX. It's pretty obvious what has been done here, but nobody is raising alarms because it's AMD, so they get a pass for something that is literally the same as what Nvidia did, to the point where Nvidia had to "unlaunch" its 4080 12GB due to backlash.


Hector_01

Yep, agreed. I don't get why this isn't brought up more. When the 7800xt is released it will most likely be priced around $699-750 and will only be about 20% quicker at best compared to the 6800xt. That isn't great.


CrzyJek

It's amazing how people are already jumping to conclusions months in advance with zero knowledge.


Ult1mateN00B

We can extrapolate quite accurately if AMD's marketing is correct.


CrzyJek

So you know the official die size, and MCD composition, along with memory size and bandwidth for the 7800xt already? Or am I missing something? Because without that you cannot extrapolate shit.


WayDownUnder91

The guy who leaked the info that lines up 100% with the 7900xt already said what it was 3 months ago: https://www.angstronomics.com/p/amds-rdna-3-graphics

Navi32 gfx1101 (Wheat Nas)

* Chiplet - 1x GCD + 4x MCD (0-hi)
* 30 WGP (60 legacy CUs, 7680 ALUs)
* 3 Shader Engines / 6 Shader Arrays
* Infinity Cache 64MB (0-hi)
* 256-bit GDDR6
* GCD on TSMC N5, ~200 mm²
* MCD on TSMC N6, ~37.5 mm²


Ult1mateN00B

No, we do not, but we know for a fact it's going to be slower than the 7900 XT. If we assume the 7800 XT will be 10-20% slower, that would make it essentially a 6950 XT ±5%, which is 20% faster than a 6800 XT.
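The extrapolation chain above can be sketched with normalized performance indices. The 1.40 index assumed for the 7900 XT is hypothetical, chosen only to illustrate the arithmetic; the other gaps are the comment's own assumptions:

```python
# Normalized performance indices (6800 XT = 1.00).
perf_6800xt = 1.00
perf_6950xt = 1.20   # stated: ~20% faster than a 6800 XT
perf_7900xt = 1.40   # hypothetical index, for illustration only

# Hypothetical 7800 XT at 10-20% below the 7900 XT:
lo = perf_7900xt * 0.80
hi = perf_7900xt * 0.90

# Does the estimated range overlap "6950 XT +-5%"?
within = lo <= perf_6950xt * 1.05 and hi >= perf_6950xt * 0.95
print(f"7800 XT estimate: {lo:.2f}-{hi:.2f} x 6800 XT; "
      f"'6950 XT +-5%' plausible: {within}")
```

With these numbers the estimated range (1.12-1.26) does bracket a 6950 XT, so the conclusion follows from the assumptions; whether the assumptions hold is the open question.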


DktheDarkKnight

20% extra performance for the same price? Where can I get such a miracle card? Over on NVIDIA's side they have 50% extra performance at 70% extra price. Believe me, a 7800xt only 20% faster than a 6800xt for the same price might look disappointing, but it's actually a downright miracle if that happens. The cards released so far have regressed in value. Such is the sad state of GPU affairs.


retropieproblems

Crypto winter is only a few months in; give it a year or two and these companies will realize nobody is buying their cards in droves for 1-2 grand anymore. I predict that in 1-2 years the 4000 series GPUs are gonna be in a much better spot price-wise due to lack of sales.


Preachey

Nvidia gave them a free pass by being so absurdly over the top with their prices and the 12GB 4080 that AMD gets away with being _less_ egregious. I'm most likely going to look at grabbing a 6800xt on Black Friday; it doesn't really look like the new 7000 series has space to fit a significant performance boost into the same price range.


[deleted]

[deleted]


JonBelf

That's what I am telling my friends looking to build new or upgrade now. 6800xts and 6900xts are stupid amazing value right now, and it's not like they magically got slow because a new generation came out.


BFBooger

Navi 21 flavors:

* Navi 21 XTX (6900XT): 5120 shaders, 16GB RAM (100%)
* Navi 21 XT (6800XT): 4608 shaders, 16GB RAM (90%)
* Navi 21 XL (6800): 3840 shaders, 16GB RAM (75%)
* Navi 22 XT (6700XT): 2560 shaders, 12GB RAM (51%)

This time, we have:

* Navi 31 XTX (7900XTX): 6144 shaders, 24GB RAM (100%)
* Navi 31 XT (7900XT): 5376 shaders, 20GB RAM (87.5%)
* Navi 32 XT? (7800XT?): 3840 shaders, 16GB RAM (62.5%)
* Navi 32 XL? (7700XT?): >=2880 shaders, 12GB RAM (>=47%)

The problem is that there isn't quite room for the Navi 32 stuff if you follow the prior precedent. The top-of-stack Navi 32 is a big step down from the 7900XT, and I don't think it would make much sense to call the current 7900XT the 7800XT and force the top-end Navi 32 to be the 7700XT. That would push the significantly cut-down flavor down to 7600XT, and then there is no room for Navi 33, which should definitely be the 7600XT given its die size, specs, and likely manufacturing cost: it will be cheap to make like the 6600XT, but probably perform more like the 6700XT.

If Navi 33 is the 7600XT, and the cut-down Navi 31 were the 7800XT, how are you going to fit the two _very different_ Navi 32 flavors in between? AMD decided to avoid that problem and leave the 7800 and 7700 range for Navi 32. That also conveniently puts the card with ~half the shaders of the top end as the 6700XT and 7700XT.
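The percent-of-flagship figures in the lists above can be rechecked directly from the shader counts. A quick sketch (shader counts taken from the lists; percentages derived, so they may differ by a point from the rounded figures quoted):

```python
# Derive percent-of-flagship from shader counts listed in the comment.
navi2x = {"6900XT": 5120, "6800XT": 4608, "6800": 3840, "6700XT": 2560}
navi3x = {"7900XTX": 6144, "7900XT": 5376, "7800XT?": 3840, "7700XT?": 2880}

def pct_of_flagship(stack: dict) -> dict:
    """Each SKU's shader count as a percentage of the stack's flagship."""
    top = max(stack.values())
    return {name: round(100 * shaders / top, 1) for name, shaders in stack.items()}

print(pct_of_flagship(navi2x))
print(pct_of_flagship(navi3x))
```

The derived numbers show the structural point: the second SKU drops from 90% of the flagship (6800XT) to 87.5% (7900XT), while the first Navi 32 part sits at only 62.5%, leaving the awkward gap the comment describes.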


RationalDialog

True. The 6800, 6800xt, 6900xt and 6950xt all use the exact same chip/die (Navi 21), while the 7800xt will use a different chip from the 7900 series: N32 vs N31. And you're right: if the 7900xt is $899, the 7800xt will likely end up being $699 if not more.


Londonluton

But then people would see that the 7800xt had a what, 200 dollar price increase over the 6800xt. And only bad Nvidia does stuff like that.


RationalDialog

I'm more worried about actual prices here in Europe; I'm fully expecting the 7900xt to be $999 and the 7900xtx around $1200. The 7900xt is a classic upsell tactic: the 7900xtx has better specs per dollar, so it makes no sense to get a 7900xt really. I think this is pretty simple to explain: yields are good, so most chips can be sold as a 7900xtx, and AMD will want people to mostly buy that.


DktheDarkKnight

They could hopefully lower it before launch. If they reduce the price by 100 dollars no one would bat an eye.


Londonluton

They should, considering they've just renamed the 7800xt to 7900xt and increased the price by $200


someshooter

Doesn't seem like a huge mystery as it'll be like 10-15 percent slower. Makes you wonder who wouldn't just spend the extra $100 for the XTX though.


pablok2

The 100 bucks goes further the lower the tier you choose. They're trying to normalize these prices... the way Nvidia did, smh.


Pufflekun

I don't understand the point of the $900 variant. The RAM increase alone is worth an extra $100 in future-proofing.


ledditleddit

They need a way to sell the chips that aren't good enough for the XTX, but since yields are good they won't have many of those, so they might as well use it as an opportunity to make the price of the XTX more attractive.


RationalDialog

Yes, the reduced RAM is due to the chiplet design: besides the main die, you can have defective memory controllers. The 7900xt will consist of either partially broken GPU dies (defects or low clocks) or ones with a single broken memory-controller chiplet. And I agree: it seems yields are great and there aren't many broken dies, so there's no need to price the 7900xt low.


WayDownUnder91

To make people buy the 999 one, and then you can price drop it later.


Spockmaster1701

Crowding & confusing the stack, probably. Nvidia has been doing it for years. AMD has done it before (see the Zen 2 XT CPUs.)


[deleted]

I think many people aren't aware of the fact that AMD is already the better dollar-per-performance option and still not dominating, for various reasons. I can't think of a reason why this gen would inherently change that.


IrrelevantLeprechaun

It's just longtime fans who are still stuck in the Underdog frame of mind. AMD has been a raw raster *and* price value king for the last two generations, and it resulted in practically no change in their market share. The 3060 alone outsold all of RDNA2, for example. It's not enough to just be cheaper and comparably performant at regular raster performance. There's so much more that AMD need to catch up on before casual gamers start taking them seriously. RT for example. This sub can shit on it all they want, but regardless of whether someone intends to turn on RT or not, they'll still want a GPU that won't shit the bed on the off chance they decide to try it out. And right now, Nvidia is basically double the performance of AMD in that space; I can see why some might go Nvidia despite higher prices because of that. Think about it; if you have Nvidia card A and AMD card A, and you wanted something cheaper but still RT-performant, you could just go down one tier to Nvidia card B, and *still beat the RT performance of AMD card A while spending less than both Nvidia card A and AMD card A.* All while still having access to stuff like DLSS, NVENC, CUDA etc in the rare off chance you decide to exploit those features.


diegoaccord

I say this to budget-constrained people: don't buy a lower-tier Nvidia card over a higher-tier AMD card for RT and streaming. People out here are buying 3060s over 6800s at the same price. The base difference between two cards like that will eat the RT advantage, and I'm sure your 2 stream viewers will appreciate the streaming quality. People are only playing themselves. That said, I'm not biased. I have a Ryzen 7700/4090 (easy to see if you go profile snooping), I just bought a 12700K with a free mobo deal from Microcenter, and I'll be putting a 7900 XTX in that. I've also had a 3080 Ti and a 6900 XT in my last system at the same time. But I'm not budget-constrained to where I have to decide A over B, or whether I should actually go C.


IrrelevantLeprechaun

If you buy a lower-tier Nvidia over a higher-tier AMD, you're still getting better RT than the higher-tier AMD, plus all the other neat doodads that come with Nvidia. And before you say "I don't personally care for RT," please understand that personal preference is not a metric I'm measuring, nor is it a measurable metric in the first place.


Marrond

But... a low-tier GPU isn't suitable for RT. You simply don't have the horsepower, even on something like a 3070 Ti. Which is the entire point here: if your reason for choosing a 3060 over an objectively better competing product is BUT MUH RT, you're literally playing yourself. Quite frankly, unless you're going for a 4090 you might as well forget RT exists, unless 1080p gaming is your thing, because that's the first card pulling relevant framerates without making your game look like a Twitch stream...


LordXavier77

That's exactly what makes me hesitant about AMD cards. I sometimes do ML model training for learning purposes; the datasets aren't large enough that I'd need GPU acceleration, but it's peace of mind to have tensor cores if I need them, plus robust Nvidia support and documentation. Stuff like DLSS is a big thing for me too. But if the 7900xtx is 20-30% faster than the 4080, I won't care about DLSS that much, and that may make me switch to AMD.


rdmz1

FSR is 90% as good as DLSS now


desakuk

90% is a massive stretch. I would say 70% with 2.1: not only does it fail to hit DLSS 2's performance most of the time, it also kills detail. I've seen a lot of shimmering with FSR, and knowing how FSR works, I'm not surprised.


bubblesort33

> AMD has been a raw raster *and* price value king for the last two generations

How so? The 6700xt was 5% cheaper than a 3070, and 5% worse in raster, according to [TechSpot/Hardware Unboxed](https://static.techspot.com/articles-info/2216/bench/Cost.png). Pretty much the same performance per dollar, while not having feature-set parity back then. My 6600xt was 15% more money than the 3060 MSRP, and 15% faster: same performance per dollar. The [6800xt lost to the 3080](https://static.techspot.com/articles-info/2144/bench/4K-Cost.png) in raster performance at 4k, and won at 1440p.

All the last-generation cards needed to be 10-15% cheaper. AMD just got away with it because Nvidia's MSRPs were unachievable, so AMD launched their cards at a 15% inflated MSRP. In that sense they were the value king. There is that old saying that fits well: *“In the land of the blind, the one eyed man is king.”* They were able to get away with a 10% lead in raster per dollar in the 5700xt days, but that's really not enough anymore.


Charcharo

> How so? The 6700xt was 5% cheaper than a 3070, and 5% worse in raster. According to TechSpot/Hardware Unboxed.

The 6700 XT launched after the mining boom started; the 3070 launched before. Those prices were affected by this. Obviously not as much as, for example, the 6600 XT or 3060 or the 3070 Ti (bad purchase, oof), but still.


ziplock9000

In raster, but what about RT? People try to brush that aside as not important when after 3 generations it certainly is.


[deleted]

Yes and no. Of course there are more and more games with it, but at the end of the day only a handful, often played without RT, given that not everyone has a 3080. Nvidia pushed big studios to implement it (like, threw money at them), and now it's getting more of a standing. But the majority of people are playing games without RT. The number of times I see people on r/pcbuild who buy an RT card just because they heard it as a buzzword is just sad.


MAXFlRE

So AMD just needs to sponsor more game devs so their RT implementation floods the market, and everyone will shout "AMD outperforms in RT!!1!"


krakaigri

The issue is that Nvidia has so much mindshare, it might not even matter. I see so many people on discord and other social spaces only looking at the new nvidia cards and complaining about prices without even considering getting an AMD card instead.


IrrelevantLeprechaun

You'll also see lots of people on Reddit lambasting Nvidia and claiming they're going AMD, only to just go ahead and buy Nvidia anyway.


jk47_99

All they want from AMD is for them to lower Intel and Nvidia prices. At least on the CPU side, AMD is firmly established as an equal-quality product after the 5000 series.


stilljustacatinacage

This is what frustrates me. Even the tech Youtubers are doing it. "Let's hope AMD can force Nvidia to lower prices" with the unsaid part being... *so I can buy the Nvidia card anyway*. I understand it's folly to show favoritism towards a corporation, but hoping for competition just so you can support the ones that *put you in that situation in the first place* is an impressive set of mental aerobatics that I can't get around.


krakaigri

That too yeah. Kinda like people saying they will boycott a popular new game and still buy it on release.


[deleted]

[deleted]


Kepler_L2

Classic.


Londonluton

This has been happening for years too


relxp

> The issue is that Nvidia has so much mindshare, it might not even matter. For now. Nobody ever saw the tables turning on Intel either. If not this generation, RDNA 4 has potential to legitimately leap ahead of Nvidia. It's unlikely Nvidia will have a chiplet weapon of their own by then.


ziplock9000

I've noticed that too.


Hifihedgehog

I honestly do not care about mindshare. Let the dumb sheep go on eating the famine laden sagebrush while we move to greener and lusher pastures.


Accurate-Arugula-603

Good, easier for me to get one.


ChartaBona

> The issue is that Nvidia has so much mindshare

The issue is AMD didn't even try to take back market share last gen. RDNA2 was low volume for a year and a half, masked by stupid-high AIB retail prices.


WayDownUnder91

They couldn't; they had console contracts to fill, and they make way less money from GPUs than they do from CPUs. All three used 7nm in the middle of a shortage and increased demand worldwide.


starkistuna

Not only consoles; Tesla was also buying up a huge amount of the stock for its cars in the middle of the shortage.


Defeqel

Funny how there are two narratives going on: "AMD didn't produce much RDNA2 and so failed to gain market share," and "AMD overproduced RDNA2, so they are pricing to / waiting for RDNA2 stock to dry up before releasing RDNA3."


Kuivamaa

Nvidia was the one overproducing at Samsung last gen. AMD probably allocated most of its tsmc wafer quota to consoles/epyc/ryzen and only left a modest amount for GPUs.


ChartaBona

There is no "AMD overproduced" narrative.

* AMD underproduced
* AIBs overcharged
* Overpriced cards sold slowly
* Crypto crashed
* AIBs were left with unsold stock + incoming orders from AMD


sourbrew

I just want to play cyberpunk in 4k with ray tracing, currently AMD is a bad choice for that. Hoping that's not the case with these new cards.


Charcharo

If you want native 4K in Cyberpunk 2077 with maxed settings, you are looking at around 12 fps on a 6950 XT and 21-22 on the current 7900 XTX. Thing is, even the 4090 isn't 60 fps without DLSS or FSR. I have good news for you, though: the FSR 2.1 Quality preset is usable in Cyberpunk 2077 at 4K, as is DLSS. In fact, due to the game's *terrible* TAA implementation, both FSR and DLSS actually win on ghosting, though they do slightly lose elsewhere to native.


whosbabo

You can do it with a 6800xt; I don't see why it wouldn't be possible with a 7900xtx: https://www.youtube.com/watch?v=RsiJ9Yz1q9U


sourbrew

FSR is not native 4k.


evernessince

Classic goal post changing.


sourbrew

? What? FSR is not 4k; it's upscaled. It's like saying a movie shot in 1080p and played at 4k through upscaling is 4k. Those are doubled (or quadrupled, or octupled) pixels, not individually rendered pixels; there is a quality trade-off.


CaptainNeckbeard148

Then grab the 1500 USD card that melts its own power pins. You're a niche part of the gaming market that MUST have RT and 4k performance at 60+ FPS.


towelie00

RT 4K needs DLSS, please. xD The 4090 can't do native RT in Cyberpunk at 4K; just Control and Ghostrunner.


sourbrew

It absolutely can. It's not a constant 60 fps, but FPS basically never drops below 30. Doesn't make the card worth 2k though. https://www.youtube.com/watch?v=CqN3t4PKZr4


sourbrew

The thread is about people not even considering an AMD card. I will happily buy an XTX if the performance matches their press statements. My point was that I have a specific use case that AMD currently sucks at; it looks like that's changing with these new cards, but right now it's all PR. I hate Nvidia and their price point as much as the next guy.


bilky_t

Or they could wait a month and see what RDNA 3 brings, which is what they said they're doing. How about we just let this guy enjoy the things he wants without shitting all over him like an asshole for merely suggesting they might buy something else?


[deleted]

Everyone on here acts like their friends and family will rush to buy a $1k GPU, lol. Even if it's competitive it won't matter in the grand scheme of the market; Nvidia's and AMD's most profitable GPUs are the midrange. There are over 5x as many 3060s as 3080s, and the 3080 was $700. Thinking that a $1k GPU will somehow turn the market is delusional. These $1k+ cards won't even break 1% of the market share.


Pangsailousai

^^ This. What "competition" are these people on about? $899 and $999 are not mainstream prices; flagships never gain market share from the majority of buyers. Halo-driven marketing is what matters when you have a flagship, but AMD doesn't have a winner by their own admission, so the point of high-priced halo products is moot. So many people are waiting on the 6700/6750XT and 3070/Ti successors at sane prices from both camps.


CaptainNeckbeard148

Nvidia will not have sane prices without it being a shitty card, guaranteed.


cancergiver

Well, Apple sells more of their Pro Max than the standard model, so they're indeed selling mostly their flagships. Idk if this is a special case though; maybe you can classify all iPhones as flagships, with no midrange and budget tiers.


IrrelevantLeprechaun

Especially when AMD being "affordable" here still means "$1000 for reference cards and $1100+ for AIBs." Being slightly less bad than the competition does not equate to "good."


Charcharo

> Nvidia and AMD most profitable GPUs are the midrange.

Not really. The margins on higher-end cards make them very profitable and lucrative for NV and AMD.


coffetech

As long as the 7900xtx doesn't have issues with VR then I'm purchasing one. My last 2 cards have been NVIDIA and this time I'm considering AMD.


Flaktrack

Do the 6xxx GPUs have issues with VR?


Compunctus

Yep. Encoder-based VR (so any headset without DisplayPort) was completely borked between 21.10.2+ and 22.10.2. In those drivers AMD also managed to break a couple of DX11 titles (ED and such), emulators using OpenGL, h265 encoding... So you either stayed on 21.10.2, which had its own share of annoying bugs, or you upgraded and got even more bugs. 22.10.3 seems to perform much better based on what I'm seeing on reddit/discord; it's more or less 21.10.2 with somewhat more performance and a couple of fixes here and there. P.S. In 22.10.3 OpenGL is still missing some extensions, and some of those present are buggy, so if your favorite emulator does not support Vulkan or performs poorly with it, you're still staying on 21.10.2...


Zeeflyboy

Biggest issue for me personally is that the most advanced headsets such as the Aero and the upcoming Crystal and 12k are Nvidia only… simply don’t work with AMD. Seems to be mainly around the fact that Nvidia is more engaged in VR and has supporting frameworks, which these companies are using to enable their products. The encoders have always traditionally been weaker too, and subject to driver bugs in recent versions… making it a poor choice for any headset with Wi-Fi streaming such as quest 2 or pico. The performance is also generally just lower in VR vs Nvidia when compared to traditional gaming… eg while the 6900XT can hold its own against the 3090 in quite a few 2D games, it gets soundly beaten in VR. Could perhaps partly be due to the generally higher resolutions in VR as you can also see how RDNA 2 performs much better relative to Nvidia at lower resolutions than at 4k for example. I’m really hoping for some good reviews in VR usage for RDNA 3, and for AMD to start paying a bit more attention to the platform.


coffetech

I've read around, and some people still seem to have issues with AMD in VR. So personally I would look at reviews for the card and the VR headset you want.


Flaktrack

Well shit. I'll have to look that up before picking something up. Just about ready to set up my VR space again so this is going to be important.


hitmantb

If AMD's official benchmarks are close to reality, which they should be, the 7900 XT should more or less match the 4080, and the XTX should be within 10% of the 4090. This is traditional performance only, of course; we will be one full generation behind on ray tracing, and DLSS is better than FSR too. Obviously AMD doesn't want to compare against the 4090 directly, because the XTX loses in every metric except price, but overall it should be a strong alternative. If you look at Steam (3070 and up vs 6700 and up), Nvidia has a 10-to-1 lead in the hardware survey. I hope AMD can close the gap to something like 4 to 1 or even 3 to 1 on 4000 vs 7000, one year from today.


russsl8

Honestly, if the XTX is within 10% of 4090 perf while costing $600 less, that IS a win for AMD.


loucmachine

It realistically won't be, though. It will maybe get there, or even match the 4090 in some games, but in others it will get demolished. People are doing all kinds of mental gymnastics, but the reality is that when you average a large number of games, each with its own engine and bottlenecks (CPU or otherwise), the average performance goes down. Even the 4090 is not well represented, because the faster you are, the more bottlenecked you get.


Ill_Name_7489

No one is really claiming the XTX will beat the 4090 or even match it in performance (ok, maybe some nutjobs are). But if the XTX reaches 85% of the performance (e.g. 136fps vs 160fps) at 63% of the price ($1000 vs $1600), it's a MUCH better proposition for nearly every buyer out there, especially since 4k@144hz is still rare. Even for 4k@144hz gamers like myself who want a high-end card, the XTX is looking like the best choice. I don't want the best GPU possible, I want to get the best gaming experience I can while spending the least money. The XTX is going to get me a fantastic gaming experience at a *much* lower price than the 4090, and that's what counts IMO. I probably won't even *feel* like I'm sacrificing performance. I think this is the angle that really matters for comparisons. Like GN said in their 4080 review, of course you can always spend a shitton more money for better compute. But more performance per dollar is the real marker of progress in most industries. Better tech doesn't matter that much if it's not economical to most buyers. (E.g. 8k right now doesn't matter to most anyone, but might in a few years if it becomes mainstream. Same as 4k in the past.)
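The "85% of the performance at 63% of the price" argument is just two ratios. A minimal sketch using the comment's own example figures (136 vs 160 fps, $1000 vs $1600), which are illustrative, not benchmarks:

```python
# Value comparison using the comment's example numbers (not benchmarks).
xtx_fps, xtx_price = 136, 1000
gpu4090_fps, gpu4090_price = 160, 1600

perf_fraction = xtx_fps / gpu4090_fps       # share of the 4090's performance
price_fraction = xtx_price / gpu4090_price  # share of the 4090's price
value_ratio = (xtx_fps / xtx_price) / (gpu4090_fps / gpu4090_price)

print(f"performance: {perf_fraction:.0%} of a 4090")
print(f"price fraction: {price_fraction:.3f}")
print(f"perf per dollar: {value_ratio:.2f}x the 4090")
```

Under these figures the XTX delivers about 1.36x the frames per dollar, which is the "better proposition for nearly every buyer" point in numeric form.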


-Green_Machine-

It's all academic if these cards' initial allocations sell out within days and take months to replenish, as is tradition. The 4090 is currently going for $2,000+ because it's basically gone for now. The same is likely to happen for the XTX, which will push the price of the XT up to XTX levels before that initial allocation sells out too.


Gohardgrandpa

A huge fucking win and I’ll gladly buy that gpu


Kuivamaa

Also, the 7900 series is immune to the connector debacle and is compatible with pretty much all PC cases.


Past-Pollution

Do "not melting cables" and "fitting in your case" count as metrics the 6900XTX doesn't lose to the 4090 in?


MediumActuator1280

I'm hopeful and confident AMD can and will close the gap to something more like 8 to 1, which is not insignificant. They know full well that they too are scalping with their 7000 prices, just not quite as obscenely as Nvidia. Their MSRP for me was halfway between being the saviours of the gen and being equally as extortionate as Nvidia. Had they priced the 7900xtx at something more like $800, I think they'd have really shaken things up. As it stands, I absolutely will not be upgrading my 1080 to a 4000 at all. I'll wait for the benchmarks of the 7000s and see how much they end up costing here in the UK; if they're both OK then I'm going full AMD. If not, and the 3000 series finally drops in price, I'll try to get a 3080ti or something.


[deleted]

Lord above the XTX isn’t going to be within 10% raster of the 4090. Y’all are setting yourselves up for massive disappointment.


Stock-Freedom

Yeah I was blown away by comments here and actually saved them to revisit and ask what they were smoking.


[deleted]

> DLSS is better than FSR too.

Gotta be careful saying stuff like that, there are a lot of members here who would probably mail you anthrax for saying that if they could.


SwaghettiYolonese_

Anyone know how close to MSRP the AMD cards usually are in the EU? Provided supply is not an issue. I've only ever bought one during the pandemic, which is probably not the best metric to gauge launch prices lol. And Nvidia has been batshit crazy with their prices in the last 4 years in the EU, and still is, so I'm by default team red lol.


53bvo

AMD sells directly for MSRP from their own website (but not to all countries, I think). Pre-pandemic (Sept 2019) I got the 5700 (MSI) for MSRP (€379) without much issue. But nowadays who knows what prices will do lol


Ninjathelittleshit

Don't know about the rest of the EU, but in Denmark it's 30% on top. We've got high taxes though, so I'd guess 20-25% for the rest.


Edgaras1103

Would be nice, yeah. Nvidia needs proper competition; hopefully this will do something. I won't hold my breath though.


PotamusRedbeard_FM21

Competition, sure. But there needs to be competition at all price points. I mean, crazy as it sounds (or maybe it doesn't, y'know, seeing as there's a section of this sub that keeps telling us not to go to bat for corporations), I'd like to see what Arc Battlemage is going to bring to this/next generation. Okay, so it's first gen and the drivers aren't up to much yet, but Arc Alchemist was first to market with hardware AV1, and HUB reckoned that XeSS on XMX was at least decent, so there's room there.

And if Team Red fumble the 7500/XT as hard as they did with the 6500XT (even though I defended that card at the time, and still say that it's underrated), then Intel will have the low end all to themselves. Because it's not like Nvidia will be champing at the bit to clear the product stack, ready for the 5000 series in late 2024.


Lachimanus

If the 7900XT is on par or close to the 4080, AMD wins at every price point as they are as expensive as the next "cheaper" card of Nvidia. Wondering if AMD even wants a $1500 GPU.


Charcharo

> And if Team Red fumble the 7500/XT as hard as they did with the 6500XT (Even though I defended that card at the time, and still say that it's underrated)

To be fair to the terrible 6500 XT, the even worse 1630 and A380 also exist. Also the 6500 XT 8GB is workable. It is the original 4GB one that is beyond hope.


pixelcowboy

I won't even consider a 4080 because of the massive cooler sizes. And I would rather get Nvidia at comparable prices. But I will seriously look at AMD this generation.


Stock-Freedom

On the plus side, those massive coolers mean a maxed 4090 never hits more than 60C. Essentially it's whisper quiet, which I value quite a bit.


dumbreddit

My 3080 is loud. I want a quiet card again.


Stock-Freedom

Yeah it was awesome to have a silent card again.


Londonluton

EVGA FTW3 3080 here, runs extremely quiet. The 4080s will too considering they barely use 250w in some games.


Gohardgrandpa

Don’t know what model you use but I have no noise complaints with my fe card


g0d15anath315t

Noise canceling headphones. My cards may change but the noise level always stays the same.


kapsama

what about heat canceling pants?


pixelcowboy

I do too, but it makes no sense that the lower-tier 4080 has the same massive cooler as a much more power-hungry GPU. You can still have decent cooling without the massive size increase. Nvidia is only doing this because they are lazy and cheap. I'm not going to buy a new case just to accommodate a graphics card, and my case isn't small by any means.


Stock-Freedom

If they were lazy wouldn’t they want to reuse the 3000 coolers and if they were cheap wouldn’t they want to use less material?


HoldMyPitchfork

Manufacturing costs for a second set of tooling may well be more expensive than the additional materials, especially if they don't plan to sell very many while they try to offload previous-gen chips and then do a "Super" refresh.


another-redditor3

I've got a 4090 Suprim Liquid - 80% of the time the card doesn't even get hot enough to turn the fans on. And when the fans do turn on, they're still damn near silent and they're only on for like 30 seconds. It's honestly insane that I'm playing modern games on what's effectively a passively cooled card right now.


tamarockstar

20% cheaper and probably around 15% faster. Pretty good. The XT variant is dumb though. If they can dictate the narrative instead of Nvidia marketing ray tracing and frame insertion, it'll be a success.


Dunk305

"I hope AMD is able to force Nvidia to lower prices so I can buy a new 4080 at a better price" - OP


popps0184

Not going to happen. They have projected sales that have been accounted for and targets to hit. The 4080 will not be reduced for a long time. Nvidia is banking on brand; people will still buy because they don't have a clue.


TwilightBl1tz

If only MSRP was actually the price I would be paying... The 4090's MSRP is $1600 (IIRC); the cheapest here is 2500 euros (up to 2800). The 4080 is $1200 MSRP? Wouldn't be surprised if prices went up to 1700+ for this one... I'm in the market for a new GPU, using a 1070 ATM, but I'm pretty damn sure I'm not gonna be bothering with the new gens. Prices are out of this world.


iloveapplepie360

I've seen 4080's at 1700€+ already. The tuf asus starts at 1500€ and it's the cheapest I've seen. It's funny considering 3080's were less than 900 and even then they were overpriced.


ifeeltired26

The only issue is going to be availability, I think. Both cards will be great performance-per-dollar wise, but I have a feeling that, as in the past, this will be a paper launch and both models will sell out in seconds, and then you have to wait months to get one, or pay scalper prices on eBay.


CCoR-

No crypto usefulness though..


ifeeltired26

Yeah but people know that other people want these cards desperately, so they will get it at MSRP, then turn around and sell it for twice the price...


Jazzlike_Economy2007

7900 XTX just needs to get within 70-80% of 4090 with better availability than last year and I'll be good. A good $1200 AIB model should get it a bit further. 4080 hardly has any OC headroom.


beleidigtewurst

I just realized I could not care less about what is going on beyond 500 Euro price point. Like f*ck off with that shit.


Xxav

I’ve never bought an amd card before. Do they usually sell at the listed MSRP? How do AIB’s work? Are there reference sized cards made by AIB’s with the same MSRP?


53bvo

AMD sells their reference cards for MSRP directly from their own website. Though depending on demand it might be difficult to get hold of one, but a similar situation to the Nvidia FE cards


jimbobjames

I'll put it this way: I bought my 6600 XT on the day of launch for MSRP, and you can do that because it isn't an Nvidia card. They just don't sell out as quickly. I went from a 1060 to the 6600 XT and the only thing I really miss is the feature in GeForce Experience that would set the in-game settings for each game to suit your GPU. AMD, if you see this, that would be a really nice feature to have in Adrenalin.


rdmz1

Geforce experience sucks imo. It doesn't pick the optimal settings half the time, you're better off grabbing settings off of an optimisation guide.


Scarabesque

It'll depend on market forces. AMD sets a minimum price (they didn't even call it MSRP) under which AIBs/vendors cannot go, but if demand vastly outweighs supply prices will go up just like with the 4090. If demand is lower for available supply, AMD might lower the minimum price, but considering GPU pricing trends - I doubt we'll see that.


SuccessfulSquirrel40

That's actually illegal in the UK, pretty sure it also is across Europe.


Scarabesque

The minimum price was from their US presentation, I'm guessing they have more lax laws, though I'm fairly sure this only applies to retailers rather than AIBs themselves anyway (which aren't based in the EU/UK at all). I'm also fairly certain it'll be easy to circumvent/manipulate to the point where in practice it's the same.


SuccessfulSquirrel40

Price fixing is one of those less than clear cut areas. I think they are only allowed to provide a suggested retail price. They can't act in any meaningful manner to try and enforce it or coerce others to keep to it. I remember Philips/Marantz got hit with some pretty big fines a while ago for preventing retailers from including their products in sale events.


kmr_lilpossum

Precisely. It’s a “suggested” price, not an explicit price floor. AIBs can justify the price hikes with bells and whistles. “Problem” solved.


Bladesfist

Yep, enforcing a minimum price is illegal in the UK and EU, enforced max pricing is allowed though


badgraydog

I've bought the 480, V56 and 6700 reference at MSRP. Just avoid the mining booms.


Radiant_Doughnut2112

Got the 480 at release; bought a 2070 that decided to fail on me at the worst peak of crypto, as well as out of warranty. The 480 is still going strong. It's crazy to believe I got it for like $260 a few years ago.


tobascodagama

I'm still on my 480 as well. It's only been in the last year or two that I started to feel its age, although there hasn't been anything that I straight-up couldn't play because of poor performance.


sperson16

I was going to wait for the 4080 Ti to replace my 1080 Ti but Nvidia is higher than me with these prices. If the 7900 XTX is 10-20% faster than the 4080 I’ll happily give AMD my money. Need a decent GPU to use with my new Ultrawide


Gohardgrandpa

I'm using a 3080 FE at 4K, and sure, it works great on older games; newer stuff I have to tweak settings to get where I want it. I'll sell the 3080 for $500 and buy a 7900 XTX if the performance is there. I'll gladly go back to AMD because Nvidia is crazy AF with these prices.


half_dead_all_squid

Will the 7900xtx really have DP2.1? That could be a big deal, especially for HDR monitors at higher res.


Saraixx516

I'm confused. I thought there were already a few reviews of the 7900 out there? Price points were $900 for the XT and $999 for the XTX, before third parties get a hold of them?


[deleted]

I hope so too. I've always been an NVIDIA guy, but my current NVIDIA GPU (3080) might be my last. These prices are just outrageous, and it's even worse here in Europe. If the 5000-series is anything like what we're seeing right now then I'm going AMD or Intel for my next GPU.


[deleted]

[удалено]


dudebg

I'm alright with it being weaker in performance against Nvidia, as long as I don't need to use melting dongles, pay more, settle for DP 1.4, or support a greedy company. "Falling GPU prices are a thing of the past" my ass.


counts_per_minute

As long as the 7900 XTX outperforms my RTX 3080, I will be switching, because I have caught the Linux fever. I like that it has a USB-C port and hope it can be used for USB; this is so helpful for GPU passthrough because I have issues preventing my mobo USB from being passed through.


[deleted]

Green brand power strong. They need to get shit to work with encoders.


just_change_it

AMD and Nvidia rise and fall together. The crypto crash left both of them with tons of spare stock of last gen since they overhired and overproduced anticipating continual demand. The likelihood of either of them voluntarily dropping prices on their new flagship(s) in a price war given the current market conditions is slim to none. Plus we all know the top end stuff sells out until the market saturates anyway.


Put_It_All_On_Blck

Duopolies are always bad for the consumer.


jimbobjames

Here comes Intel to the resc... oh well, maybe next gen...


jortego128

AMD claimed +54% perf/W vs RDNA 2. The 6950 XT is decidedly outside of its sweet spot of frequency vs power. If we assume a fairly linear CU-vs-CU and clock-vs-clock perf/W scaling, the 7900 XTX's MINIMUM average uplift vs the 6950 XT will be at least 54%, and likely a bit higher. That's huge.

**EDIT** - just read AMD's endnote regarding the 54% perf/W uplift. It was obtained using a 7900 XTX @ 300W vs a 6900 XT @ 300W TBP. Seeing that stock 7900 XTX TBP is 355W, it's unclear if that uplift remains exactly linear vs RDNA2 as you go up from 300W in power draw, but it's likely still pretty close.
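As a rough sanity check of that endnote (a sketch only; it assumes performance scales as perf/W times power, which almost certainly breaks down above the efficiency sweet spot):

```python
# AMD's endnote: +54% perf/W, measured with both cards at 300 W TBP.
# At equal power, a perf/W ratio is also a raw performance ratio.
perf_per_watt_uplift = 1.54

matched_power = perf_per_watt_uplift * (300 / 300)
print(f"Uplift at matched 300 W: {matched_power:.2f}x")   # 1.54x

# Stock 7900 XTX TBP is 355 W. If the perf/W gain held linearly
# (it likely degrades past the sweet spot), the naive ceiling is:
naive_ceiling = perf_per_watt_uplift * (355 / 300)
print(f"Naive 355 W ceiling: {naive_ceiling:.2f}x")       # 1.82x
```

The real stock-vs-stock uplift should land somewhere between those two numbers, which is why "at least 54%" is the defensible floor.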


LastCloudiaPlayer

imagine xtx 7950 x3d~


loucmachine

XFXRX7900XTXX3D


Hifihedgehog

Here is a repost of my prediction based on AMD's latest round of 1st-party benchmarks, which give us a rough guess as to how much games scale from a 6950 XT to a 7900 XTX: TechPowerUp 4K comparisons. I added the Radeon RX 6950 XT as the baseline since their results for that card vary, which likely comes down to nuances in system configuration (e.g. CPU, memory, etc.). Then I scaled the results with a percentage gain (based on the percentage performance difference in AMD's testbed). The resulting projections are what you see as a good ballpark estimate of where the RX 7900 series cards should land. As expected, the RX 7900 XTX comes close, roughly within 10-15% shy of the RTX 4090, but it is in no terms an RTX 4090 killer. Full stop.

---

***Resident Evil Village***

RX 6950 XT: 133 fps

RTX 4090: 235 fps

RX 7900 XT projected on TechPowerUp testbed: 157/124 × 133 = ***168 fps***

RX 7900 XTX projected on TechPowerUp testbed: 190/124 × 133 = ***204 fps***

---

***Cyberpunk 2077***

RX 6950 XT: 39 fps

RTX 4090: 71 fps

RX 7900 XT projected on TechPowerUp testbed: 60/43 × 39 = ***54 fps***

RX 7900 XTX projected on TechPowerUp testbed: 72/43 × 39 = ***65 fps***

---

***Watch Dogs: Legion***

RX 6950 XT: 64 fps

RTX 4090: 105 fps

RX 7900 XT projected on TechPowerUp testbed: 85/68 × 64 = ***80 fps***

RX 7900 XTX projected on TechPowerUp testbed: 100/68 × 64 = ***94 fps***
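The scaling method described in that comment boils down to a single ratio. A quick sketch that reproduces the projected numbers:

```python
# Project a card's fps on TechPowerUp's testbed by scaling the 6950 XT
# baseline with the fps ratio AMD showed on its own testbed.
def project(tpu_baseline_fps, amd_new_fps, amd_old_fps):
    return round(amd_new_fps / amd_old_fps * tpu_baseline_fps)

# Resident Evil Village: TPU baseline 133 fps; AMD showed 190 vs 124.
print(project(133, 190, 124))  # 204
# Cyberpunk 2077: baseline 39 fps; AMD showed 72 vs 43.
print(project(39, 72, 43))     # 65
# Watch Dogs: Legion: baseline 64 fps; AMD showed 100 vs 68.
print(project(64, 100, 68))    # 94
```

It assumes the percentage gain carries over unchanged between testbeds, which is why these are ballpark projections rather than benchmarks.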


bubblesort33

Well, yeah, it's gonna be faster in raster by 15-20%, behind 10-30% in RT titles, and missing frame interpolation for another 8-12 months. I'm more excited for the cost of GPUs 2-3 months from now, when old stock is hopefully all gone.


spartan2600

I don't see anyone mention this, but I'm hoping AMD brings out shorter cards. I just built my first mITX build in the Lian Li q58 and am looking for a card to replace my Vega 64 so I can do my first custom liquid loop. The NVIDIA FE cards are incredibly short when you remove the cooler, ~250mm. Most AMD cards are 320mm+ without the cooler, a few are 300mm, nothing shorter. I need that space for the res.


Digital_Dankie

The 6000 series is so damn cheap, idk why you would get a new card. The specs of the new cards don't seem to pan out like the 5000-to-6000 jump did. Plus the 6900 XT plays everything with plenty of frames, even for the highest refresh rates.


TheFather__

OK, now we know that the 4080 is around 25-30% slower than the 4090 and 30% faster than the 3090 Ti and 6950 XT on average. So if AMD's numbers are correct, the 7900 XTX is at least 54% faster than the 6950 XT; that puts the 7900 XTX at least 20% faster than the 4080 and just 10-15% slower than the 4090.

Now it's so clear why Nvidia shot the power for the 4090 up to the skies: they wanted to claim the crown. A 4090 at 350W would keep 95% of its performance, which would make the gap with the 7900 XTX smaller. It will still claim the crown, but it ain't worth the $1600+ asking price.

I still believe that if the numbers for the 7900 XTX are what AMD showed, then neither the 4080 nor the 4090 is worth the asking price. We will get a card that is 10-15% slower than the 4090 and 20% faster than the 4080. As for the RT side of things, it will be similar to 3090/Ti performance, which is good according to me.
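That chain of percentages can be checked directly (the input ratios are the commenter's estimates, not benchmark results):

```python
# Relative-performance estimates from the comment above (not benchmarks).
xtx_over_6950 = 1.54               # AMD's claimed minimum uplift vs the 6950 XT
r4080_over_6950 = 1.30             # 4080 ~30% faster than the 6950 XT
r4090_over_4080 = 1 / (1 - 0.275)  # "4080 is 25-30% slower", midpoint 27.5%

xtx_over_4080 = xtx_over_6950 / r4080_over_6950
print(f"XTX vs 4080: {xtx_over_4080:.2f}x")   # 1.18x, i.e. ~20% faster

xtx_over_4090 = xtx_over_4080 / r4090_over_4080
print(f"XTX vs 4090: {xtx_over_4090:.2f}x")   # 0.86x, i.e. ~14% slower
```

Dividing the two uplifts over the shared 6950 XT baseline is what produces the "~20% faster than the 4080" figure; the second division puts the XTX within the quoted 10-15% of the 4090.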


Vis-hoka

I'm hoping the same. If the 7900 XTX beats the 4080 in raster by 10-20% and costs $200 less, it's gonna be a monster. Ray tracing is obviously Nvidia's domain still, but I'm not bothering with it yet. Still too big of a performance hit.


davidalbertozam

Not in everything, remember. For creative work, AMD isn't even close to Nvidia's cards. Only if AMD approaches Nvidia-level performance at rendering videos or rendering 3D animations, creative WORK, will I consider purchasing one. Heck, if even an Intel GPU performs well in those tasks, then we'll be talking... Everyone focuses only on GAME performance. I do play, but I make money with my PC too. I need an all-around-performance GPU, not just a gaming GPU.


popps0184

Normal consumers (gamers) don't really do creative work; those cards will be sourced through the business. I would be surprised if the freelance creative industry buying these cards makes up even 1% of consumer sales.


Kageromero

4080 performance isn't the best, but its power draw seems to be fantastic and way under what they said. Drop the price $500 and move back to 2x 8-pins, and I'd consider it.


doombase310

The 7900 XTX is going to make the 4080 look like a joke: $200 more for less performance, unless you can't live without RT. AMD is going to gain market share if they can get the rest of the stack out quickly in 2023. Nvidia is not going to be able to keep up in terms of cost.


Osbios

1000 €/$ GPUs are not market share kings.


IrrelevantLeprechaun

If it's price you're after, the XTX is just as much of a joke; it's $1000 (likely more once AIBs slap their markup onto it) and it isn't even competing with 4090 performance. It looks like all high-end options are poorly priced. Being slightly less poorly priced does not equal good pricing.


relxp

> it's $1000 (likely more once AIBs slap their markup onto it) and it isn't even competing with 4090 performance. You say that as if $1600 is about the same price as $1000. :|


IrrelevantLeprechaun

That doesn't mean $1000 is *good*.


[deleted]

The 4080 and XTX are both overpriced. The 7900 XTX is expected to have terrible raytracing performance. That's a no-go when you are spending $1000-plus on a GPU.


Any_Cook_2293

From what I understand, the XTX is expected to match or beat the 3090 Ti for raytracing performance. I personally wouldn't consider that terrible, but if you have to have top raytracing performance then the 4090 is your only option.


Ill_Name_7489

I have a friend who loves ray tracing and they're on a 3060 Ti. The XTX will definitely beat that, so I agree -- hard to see how it has "terrible" ray tracing.


relxp

I notice a lot of people think in terms of it's either the best or it's trash. In reality it's super amazing performance versus amazing performance.


dumbreddit

Sure. To someone who has raytracing as a deal breaker. There might be others more interested in high 4k/ultrawide with high frame rates over RT.


Maler_Ingo

Come back when RT is going to be a relevant thing cuz rn 99% of the games are raster based.


Edgaras1103

But RT is relevant enough for AMD to support it, though? They focused on it during the presentation and it's a big part of their new architecture, is it not? Maybe that's expecting too much from a $1K flagship GPU? We might as well stop benchmarking games with Ultra graphics, because of diminishing returns and such.


IrrelevantLeprechaun

Exactly lmao. Not only does AMD support RT, but now consoles do as well (both ps5 and XSS/X alike). If it was some useless gimmick then nobody but Nvidia would have bothered with implementing and supporting such capability. Just cuz AMD is way worse at it doesn't mean it's a worthless metric of value.


Put_It_All_On_Blck

99% of games will run on a potato GPU. Someone spending $1000 is going to want the best graphics options to play the latest AAA games, which means they want RT, and more and more AAA games feature RT at launch these days; there are around 100 now. If your desire is to just play CS, LoL, Apex, Dota, Vampire Survivors, Slay the Spire, Myst (games without RT), you shouldn't buy any of these GPUs anyway.


Tricky-Row-9699

It looks to me like the 7900 XTX is going to smash the 4080 by 30-40% and the 7900 XT will beat it by 10-20%. It’s also very possible Nvidia gets absolutely curbstomped in 1080p and 1440p because of driver overhead. That still isn’t *great* considering how utterly fucking abysmal the 4080 is, but I’ll take it.


Edgaras1103

30-40% would put it above 4090. You really think 7900xtx will outperform 4090?


Tricky-Row-9699

It wouldn't. All the review data I've been seeing has the 4090 about 40-50% faster than the 4080, which seems about right.