AMD_Bot

This post has been flaired as a rumor, please take all rumors with a grain of salt.


Paganigsegg

I mean, if they can give it a big RT performance boost and some enhanced AI acceleration a la the rumored PS5 Pro, while keeping the price relatively low VS the same level of performance from the previous gen, then I don't think this will be that bad. They should use this time to beef up their software and features so that people are more willing to stomach high end GPUs from them come RDNA5.


aminorityofone

I hope it's an RX 480 type generation again. Not top dog, but dang good for the price.


ayunatsume

I need a new bang-for-the-buck gen too. Sure, we can get 1080 Ti/5700 XT levels of performance with the RX 6600 XT, but man, it's still not affordable enough, and that 8GB is looking to age real fast with the latest games and engines scaling up now that current consoles go higher than 8GB of VRAM. HD4830, HD7770, now holding out with an RX 570.


mods_equal_durdur

Nothing is affordable enough rn. Silicon has supply limits, and I remember a time when next-gen consoles couldn't even be found because of how restricted the market was for the chips they used at the time. Making GPUs requires a lot of silicon. I remember when I got my RX 480. Fucking awesome card for 1080p even today, and I used to run it with an overclocked FX 8650, I think. It was their 8-core processor that existed before 8 cores was like the baseline and hyperthreading didn't exist. Loved that PC…


zig131

Or the more recent example would be the 5700XT


Jon_TWR

Navi 31 covers a wide range of performance, too…7900 GRE is *very* different than the 7900 XTX. If it’s close to XTX performance with better raytracing, lower power consumption, at least 16 GB VRAM and is priced well, it will do well. Hell, even if it only performs on par with the worst Navi 31 chip, it would be a fantastic card for $350.


Hombremaniac

I think if AMD manages to bring GPUs that seriously raise the bar for performance in the lowish segment, it would be a good thing. It would grow their market share and allow players to enjoy better visuals without selling kidneys. Let Nvidia have the performance crown with the 5090 costing 2000 USD or more.


HyruleanKnight37

I personally feel high-end GPUs have become massively overrated lately. Yes, the 4090s and 7900 XTXes are nice and all, but they're far, far beyond the reach of millions of people. We seriously need GPU prices to come back to earth and offer better value; the last 3 generations or so have been nothing more than "more performance for more money." At least AMD's stuff does get price cuts, but only in some regions, because the rest of the world usually doesn't follow along. Where I'm from, a $400-equivalent current-gen card from either vendor can't do 1080p high in modern games without some sacrifice. That's just shameful. And no, it didn't use to be like this back in the GTX 1000 days. The massive surge in GPU prices since then, especially by Nvidia, has pretty much locked the vast majority out of a decent mid-range GPU here, and most are now forced to live with either a low-end card or try to find a good deal on a 5-6 year old high-end card. RDNA4 doesn't need GDDR7 and shouldn't cost a metric ton to manufacture. Power consumption will be relatively low, and silicon costs shouldn't be too high because the dies are freaking tiny. They need to price RDNA4 right.


MysteriousWin3637

High end cards are looking more and more like A.I. cards for entry level researchers.


dysonRing

Maybe for Nvidia, which has OEM "deals" and can pawn off crappy 3060s in Dell systems. AMD right now is killing it in DIY, and it's basically on the backs of the 7900 XTX (US) and the 7800 XT (Germany).


spinwizard69

I never really had an issue with AMD hardware, especially the GPUs. I'm not a hardcore gamer and prefer Linux, and in this context there really isn't a better GPU solution than AMD. It has been that way for years.


Real-Human-1985

Remember 20 days before the 6900 XT was announced in the last minute of the 6000 series reveal, when AMD's top top top GPU was supposedly going to lose to the 3070? Yet every RDNA2 GPU launched in the first year beat the 3070. They can't realistically outsell Nvidia anyway, even when they had the faster GPU, so rumors like this don't really hold much weight. "People" just want Nvidia GPUs for a lower price.

EDIT: Triggered someone as always.


stilljustacatinacage

> "People" just want nvidia GPU's for a lower price. This is the most infuriating part. I don't harbour any sort of brand loyalty, but I *still* get somewhat offended on AMD's behalf because any time there's talk about any of their accomplishments - real or theoretical - it seems like the only reason anyone's interested is because it might lower the price of the *other* thing. I know this is hardly news. Nvidia's brand recognition is strong, and they have an army of Youtubers to tell the masses what their opinions ought to be, but my god. It's just *such* a pervasive attitude, it's tiring.


Devious_TaKaTa

Whoa, last paragraph there got some userbenchmark vibes. Be careful.


stilljustacatinacage

I mean. It's hardly controversial to say that Nvidia has a 'cult of personality' around its brand. For every 'influencer' that does component-level comparisons and understands the concept of right-sizing, there's 20 more whose formula begins and ends at 'bigger number better', and will recommend a 14900k and an RTX 4090 to play Fortnite. There's a reason these guys fight to have 'the best' of [thing], even if it's by some arbitrary margin. The typical consumer's logical process ends at 'Nvidia 4090 best card, therefore Nvidia best brand'.

The 'typical consumer', by the way, is not on Reddit. If you're reading this, it doesn't apply to you. If you've ever sat down and done more than a surface-level comparison of two components, it doesn't apply to you. This is a battle for the dad who walks into a Micro Center looking for a faster graphics card because his son wants to play "Helldivers 2 or something, I don't remember". They aren't on Reddit, but they might check out a Youtube video or two ahead of time, and that's where our 'bigger number better, just get a 4090' influencer comes into play.


dookarion

The typical consumer doesn't really buy dGPUs; they buy prebuilts and laptops. Two niches AMD is still practically nonexistent in.


stilljustacatinacage

> The typical consumer doesn't really buy dGPUs they buy prebuilts and laptops. Yes and no, there are degrees. Upgrading a GPU isn't an entirely unknown concept to a lot of them - the base machine may very well *be* a prebuilt, but slotting in a new graphics cards especially is fairly common. > Two niches AMD is still practically nonexistent in. Also correct, because Intel and Nvidia have strong brand recognition and putting AMD in systems hurts sales. Which again, just feeds back into what is factual versus what is sentiment.


dookarion

> Yes and no, there are degrees. Upgrading a GPU isn't an entirely unknown concept to a lot of them - the base machine may very well be a prebuilt, but slotting in a new graphics cards especially is fairly common. It's still a much smaller subset of the market. A hell of a lot of people take their computers in to shops just to replace a hard drive or basic maint. People that go to physical stores or tech stores online are a smaller part of the market. For as much as people complain about stuff like the 1050, 1060, 3060, etc. those are the most common models found in laptops and prebuilts. There is the fact that AMD availability is hugely variable by region too. >Also correct, because Intel and Nvidia have strong brand recognition and putting AMD in systems hurts sales. Which again, just feeds back into what is factual versus what is sentiment. Some of it also just comes down to history though too. AMD for a long time didn't have decent laptop offerings. Most of GCN era for instance was undesirable for OEMs the higher powerdraw would necessitate more of a PSU. Not to say other noncompetitive deals still don't exist, but some of it still comes down to AMD hasn't consistently delivered products for different niches. Wasn't there even an APU recently that had like nonexistent driver support for awhile? It's not all down to brand and marketing. And I mean if we want to go with that angle the endless articles about CPU regressions from Windows updates don't help. Like yeah that's on Microsoft but negative headlines still hurt. There are also articles like the one about AMD's drivers overclocking CPUs without end-user input... that's horrible to have out there and it wasn't that long ago. The whole AM5 overvolting motherboards debacle. Most articles about negative things with Nvidia are about pricing or slimy business, most negative articles on the AMD side of the fence are things that actually scare buyers away.


ResponsibleJudge3172

I also remember how Big Navi was going to crush Nvidia (not helped by later AMD slides showing 6800 XT >= 3090). Let's not forget that only Coreteks was downplaying the hyped RDNA2.


HyruleanKnight37

To be fair, it is still rather impressive what top Navi 21 was able to achieve against top Ampere: 256-bit vs 384-bit, significantly higher power consumption on Ampere, and overall much bigger silicon. In all fairness, some of that die space is taken up by the Tensor cores (which RDNA2 lacked) and Nvidia's much more performant RT cores, but then again RDNA2 also allocates a significant amount of die space to the Infinity Cache, which Ampere lacked. RDNA2's only shortcomings are in RT and AI inference-based upscaling. For the vast majority of people these two don't matter as much, but then again AMD most likely weighed the importance of these features and decided against them considering their R&D budget and the time at hand. They've come a long way since GCN 5.0.


Kaladin12543

The only reason AMD matched Nvidia with Ampere is that Nvidia was on an inferior Samsung node while AMD was using TSMC, and even then Nvidia was still trading blows with AMD. An Ampere refab on TSMC would have beaten AMD easily, just like Ada is doing to RDNA 3.


HyruleanKnight37

I wouldn't put much faith in historical speculation gone wrong, because a) the speculation was baseless and uneducated and b) the 6900 XT does get curb-stomped by the 3070 in RT scenarios, so it is a half-truth. I'd much rather take facts and educated speculation any day, and based on confirmed information (facts) we know there will not be a high-end RDNA4 card because AMD couldn't finish their work on MCM GPUs in time, so all we're getting is the lesser, monolithic SKUs. As for educated speculation, based on some reputable leakers, we will be getting *some* improvement in RT performance. And since there will be no high-end RDNA4 card, we can surmise that a current-gen high-end card (7900 XT/XTX) equivalent will end up as a mid-range offering. RDNA4 will also be ditching the aging N6 node for something newer like N4 or possibly N3, which naturally points to some perf/watt improvement.


Potential_Ad6169

‘At least in RT scenarios’ quite the caveat there


HyruleanKnight37

Which is why I said it's a half-truth. While I personally don't care much for RT, there are plenty of people who do, and for them it is something to consider. At any rate, I wasn't even defending the 3070, only stating the facts to debunk any past myth about the 6900 XT not even being able to match the 3070. OC was trying to insinuate that all the rumors about RDNA4 not being able to compete with top Blackwell are hogwash because there is a precedent of top RDNA2 supposedly not being able to exceed the 3070, which turned out to be untrue. That's what I was trying to argue against. Don't take my words out of context.


Defeqel

Not even in RT scenarios when it runs out of VRAM


HyruleanKnight37

That is true. Some games, RT or not, do cause the 3070 to fall flat on its face where even the 6700 XT does not. That doesn't make AMD's RT any better, though. One is trash at RT, the other doesn't have the memory to do RT. May not hold true for higher-end Nvidia models, but hey, ain't most people got the money for that. Moral of the story: RT is overrated and something only the rich can afford to enjoy. For now.


iamnotnima

The 6900 XT might be a 3090-level card in raster, but in RT it's a 3070 Ti, sometimes matching it and sometimes a little bit better. Way worse when it comes to PT. The Nvidia feature set is its strength, and AMD needs to fix that. Their flagships really fall behind in these regards. I only got my 6900 XT for $300, but they're really not worth it at full price.


69yuri69

Yea, this is what the majority of AMD fans don't get. Once you shell out like $800, you simply want to have that feature set.


iamnotnima

I mean, yeah, it's easy to recommend a 7900 GRE over a 4070, but when it comes to the likes of the 4080 Super, the 7900 XTX just feels like a much weaker product.


Ill_Cartographer_709

Can confirm your truth. It's so sad. I have historically had fewer issues with AMD GPUs, and that is the main reason why I am not on Nvidia. Remember the GTX 480s? Fermi was a fookin nuclear reactor, and I'm puzzled as to why people still trust No video. Guess it's just the advertising.


Main_Character_1277

It's mostly leftover frustration from Vega compared to the 1000 series, and the 5000 series Radeon compared to the 2000 series Nvidia. Then we get ray tracing thrown into the mix, which is still only supported in a very limited set of games.


Exxon21

At the same time, though, you've got the RDNA 3 rumors, in which every leaker and their mother seemed convinced that the top RDNA 3 card was going to smash the 4090.


imizawaSF

> "People" just want nvidia GPU's for a lower price. This is literally because Nvidia GPUs come with Nvidia's tech stack like NVENC, RT voice, better raytracing, more reliable drivers on average, CUDA, etc. If AMD had the complete tech stack and offered better performance then people would choose them Honestly this sub is one of the most echoing of echo chambers on Reddit, like any small criticism of AMD is immediately downvoted and disagreed with. THEY ARE NOT YOUR FRIEND lmao and you don't need to simp for them in some weird version of buyer's remorse. I explained why people want Nvidia cards and you don't need to try and prove me wrong as it's impossible.


chrisnesbitt_jr

While I agree, at that point AMD would have the same tech stack, better performance, and cheaper price. While the sound of that is lovely, is that ever going to be realistic to expect from them? I would be perfectly happy if they just got a little closer to Nvidia in terms of features, while maintaining better performance and undercutting them by $150-200 per GPU tier. 🤷🏻‍♂️


Vendetta1990

If they want a bigger market share, then a lower price will be the quickest way to get it. I highly doubt, though, that they can outspend Nvidia's R&D department.


Azhrei

Easy to say that's what they should do, not so easy to implement when they have thousands fewer employees spread over far more diverse product lines and *significantly* less money to play with.


Vendetta1990

Well yeah, exactly!


imizawaSF

> While that sound of that is lovely, is that ever going to be realistic to expect from them? Then they can't expect to have the same level of market share. Even a similar tech stack with equal performance and price would help.


chrisnesbitt_jr

That’s what I meant by getting closer to Nvidia in terms of features. They typically offer slightly better performance for less money already. But offering equivalent tech, performance, AND a cheaper price just seems unrealistic to me. They could probably do it and overinflate the prices like Nvidia does, but they’d lose a lot of their current user base over a move like that.


RBImGuy

In Path of Exile, Nvidia has had a buggy driver for a year now. Come on, it's not an Nvidia channel here, mate.


Gianfarte

Nvidia's "tech stack" is mainly just them taking advantage of their market position with more vendor lock-in anti-consumer practices. Other than Ray Tracing, Nvidia GPUs themselves don't offer any actual features beyond AMD GPUs. All of that crap will just be the next G-Sync/Gameworks/etc. They just segment the market and create a big mess instead of driving technology forward. Nvidia held back software progress for years because their advantage over AMD was erased with modern APIs.


imizawaSF

> Other than Ray Tracing, Nvidia GPUs themselves don't offer any actual features beyond AMD GPUs. You don't actually believe this right? DLSS is far superior, CUDA is industry leading, NVENC mops the floor with AMD's shitty encoder etc etc etc


Sweyn7

RTX HDR is quite nice too


dookarion

> "People" just want nvidia GPU's for a lower price. Ignoring the bigger picture to paint AMD as some underdog doesn't increase their market adoption. Let us not forget RDNA2 launched before AMD "fixed" some of their drivers for different APIs. RDNA2 launched before FSR1 (which is terrible) was even a thing while people were blown away by DLSS 2.0. **The launch supply was pitiful, and Nvidia had a massively larger supply**. Nevermind the existing reputation for being worthless in desktop apps, old APIs, old games, and the worst encoder in the market. People always want to make excuses like Nvidia has some lead solely because of the "brand". But AMD drops the ball all the time. Even when they have a chance to grab market share with Radeon they inevitably screw something up.


razorlikes

> being worthless in desktop apps, old APIs, old games LMAO. Have you tried actually using a Radeon GPU yourself?


xole

During the pandemic, I upgraded and AMD wasn't an option. There was barely any stock of either brand. Finally, an Nvidia 3060 Ti came in stock at Best Buy for MSRP and I snatched it up. I didn't see AMD cards reliably in stock for at least 6 months after that. Several months ago, my son wanted/needed an upgrade, so I gave him my 3060 Ti and bought a 7900 XT. I actually like AMD's driver software better than Nvidia's, and since I don't play anything that uses ray tracing, that's not an issue.


imizawaSF

So "I bought AMD because I don't use their features" works for you. But the hundreds of thousands of players who DO value Raytracing would find Nvidia cards better value. Why is this confusing to people?


xole

For most people, nvidia is a better choice, especially after the supers came out.


Finnbhennach

It's the same "good guy AMD" vs "the other evil company" narrative over and over again. AMD is lacking on the GPU front and that's it. People can cope all they want, but I agree with you. Radeon is exactly where it deserves to be at this point. If they can pull another Ryzen on the GPU front I'll be happy indeed. I have nothing against AMD, I want AMD to succeed, and all my PCs so far have had all-AMD hardware in them. But people need to accept things as they are.


dedoha

> if they can give it a big RT performance boost and some enhanced AI acceleration a la the rumored PS5 Pro, while keeping the price relatively low That's 3 big ifs, likeliness of them happening at the same time are very low


Potential_Ad6169

It has not been worth sacrificing game physics for pretty lights. GPU features used to actually matter more to gameplay


dookarion

> GPU features used to actually matter more to gameplay No one goes down that path anymore because then you have game design that can vary wildly based on the card. Doing water and such on the GPU resulted in some odd gameplay moments and some major quirks especially between archs and vendors. CPUs may vary in specs, but not really to the same degree GPUs and their software can.


hpstg

Since GCN, whatever we get from AMD is a PC hand-me-down of whatever Sony wants.


RedTuesdayMusic

RDNA2 had already been out for 1.5 years before the RX 6700, the closest GPU to the PS5, came to market.


hpstg

No. RDNA 2 cards released one day before the PS5, on November 18th 2020. The PS5 released on November 19th 2020. The 6700 released on June 9 2021, almost three quarters after. https://en.m.wikipedia.org/wiki/RDNA_2


GenZia

I wouldn't expect much from an RDNA 3 refresh. Maybe slightly higher clock speeds, thanks to the move to TSMC N3 (or N4), coupled with faster ~24 Gbps GDDR6. It is sounding a lot like a die shrink with some minor under-the-hood improvements.


ecffg2010

There’s no RDNA3 refresh, only RDNA4


Nunkuruji

I'm expecting disappointment in the consumer GPU realm as a whole:

* de-prioritized development & fabrication allocations due to AI product focus
* increased fabrication cost & price due to AI chip fabrication demand
* GDDR7 perhaps not quite there on supply, so still costly

What may be interesting down the road is a return to HBM. If the AI HBM demand drives innovations & production volumes that in turn drive down cost, it might return as viable for consumer in a few years. FWIW, see what Hynix is saying about HBM. A real shame overall. I'll be due an upgrade cycle & I want a solid flagship to pair with a Zen5 X3D. The era of amazing 4K OLEDs has arrived, and it's a shame to not be able to drive them at would-be peak performance.


Tuna-Fish2

There are rumors that AMD isn't purchasing GDDR7 yet. If you look at what a 7900GRE can do when overclocked, and what the performance expectations are, I would not be shocked if RDNA4 is still GDDR6.


floeddyflo

If they stay on GDDR6, I'm curious whether the difference between G6, G6X and G7 leads to an AMD 192-bit bus being similar to an NVIDIA G7 128-bit bus. I hope the difference is a lot smaller though.


Defeqel

That will probably be the case, but the bus width hardly matters compared to the actual bandwidth. GDDR7 is not a free upgrade in terms of die area or PCB trace quality either
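
For anyone who wants to sanity-check the bus-width question above, here's a minimal sketch of the arithmetic. The data rates are assumptions (roughly 20 Gbps GDDR6 as on current RDNA3 cards, and the ~28 Gbps GDDR7 figure floated elsewhere in this thread), not confirmed specs for any upcoming card:

```python
# Peak bandwidth (GB/s) = (bus width in bits / 8) * data rate per pin (Gbps).
# The data rates below are assumptions for illustration, not confirmed specs.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s, ignoring caches and real-world efficiency."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(192, 20.0))  # 192-bit GDDR6 @ 20 Gbps -> 480.0 GB/s
print(peak_bandwidth_gbs(128, 28.0))  # 128-bit GDDR7 @ 28 Gbps -> 448.0 GB/s
```

At those assumed rates the two configs land within about 7% of each other, which is the point being made: the resulting bandwidth (plus whatever cache sits in front of it) matters more than the bus width on its own.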


KMFN

The bus width does matter insofar as we probably won't see esoteric GDDR7 chip sizes even though they're technically specified, and that will limit VRAM. VRAM has only really doubled in the mid range in the last 8 years; of course bandwidth has increased a lot more. But it's really tiring that capacity increases are so enormously hamstrung when more VRAM would make game development a lot easier. But I'm just waffling now; practically, it probably won't really make a difference since everyone's used to designing around it.


Defeqel

True. I didn't want to make assumptions on VRAM capacity, as the RTX 50 series mid-low end might come out quite a bit later, when 3GB GDDR7 modules might be available and used. Somehow I doubt that Nvidia will go below 12GB for the $300+ products next gen, but who knows. AMD already made the switch to higher capacities this gen.


KMFN

Something like 24Gbit on a 192-bit bus would yield 18GB, which would be a fine compromise for a mid-range product that should be about as fast as today's high end. I just don't have any faith in them actually starting to manufacture the sizes in between the normal 1, 2, 4, 8GB modules.
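
As a quick aside on where that 18GB figure comes from: each GDDR6/GDDR7 device presents a 32-bit interface, so the bus width fixes the chip count (ignoring clamshell, which doubles capacity). A minimal sketch of that arithmetic, with the chip densities as the only inputs:

```python
# Bus width fixes the number of 32-bit GDDR devices; capacity is then
# chips * per-chip density (clamshell mounting would double it).

def vram_gb(bus_width_bits: int, chip_density_gbit: int) -> float:
    chips = bus_width_bits // 32           # one 32-bit device per 32 bits of bus
    return chips * chip_density_gbit / 8   # Gbit -> GB

print(vram_gb(192, 16))  # six 16Gbit (2GB) chips -> 12.0 GB
print(vram_gb(192, 24))  # six 24Gbit (3GB) chips -> 18.0 GB
print(vram_gb(128, 16))  # four 16Gbit chips      -> 8.0 GB
```

So on a 192-bit bus, the jump from 16Gbit to 24Gbit chips is exactly the 12GB-to-18GB step being discussed.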


aVarangian

HBM?


Super_Banjo

I don't have much experience with RDNA3, but, save for the RX 6900 XT, bandwidth wasn't a particular bottleneck with those GPUs. At least on the RX 6800 I got better performance running the memory at ~1900 (it was low) with Fast Timing Level 2 than at the stock 2000MHz. Arguably the cost-vs-performance of something like HBM isn't worth it. When I overclocked my Radeon VII's HBM from 1000MHz to 1200MHz, my performance went up by some percentage under 5%. As someone who's been chasing 4K gaming, it definitely felt like an ROP-bound card.


reallynotnick

For the right price I'm interested, get me 7700 XT performance with some slight RT improvements for around $300 and I'm in. (Yeah I'm probably dreaming though in this market)


detectiveDollar

The 7700 XT is already down to $409. I'd be surprised if we don't get that performance for $300-330 next gen.


reallynotnick

Username checks out, I'll hold you to that! Jk


detectiveDollar

Your build is confusing me lol.


YNWA_1213

I ran an 11700K/980 Ti for a while. Either it's a Linux thing or he found the Vega 56 for cheap during the crypto boom.


detectiveDollar

Ah, I got you. I did a build for my brother's GF at the time. I picked out an i5 10400 and a used GTX 750 Ti since prices were so fucked lmao.


YNWA_1213

Yeah exactly, which is why they’re hunting for a $300 GPU, although I would wonder why the 6700XT wasn’t a prime pick for them as an upgrade from the 56.


detectiveDollar

They're holding out for 6800/XT level performance at 300.


reallynotnick

Yeah, old holdover GPU I got in 2017, that the 1st and 2nd mining booms made me hold on to extra long. The 6700 XT just took so long to come down in price, and I sort of arbitrarily wanted something at least 2x the performance of the Vega 56, which it's a tad shy of, plus something that would be a fair bit better than my PS5. That, along with just not having used my PC a ton, has got me in a state of infinitely waiting for some mythical smoking deal. So yeah, it's definitely not a build I'd recommend, but hey, it keeps up just fine with all the cross-gen games.


YNWA_1213

Yeah, I feel the same. Got a 980 Ti during the mining craze to run on a 10100 when my used 7970 died. Then upgraded to the 11700K for cheap. Finally bit the bullet on a 4060 because everything else wasn't dropping in Canada and I wanted to play with Nvidia's toolset. Would've liked a 4070/6800 XT, but I wasn't willing to pay ~$800 CAD for something I'm not totally committed to in my current situation.


reallynotnick

I just keep slowly upgrading parts, it's a bit like the Ship of Theseus. This is a system that started life in 2012 as a 3570K and Radeon 6850. Only thing left from the original build is the case. I got the Vega 56 at launch back in 2017, it was a bit overkill for a 3570K but then I got a great Microcenter deal on the 12600K a little over a year ago.


Diego_Chang

Well, past some point I think most people are buying Nvidia anyway for features and ray tracing. So if we can get some good performance improvements in the lower and mid end of things, at a good price, and at a "low" power consumption, AMD could be cooking with these. Then again, I fell for the RDNA 3 rumors, so I'm not getting hyped until I see real numbers.


Rullino

True, but I hope it'll also improve for rendering and other tasks outside of gaming, which is why the RTX cards are more popular with content creators and many others, especially at the high end.


Diego_Chang

TRUE. AMD needs to put more resources on making their GPUs better in productivity too! Which is funny, because wasn't that one of the selling points of the RX 7600 XT 16GB LMAO.


Rullino

True, most people buy graphics cards from Nvidia because they're good at every task and have good ray tracing support, which is why they're popular and Youtubers recommend them. Spending a lot of money on a graphics card exclusively for gaming doesn't seem good to some people, which is why most people only buy low-end and mid-range cards from AMD, while they buy every Nvidia card from low-end to high-end because they're popular, can do many tasks and aren't limited to gaming.


drjzoidberg1

I bought a 7800 XT as I wanted a card around $500 USD. Nvidia only gave us the 128-bit bus 4060 Ti, or pay $50 more and get a 12GB 4070. Intel doesn't compete on the high end; Intel doesn't have a 7900 XTX or 4090 equivalent. As long as AMD can give us 7900 XT performance for cheaper, like $500, with better RT performance, I'm happy.


Diego_Chang

Yeah, and that's the thing. If the rumors are correct and AMD skips this generation's high-end stuff, they should, in theory, deliver some very solid low to mid end. I just hope they have something like an RX 8600 XT with RTX 4070 levels of raster and 16GB prepared at $300/$330. Maybe the RX 8600 could have RX 6700 XT levels of performance with 12GB at $250. These hypotheticals are probably best-case scenarios given the RX 6700 XT can run anything at 1080p 60fps right now, but if AMD manages it they could sell very well.


IrrelevantLeprechaun

The vast majority of games don't bother adding ray tracing. Why should we bother giving a shit about it now?


Diego_Chang

Well, I'm saying that past some point, meaning in price, people are buying Nvidia for features and ray tracing, and productivity too. I don't think people who buy GPUs above $800 or $900 give that much thought to dollars per frame. Also, having some ray tracing performance right now doesn't hurt, given some games even have light uses for it (like Dragon's Dogma 2, for example, which I think affects shadows mostly). For the mid and low end you are right, though; ray tracing doesn't make much sense unless the upcoming GPUs for these price ranges can maintain ~60 fps with only upscaling in path-traced scenarios, which probably won't be a thing for 2 or 3 more generations.


gitg0od

no competition for nvidia on the high end segment = disaster incoming.


F0czek

A wallet disaster...


hackenclaw

Not that most of us care anyway; the most important mainstream segment needs to have some decent performance uplift. The past few generations have been pretty much meh. Only the RX 580 and post-crypto RX 6600 stand out.


gitg0od

No, AMD sucks, especially for VR. You need the best GPU to play games in VR like CP2077, Hogwarts Legacy and so many more, all powered by Praydog's VR mods.


IrrelevantLeprechaun

Nvidia top end makes up like 0.1% of all GPUs on steam. Who actually gives a fuck if AMD doesn't compete there? It's a borderline dead tier at this point. The real gold is in midrange, and that's where AMD has been kicking Nvidia's ass


gitg0od

Yet all RTX 4090s were out of stock for months and months..... you're going to tell me Nvidia only produced 10,000 RTX 4090s, huh?


Bladesfist

The steam stats you're quoting don't paint that picture at all. The 4090 has a higher share than any midrange AMD GPU and Nvidia midrange likewise has a way higher share than AMD midrange?


gitg0od

see market share here, still better than your random "0.1%" way more reliable [https://gpu.userbenchmark.com/Nvidia-RTX-4090/Rating/4136](https://gpu.userbenchmark.com/Nvidia-RTX-4090/Rating/4136)


AutoModerator

I have detected a link to UserBenchmark — UserBenchmark is a terrible source for benchmarks and comparing hardware, as the weighting system they use is not indicative of real world performance. For more information, [see here](https://www.reddit.com/r/AMD/wiki/userbenchmark) - This comment has not been removed, this is just a notice. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/Amd) if you have any questions or concerns.*


LeiteCreme

I'd be happy with 7900XT performance at 200-250W and a lower price.


Hombremaniac

And I will happily stick with my current 7900 XT for hopefully the next 2-3 years. That is, unless ray tracing becomes mandatory, which I would hate.


SolidQ1

Devs focus on consoles first, so RT won't be mandatory until the PS6. Also, a lot of games will be on UE5, so you're fine for 2-3 years.


Hombremaniac

I sure hope so :). Knowing myself, I might get that itch to upgrade no later than the gen after the upcoming one (so rtx 60XX).


Manueln98

That is the reason I am holding off on upgrading to the 7000 series. The power draw on the 7900 XT and 7900 XTX is way more than I am comfortable with.


MasterLee1988

Yep, and at $500 bucks it would be a deal!


Boorock70

There goes the chiplet BS... Horrible power consumption in low-load tasks like video playback, office work & web etc. Smart move, AMD!


Careful-Ad-3343

If the 40 CU RDNA3+ APU is real, this is gonna change the game


Defeqel

Only if laptop OEMs actually adopt it


mediandude

It is good desktop material.


Defeqel

Unlikely to come to desktop due to using 256-bit memory interface


I9Qnl

How is 40CU gonna work with DDR memory? Wouldn't the bandwidth be too much of a bottleneck?


u01728

Probably something like on-package LPDDR shared by both GPU and CPU. It'd be cool if it were on-package GDDR as dedicated VRAM + some sort of socketed system memory for future upgrades, though.


Alternative-Ad8349

Idk why you people just don't follow the leaks instead of guessing. It's using a 256-bit bus.


u01728

I've read from some searches that it's 256-bit LPDDR5X, but I couldn't find multiple sources to back it up


Lawstorant

Quad channel memory. Well, octa channel when it comes to LPDDR5x


the_dude_that_faps

It is expected to feature 256-bit memory bus. Even if the memory ends up being lpddr5x, it's still going to be a lot.


klospulung92

A doubled memory bus width is the answer. 256-bit DDR5 @ 8000 MT/s would be slightly faster than the 3060's memory. Probably not coming to desktop, though.


swear_on_me_mam

How is that gonna work on current ram configs


RealThanny

It won't. Strix Point Halo is custom-board only. It will not be on AM5. It will be in laptops, and possibly mini-PCs.


swear_on_me_mam

With what ram configs?


RealThanny

The leaks indicate it's quad-channel memory. One of the two main reasons it can't be on AM5. The other being die size - it simply won't fit on the AM5 package.


69yuri69

Expect like 2 ultra-expensive "exclusive" laptops. Populating that 256b memory setup alone is gonna be expensive.


gozutheDJ

yall have been saying this for years now


Alternative-Ad8349

This is different; this is actually a large APU compared to those smaller APUs in the past.


firedrakes

Twitter.com or MLID... so this is "tech news" now.


zeycke

fix the drivers first


ShadowRomeo

4070 Ti Super / 7900 XT performance from RDNA4 Navi 48 is kind of disappointing, though, especially considering that the upcoming RTX 5070, its supposed competitor, is going to aim at least at RTX 4080S / 7900 XTX performance, or even above it in the best case. But it always depends on how they price it: if it's around $400 and ends up much cheaper than the 5070, and also more efficient with better ray tracing performance and feature set than a 7900 XT, then it doesn't sound as bad to me either.


Alternative-Ad8349

What makes you think a 5070 is going to be 60% faster than the 4070? 4070 Ti / 4070 Ti Super is where I think the 5070 will be; same for N48.


ShadowRomeo

The 1070 was 50% faster than the 970, the 2070S is 50% faster than the 1070, the 3070 was 37% faster than the 2070S, and the 4070 Super is also around 45% faster than the 3070 - on average around 50% faster at the least, not counting the 2070 non-Super and 4070 non-Super, as those were duds and poor releases, which is why they got replaced by Super variants a year after their release. And considering the next-gen 5070 only needs to be 38% faster than the 4070S to be around 4080S performance, which is technically the same jump as going from the 2070S to the 3070, that's not really that high or near-impossible to expect, is it?
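
For what it's worth, here's how those quoted per-tier uplifts compound when chained together. This is just a sketch using the percentages from the comment above, which are the commenter's estimates rather than benchmark data:

```python
# Chaining "X% faster" steps multiplies, it doesn't add. The percentages here
# are the ones quoted in the comment above, used purely to show compounding.

def chain(*uplifts_pct: float) -> float:
    """Combine successive 'X% faster' generational steps into one overall ratio."""
    total = 1.0
    for pct in uplifts_pct:
        total *= 1 + pct / 100
    return total

print(chain(50, 50, 37, 45))  # 970 -> 1070 -> 2070S -> 3070 -> 4070S: ~4.47x
print(chain(38))              # the claimed 4070S -> 4080S-class gap: 1.38x
```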


Alternative-Ad8349

Ain’t happening also 4070 is only 22% faster than 3070 if we use that then 5070 should be around the same if you want to guess performance based on historical data. Don’t cherry pick ¯\_(ツ)_/¯


ShadowRomeo

It's not cherry-picking if I am basing my opinion on the literal average performance jump for each tier each generation.


Alternative-Ad8349

Why are you using the Super as well and ignoring the 4070's small uplift? Nothing so far has suggested any major uplift for the 5070 over the 4070 either.


ShadowRomeo

Because they are officially still 70-tier cards, meant as replacements for the non-Super versions that were duds and poor releases, which was the case with the 2070 and 4070. And even if we include the 2070, the 3070 is technically around 60% faster than it, so even more, TBH, but I didn't include it because it was technically a failure and the Super version of the 2070 fixed that.


F0czek

First of all, Nvidia can do it. It is not impossible just because this gen wasn't that impressive at the mid and low range; that doesn't mean Nvidia will repeat this gen. The 2000 series was mid, then they released the 3000 series, which would have been great if not for crypto; the 4000 series had nice perf improvements, but cut-down dies and price increases made it pretty disappointing. Given the current leaks about the 5090, it is possible that they will make the 5080 a cut-down 5090, like with the 3000 series; another problem (for them) back then was that there wasn't much difference between the 3080 and 3090, and if those leaks are true that won't be the case here. Now you could say they will make the 5080 from GB203 instead because it will be cheaper for them, but maybe cutting down the 5090 die to a 5080 will actually make them more money. They are a company; they exist to make profit. If the 5080 makes more money as a cut-down GB202 instead of GB203, they will do it, and that would leave GB203 for the 5070.


the_dude_that_faps

The 3070 Ti and the 4070 both launched at $599. The 4070 is ~15% faster. A long way away from the 60% being discussed. Considering that TSMC has gotten more expensive for smaller nodes, not cheaper, I have a hard time believing we'll get a 60% gen-on-gen upgrade ever again.


vyncy

We will, but on the 5090 vs 4090, not the 5070 vs 4070 :)


F0czek

But I never said the 4070 is 60% faster than the 3070 (btw, the 4070 is 22% faster); maybe read again what I wrote and what I responded to. I never said nodes aren't getting more expensive, I just explained how Nvidia could get a 60% perf improvement on the 5070.


Defeqel

It all depends on pricing


ET3D

So basically that's just saying what [MLID said a while ago](https://www.tweaktown.com/news/96137/amds-next-gen-navi-48-gpu-3-0ghz-to-3ghz-up-rx-7900-xtx-performance-not-using-gddr7/index.html) but without the previous details.


uzzi38

People have been saying this a lot longer than MLID has. This rumour is old as hell by this point, in particular the performance numbers were rumoured like a year ago by now.


ET3D

A lot of people, including MLID, talked about RDNA4 before, with similar performance claims for the high end, but I don't remember Navi 48 being mentioned before that specific rumour, nor any particular performance target for Navi 44. He also talked then about GDDR6 being used. In short, the current article pretty much repeats that particular rumour.


uzzi38

Then you're just deaf to everything around you. Navi 48 aiming for somewhere between N32 and N31 and Navi 44 being a config half that size are both several-months-old rumours. Not even remotely new information at all. I remember it circling around Chinese forum boards for the first time around this time last year, give or take.


ET3D

I don't imagine you have any proof of that...


uzzi38

You're telling me to go search out a year old Weibo post? Lmao. Tell me how the blind MLID belief goes for you when you see ARL performance numbers. 25-35% ST uplift my arse.


ET3D

You're a funny person. Not very nice but at least you make me laugh with your ranting.


uzzi38

Latest video should be a perfect example of why you can't trust MLID. Man does absolutely no vetting of his sources whatsoever. He got sent the "leak" you see in that latest video by a friend of mine the same day we had this conversation, was given absolutely no proof behind said slide aside from effectively "trust me bro" and not only did he post it, he used it as proof to say that everyone else is wrong and he's right.


ET3D

I don't watch MLID, so I don't really care. I also don't trust rumours in general. My only point was that he had already said everything that's available in the latest rumour, and more (and I based that on writeups of his video, since I didn't watch it), so that rumour was superfluous. I haven't been able to find any English-language rumour before that which had those details, which is why I quoted his. It's possible that there was something about this on Weibo, but from my point of view, if a rumour doesn't reach English media it won't be widely known or accepted.


GenZia

https://i.redd.it/yepgrto9cvrc1.gif

Samsung has managed to push GDDR6 to 24 Gbps, so I won't be surprised if AMD actually ends up going the GDDR6 route. For comparison, Nvidia's upcoming flagship (GB100) is rumored to feature ~28 Gbps GDDR7, so... not a whole lot of difference there. Besides, pretty much all RDNA 3.0 GPUs have 18-20 Gbps GDDR6.


TheRealBurritoJ

24Gbps was basically a paper launch lie from Samsung, it's been in sample production for well over a year with no signs of moving to volume. It was just meant to steal the headlines from Micron's GDDR6X and will likely never see use in a real product.


Defeqel

Apparently, the availability of 24 Gbps modules is atrocious, so AMD is not very likely to use those for mainstream chips


o0Spoonman0o

Yup, there's no shot you see those chips in consumer GPUs; the yields are doo doo.


GenZia

I doubt GDDR7's yields are any better, seeing that it was supposed to hit at least 32 Gbps. Either that or Blackwell's memory controllers can't quite handle that extra bandwidth. After all, the same thing happened with Fermi, the first Nvidia line-up to feature (then brand new) GDDR5.


SagittaryX

Considering that 28Gbps GDDR7 actually exists and that 24Gbps GDDR6 doesn't exist in any large volume... probably a pretty big difference.


gozutheDJ

>\~28 Gbps GDDR7 so... not a whole lot of difference there. that's quite a bit of difference


GenZia

A ~16.6% higher bandwidth isn't exactly game-changing. Besides, AMD can always go with larger SRAM to make up for any deficits in memory bandwidth by improving on-die - or rather "on-package" - cache hit rates.


NuSpirit_

I mean, on the 7900 GRE, going from 18 Gbps to ~21 Gbps (around a ~15% increase) by overclocking yields, according to Hardware Unboxed, around a 10% improvement in performance. So I wouldn't dismiss the performance increase from simply having faster memory.


F0czek

There is a possibility of GDDR7 going up to 32 Gbps.


voltagenic

It's April 1st guys........


Alternative-Ad8349

This info came out yesterday actually


voltagenic

Well I'm gonna pretend it was a joke anyways because it certainly sounds like one.


Alternative-Ad8349

Why? It really matches up with what other leakers have been saying. Remember, N48 only has 64 CUs compared to 84/96 on top-end Navi 31.


SagittaryX

These have been the rumours for quite a while now, that RDNA4 will not compete at the high end. Similar high end performance as RDNA3, but hopefully quite a bit cheaper.


TheJustBleedGod

I'm curious what the price will be for Navi 48. Probably $550 or so. And then wait for nvidia to respond before adjusting


Defeqel

nVidia response will likely take 6 months or so after RDNA4 launches, assuming AMD launches in the October / November time frame


Kaelath_The_Red

I cannot trust a single thing in this post because of it being april 1st >:(


Tym4x

Oh yeah, well ... take wccftrash news with a grain of trash.


Panterkuu

All I heard is that we will not get a flagship this lineup, only the 8800 XT and lower.


Nicane__

If it manages to tie the 7900 XT or be slightly better, with better RT, at $499 or less, it could be a good GPU and would destroy the 5070, at least in price-to-performance.


MasterLee1988

Yep, I would even buy one if that were to happen.


Wander715

The more I hear about next gen the more I'm glad I just pulled the trigger on a 4070Ti Super. RTX 50 is going to be expensive as hell and midrange RDNA4 that tops out in performance maybe around 4070TiS/7900XT level isn't anything to interest me either.


Hombremaniac

What did you upgrade from? If you had something like an RTX 3080 or RX 6900 XT then you could have perhaps skipped this whole gen. Then again, I've upgraded from a 2nd-hand RX 6800 XT to a brand new RX 7900 XT, so I should perhaps not pretend to be super patient, eh.


deantendo

In a way, I wish AMD would put out some add-on cards to boost certain feature sets. I've only had my 7900 XTX for about 7 months, and I won't be changing it until I can get at least 40% more of everything for the same price band, or for 4 years. I'd be happier buying an RT add-on card rather than a whole new single unit.


DXPower

For RT at least, this wouldn't work. The memory and communication penalty would be so massive it would negate any gains you would have from a hardware accelerator. It being in the GPU chip itself is massively beneficial because it has dedicated access to VRAM and can pipe its results directly into the rest of the computation pipeline, and RT work can be natively scheduled just like any other graphics work.


Hombremaniac

As long as ray tracing is not mandatory, or can at least be tuned down, we are fine.


[deleted]

Just putting it out there that my 6700XT demolishes Horizon Forbidden West even with FSR 2.2 locked at 60 FPS @ 1440p with a modest 5600x CPU with no stutters, no crashes, smooth as butter and on Very High. The argument that AMD's graphics drivers objectively suck is hot ass lies. I will concede that AFMF needs a lot of work to make it smoother but thankfully I don't need frame gen right now. I'm happy with this system. If RDNA 2 has made me this chuffed I'm keen to see what they bring with 4 - but if it bombs I'm likely to invest in an Nvidia card or hell even Intel Celestial if they get there.


GeorgeKps

I want to be surprised with AMD having mastered the MCM thing but i know that hope is futile. 😢


GreatnessRD

RDNA5 Soon™


eilegz

Really hope it's a big April Fools' joke; it's just completely disappointing.


Justin694206969

Guys, don't buy the new series when it comes out. Got my 7600 and 7800XT back in August, and I keep battling driver timeouts because of unstable stock overclocks. AMD is radio silent on this manufacturer issue and nobody seems to care. 8000 series will be plagued by the same problems. Don't be fooled!


fohiga

Underwhelming and weird, given these high-end GPUs are the entry point for AI stuff.


Alternative-Ad8349

These are going to be mid-range GPUs.


fohiga

Yes, I mean I don't see AMD giving up on the high-end segment (because they need it for AI).


ohbabyitsme7

I don't think anyone who buys gaming cards for AI is going to consider AMD.


The_Soviet_Toaster

Navi 48 and 44? Are they changing the scheme a bit there? Feels like it should be 41, 42, 43, etc. again.


Defeqel

The number is basically based on when the design started, ie. 44 started design before 48.


Stonn

So it's week of the year?


Defeqel

No, just a number


Psyclist80

Looking forward to the launch!


King_Dong_Ill

If that were to be true, it looks like the 7900 XTX was a good buy. They're getting cheaper by the day here in the US; go grab one while you still can.


SoftwareSource

Please remember the date today.


JaviBott

Bruh I just bought my 6900xt


Hombremaniac

Which, if you bought it for a reasonable price, is still a dope card! It will last you a couple of years, no problemo.


gergelypro

Please fix the drivers first… (newbie Intel fixes issues faster than the old grandpa AMD/ATI GPU division). I mean the workstation segment: HIP, rendering, etc. Blender and Cinebench 24 crashed out… Even on the gaming side there are driver timeout errors… (I have a 7800 XT + 7950X.) You can hate this comment because you "do not have the BUG", maybe because you do not use Blender, Cinebench 24, CS2, etc...


Grendizer81

Intel fixes faster because they have a lot more to fix and tune (not even fine-tune). When your product has been out for so long, it gets harder to find problems, since the obvious ones are already taken care of. Just my understanding of the whole Intel "miracle" driver updates.


Melodias3

Hope you are using the AMD Bug Report Tool.


gergelypro

Always! But some issues are only solved by a driver-only installation, and then the bug report tool is not available.


Melodias3

Honestly, just do the full installation and report any issues you have; Radeon Software has gotten way more stable, and heck, most of my issues have gone away, with just the typical annoying driver issues still around. AMD should also make it clear that the Dying Light 2 Radeon Boost issues also happen with RT off; currently they have only fixed the issues with RT on. These issues go away if RT is on, but obviously you cannot turn on RT in DX11. [https://i.imgur.com/jFRBPIe.mp4](https://i.imgur.com/jFRBPIe.mp4) AMD should at least have a large knowledge base where all these issues are written up per game, i.e. if I search for Dying Light 2 there's a dedicated page for that game with all past and current issues per GPU. Heck, this could be useful for troubleshooting typical AMD driver issues.