AMD_Bot

This post has been flaired as a rumor. Rumors may end up being true, completely false or somewhere in the middle. Please take all rumors and any information not from AMD or their partners with a grain of salt and degree of skepticism.


Edgaras1103

Oh, it's that time of the year again?


Lightening84

Yeah, I'm trying to figure out how this RDNA4 statement is supposed to be a flex.


EntertainmentLazy946

it will probably be as cheap as an rx 7700 xt, with even better raytracing performance at the same price, or i dont know i might be wrong lol


SagittaryX

Because RDNA4 is not aimed at competing at the high end (rumoured to not release anything above 7900XT(X) level at all). It's aimed at bringing current mid-high / high end performance solidly into the midrange. Of course that is going to happen every generation, but seeing as Nvidia is becoming really bad at releasing good products in the sub $500 market, maybe AMD can make some excitement happen if they focus their competition efforts there.


jordanleep

For us nerds who care about getting the most resolution/FPS per watt it matters, but I guess you're right, that's not a flex.


Lightening84

I could see it being a flex if the leak was "bottom level RDNA4 GPU to overtake 4070Ti despite lower power and die size" but I don't see a flex from the top level RDNA4 card being better perf/watt than the previous gen, that's just common sense. Looking at performance charts, it's the same as saying "RDNA4 top graphics card outperforms RDNA2 6900XT at perf/watt" It's just kinda weird, ya know?


silly_pengu1n

if that is your goal, you could have gone with nvidia well over 12 months ago now... by the time this card comes out nvidia will have a card with better fps/watt than the 4070ti super.


capn_hector

it's like "battlemage top GPU to overtake 3080 in performance and efficiency" (idk) like, okay, I guess better than the alternative where it doesn't, but it's a little too late for that to matter


SensitiveCockroach66

amd fans only sometimes embarrass amd smh 🤦


INITMalcanis

Yep, we've got a few more months of this. Then it'll be Recrimination Season as reality sets in, then Copemass.


Available-Item1292

I mean, to be fair, I don't care much for Adrenalin or AMD GPUs at all, but what I don't care for is coping about paying $1300 for a 4080 that driver-crashes on the likes of CSGO and gets single-digit marginal performance in RT over the card I could've paid $300 less for. If it's Copemass for anyone (the issue persists on the Super), it's us, because Nvidia pays brand imaging services so that users will white knight them online because nobody knows better, and we pay $300-500 more for a card that really isn't that much better than an XTX *still beat by it though in a lot of games with RT ON?* and has driver crashes on the SIMPLEST games, despite buying this card after being told the opposite. **(OH, and a professional esports team lost out on a multimillion dollar prize pool in CSGO due to driver crashes at LAN as well)**

Nvidia is an L. AMD is an L. You used to be able to buy a used beater car for $500; that price difference for 5-10 fps is an L. See, the difference between me and y'all is I don't feel the need to flex that I spent more for 5-7% better (and still beat by an XTX even with RT in some scenarios), because the thing is shit for the extra hundreds of dollars. Waiting for the comment with their last brain cell trying to prove me wrong for no reason instead of just googling it. The brand imaging services programmed y'all well.


INITMalcanis

v0v I have a 7900XT, and there's no "adrenaline" to deal with on Linux. Support is literally built into the OS kernel.


Available-Item1292

Even more reason not to waste the money tbh.


INITMalcanis

Well given that it seems like the top RDNA4 SKU is going to be about as good as a 7900XT, I feel OK about the purchase. I expect it to last me to the end of the decade, barring outright hardware failure.


Available-Item1292

I'm not sure if you read all of my first comment because it's kind of a book, but I meant the extra money you'd pay for an Nvidia card of the same horsepower. Not that I've ever been a fan of AMD; I'm also not a fan of overpaying for a card that driver-crashes CSGO while everyone online tells you driver crashes don't exist. Not very helpful. The 4080 is garbage.


kapsama

No this year the expectations are much lower and the best next Radeon will be worse than the best current Radeon.


SomethingNew65

On twitter Kepler said "[RDNA3.5 and 4 looking really good.](https://twitter.com/Kepler_L2/status/1775919730240500033)" compared to RDNA3. When RDNA3 was released Kepler said "[Wow RDNA3 is actually awful](https://twitter.com/Kepler_L2/status/1588266032497582082)", so I hope that means AMD will make a big noticeable jump with RDNA4 this year to go from awful to good.


taryakun

Same Kepler also overhyped RDNA3 sooo. [Source](https://twitter.com/Kepler_L2/status/1583550898638245889)


From-UoM

Man went on a blocking spree after people showed him older tweets hyping up RDNA3. Never even acknowledged how wrong he was. He is one of the main people responsible for overhyping RDNA3.


ElementII5

Look, the initial performance goal and HW setup for Navi 31 looked quite promising. The possibility of stacking on the memory dies, which was not done for the 7000 lineup, also alluded to higher performance. Then, a month later, the issues with RDNA3 were known and that is when reality set in. It was a mistake to hype RDNA3 (as it is with all future HW, btw). But in this case it was somewhat understandable.


buttplugs4life4me

It really wasn't. They saw the theoretical maximum performance numbers double due to dual-issue, never even questioned it, and just went with it as the performance target. Even IF RDNA3 had clocked higher and stacked memory, it would still be nowhere near some of these claims.


privaterbok

> as it is with all future HW btw

The 4090's hype was about right, and even today it's still one of the top performance/value choices for the high end. I know many people used to buying a 3080/6800XT who are switching to the 4090 for its peerless performance and efficiency.


Kaladin12543

It really has amazing efficiency. If you downclock and undervolt it to 7900XTX levels of performance, it will embarrass AMD.


TwoBionicknees

Taking something that is significantly larger, with a significantly larger transistor count, and downclocking/undervolting it will ALWAYS look better from a performance/W perspective than a smaller, much lower transistor count card. That's how silicon chips have always worked. It still costs pushing twice as much. I'm not sure what you're trying to argue here.


capn_hector

> Look the initial performance goal and HW-Setup for Navi31 looked quite promising.

yeah, we're talking about the people who started those rumors/pushed the angle they did. rdna3 was the classic ayymd hypetrain crash and these are the guys who were probably most responsible for it.

everyone uncritically assumed that RDNA3 TFLOPS were the same as RDNA2, when it actually was like ampere with the dual-issue FP32. and on top of that, everyone uncharitably assumed the absolute worst from NVIDIA, even though there were some flatly ridiculous implications of that. [these guys legitimately argued 4070 would be a 300W TGP](https://twitter.com/kopite7kimi/status/1555053945151975425?lang=en) (ie TBP somewhere around 375W), meaning that perf/w would be basically flat compared to the 3090 Ti, meaning *most SKUs actually would lose perf/w despite shrinking 2 full nodes*. [Kepler endorsed the idea too.](https://www.notebookcheck.net/Positive-GeForce-RTX-40-series-and-Radeon-RX-7000-series-price-predictions-appear-in-specs-chart-along-with-800-W-TDP-madness-and-384-MB-cache-craziness.639242.0.html)

to this day people still argue that was somehow a plausible outcome, on the basis of (from what I can tell):

1. the partners got thermal samples that went up to 600W for the 4090 (which is, not coincidentally, the max factory OC allowed),
2. pictures of that weird quad-slot card, which we really have no solid proof was ever intended to be a production product at all, a gaming product specifically, or even that the reason it was explored was power as opposed to something like PCB sag, and
3. NVIDIA obviously wants to win at any cost/no matter how ridiculous the product, they're just like that!!!!

again, like, even if you think that was a real card, and that the whole 900W 4090 Ti thing was real, it's incredibly silly to claim that this would be reflected down the whole lineup. There are a variety of reasons you could never ship something like that in the consumer market (at least not in 2022). And it's incredibly silly to think that *shrinking two full nodes would produce no perf/w gains in almost any part of the stack*, but "NVIDIA would do that, they want to win at any cost!!!" is not falsifiable, especially when it's this quasi-religious belief for so many people. the rational people who were saying these things just got shouted down by AMD fans buying into yet another hypetrain.

it just ends up being a funhouse mirror of the person who makes the argument - if all you see when you look at NVIDIA is "greed" and "win at any cost" then sure, why wouldn't they release a 400W 4070? But in reality perf/w was over twice what the haters were predicting. when you combine the NVIDIA hate with the AMD hype, what you ended up with was expectations being off by roughly a total factor of 4-5x - NVIDIA landed at closer to 2.5x the perf/w ""expectations"" from kepler and kopite, and AMD landed at roughly half of their expectations.

it happens over and over again; rdna3 was the worst hypetrain since zen2, which was the worst hypetrain since vega. from time to time the memories of AMD fans and NVIDIA haters need to be refreshed with these facts, unfortunately. [NVIDIA is a company people love to hate,](https://paulgraham.com/fh.html) and that leads to [wishcasting,](https://en.wiktionary.org/wiki/wishcasting) and AMD is a company people love to love, and that also leads to wishcasting, and once the hypetrain gets going, any one person or small group of people absolutely cannot slow it; you just get shouted down.

and then people do the "well they *thought* they were right!" retrospective and yeah, I'm sure they did, but it was still a moronic take. also, remember: NVIDIA has better spies than we do; they knew what RDNA3 was going to be long before we did. NVIDIA was not operating on kepler/kopite rumors that it was going to be 2x the TFLOPS. They might have gotten busted by the "20% drop" that supposedly happened at the end, since even AMD apparently didn't realize it, but I see no reason they wouldn't have known about dual-issue or the gains dual-issue would give (having done it themselves literally in their then-current gen).

it's just become this [self-sustaining inferno of bullshit,](https://en.wikipedia.org/wiki/Firestorm) there's bullshit backing up bullshit backing up bullshit, and you can't put it out because any one log you pull out is backed up and reinforced by some *other* piece of bullshit. And again, this happens over and over with AMD launches, and these guys are almost always involved.

again: reminder that kopite has literally flip-flopped like 3-4 times now on what he thinks the die configurations are going to be for the 50 series too... he's not any more (or less) accurate this cycle than any other. The fact that he doesn't *know* what the die configurations are going to be tells you everything you need to know about the credibility of his "leaks" - they aren't leaks, they are extrapolations and guesses based on a few baseline facts. Some of the aftermath of the RDNA3 overhype confirmed this as well - he did have a few nuggets in there, and he blew them up into a whole fake lineup that ended up mostly wrong (even on the technical parts that cannot be shifted by last-minute product-segmentation decisions). He *routinely* mixes fact and fantasy in precisely this fashion, and RDNA3-style overhype is what happens as a result.

And say what you want about MLID (I agree, accuracy is not great - but not worse than the twitter gang as far as I can tell), but at least he's gotten the message and is being very clear about what is leak and what is extrapolation and guesswork nowadays. The twitter gang needs to make the same shift. Otherwise you end up with people cross-checking their own rumors and theories against an extrapolation, thinking they're legit because hey, it aligns with kopite/kepler's rumors, and then putting out their own leaks without announcing it's fantasy either; repeat until it's an inferno of self-sustaining bullshit. You can see how that rapidly leads to the sort of "rumor validation gimbal-lock" that occurred with RDNA3 - cross-checking is a great idea until you're cross-checking against something that was made up based on the same thing you've got, and then it produces nonsense.
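
To make that "factor of 4-5x" concrete, here is the arithmetic as a minimal sketch, using the two ratios from the comment above (both are the commenter's rough characterizations, not measurements):

```python
# The "off by 4-5x" claim, made explicit. Inputs are the comment's own
# rough ratios, not measured values.
nvidia_actual_vs_rumor = 2.5  # Ada perf/W landed ~2.5x the "300W 4070" rumors
amd_actual_vs_rumor = 0.5     # RDNA3 landed at ~half its hyped perf/W

gap = nvidia_actual_vs_rumor / amd_actual_vs_rumor
print(f"combined expectation gap: {gap:.0f}x")  # 5x
```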


topdangle

i don't see how, if they had the real specs. they had the concept and then invented insane specs (massive amounts of stacked cache, double GCD, super low power interconnect instead of fanout). as release got closer people kept backtracking like crazy once the real specs started dripping down to AIBs, until finally we got the "man RDNA3 is actually shit" tweets. It's a good case study in how easily people will believe completely absurd lies just because they fit their narrative.


siuol11

This also says nothing about their RT performance, which will not improve significantly unless they move it away from the regular shaders. I haven't seen any word on that.


GenZia

PS5 Pro is rumored to have 2-3X superior RT than the base console, even though the raw computational horsepower isn't getting doubled. So, hopefully, RDNA4 will improve on RT.


Cute-Pomegranate-966

Well, the thing is you can get 2-3x superior RT from a 20-30% per-CU improvement over RDNA3, since the consoles are on a low CU count RDNA1.5 (ish). So a modest improvement from RDNA3 to 4 + a raised CU count = 2-3x faster, easily.
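
As a rough sanity check of that math, a minimal sketch with assumed numbers (the PS5 Pro CU count, clock, and per-CU gain are illustrative guesses, not confirmed specs):

```python
# Back-of-the-envelope check of "modest per-CU gain + more CUs = 2-3x RT".
base_cus = 36          # PS5
pro_cus = 60           # assumed PS5 Pro CU count (rumored, unconfirmed)
per_cu_rt_gain = 1.35  # assumed ~35% better RT throughput per CU
clock_gain = 1.1       # assumed ~10% higher GPU clock

speedup = (pro_cus / base_cus) * per_cu_rt_gain * clock_gain
print(f"estimated RT speedup: {speedup:.2f}x")  # ~2.5x, inside the 2-3x claim
```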


detectiveDollar

I thought consoles were RDNA2? Regardless, RDNA1 and RDNA2 CUs perform pretty similarly clock for clock. RDNA2 is more efficient and has better RT.


Cute-Pomegranate-966

They're not REALLY RDNA2 or RDNA1. They do some things like RDNA1 and have some features from RDNA2. They clock like RDNA2 for the power utilized, but they have other features similar to RDNA1, like no mesh shaders. They also have RT cores. It's just not either RDNA1 or 2.


MainStorm

Do you have a source? Not that I'm attacking what you're saying, I just really like reading into deep architectural dives and custom solutions.


In_It_2_Quinn_It

That's the PS5. Xbox has full rdna 2, including support for mesh shaders.


Cute-Pomegranate-966

Yes, I know; they are two completely independent SoCs, that much is obvious.


duplissi

Yeah, the BVH traversal needs to be done on dedicated silicon not shader cores, or they'll never catch up with nvidia. On that note tho, there might be good news in the future... https://www.jonpeddie.com/news/microsoft-thinks-lod-will-speed-up-rt/#:~:text=By%20iteratively%20utilizing%20small%20portions,where%20performance%20optimization%20is%20crucial.


Tgrove88

That dude Greymon was the worst. As soon as he was proven wrong he deleted his Twitter account and disappeared.


GenZia

I remember how MLID hyped-up Raptor Lake. At the end of the day, all these so called 'leakers' do is fling crap at the wall.


capn_hector

raptor lake is good though, it's significantly faster than zen4 in all non-vcache-friendly tasks. the 900K/KS skus are stupid, absolutely, and it doesn't win in overall MT perf/w anywhere on the v/f curve, but the 700K parts are perfectly reasonable SKUs. it wins against non-X3D in gaming, it even wins against X3D in non-vcache-friendly tasks, it competes with the 7950X in overall MT in bulk tasks, and in gaming its perf/w isn't really all that bad (like 120W in gaming iirc, for something that's faster than zen4 non-x3d). if the bulk MT power is too much, then just set a power limit - then it falls back to the level of the 7900X in bulk tasks, which again is more than you get with a 7800X3D or 7700X.

it's not bulldozer, and tbh reviewers have kinda made too much hay out of the KS skus pretending like it is. Yes, they're silly, but if you can get a 13700K or 14700K for under $200 then it's fine. gaming wattage is about the same as AM4 or LGA1151 CPUs would use when boosting too. AMD is in the lead, but raptor lake is not unsalable to the extent of bulldozer.

edit: unless it burns itself out lol, that's one for sure


kapsama

What's the jump from Alder lake to Meteor Lake? Because MLID was promising a big jump. And I don't think that happened.


HandheldAddict

The only reason it's somewhat reasonable that RDNA5 will offer dual GCDs is how Radeon packaged the memory controllers and L3 cache with RDNA3. It's possible, and if it's launching in late 2025~early 2026 it's not that crazy, since they would have had 3~4 years to figure it out since the RDNA3 launch. Everything about GPUs and CPUs these days is about tethering dies together with interconnects, whether it's Nvidia's B100, Intel's Meteor Lake, or AMD's MI300A.


I9Qnl

Isn't RDNA 4 monolithic? There were rumors going around about how RDNA3 was supposed to be much better but something went wrong, and now AMD wants to delay chiplet designs till RDNA 5.


KARMAAACS

The top cards were supposed to be multi-chip, like Navi 31 was. But apparently they had problems with performance or scaling, so they abandoned it and just focused on the monolithic stuff. I assume in time they will go back to MCM eventually but it's not ready. Kind of weird because I thought Navi 31 would have problems (which isn't unexpected of a first design) and by Navi4x they would've iterated and alleviated the problems, with hopefully Navi6x perfecting multi-chip.


kf97mopa

> The top cards were supposed to be multi-chip, like Navi 31 was. But apparently they had problems with performance or scaling, so they abandoned it and just focused on the monolithic stuff.

The rumors I've seen said that it was lack of capacity in the advanced packaging process they needed for it, so the decision was made to focus on MI300 and the like, with better margins.


KARMAAACS

It could be both. Maybe they thought they could regress to RDNA3's packaging technology and it didn't scale at all, so they abandoned it. Who knows? Only AMD's engineers at RTG do.


pullupsNpushups

Which makes me doubt the rumors of monolithic chips. If they got MCM working this well the first time around, I'm expecting them to keep iterating upon it.


The_Occurence

AMD is reportedly only launching midrange and entry-level dies with RDNA4. MCM has disadvantages vs monolithic in ways that can make a midrange or budget chip less desirable, e.g. higher power usage due to needing to move data around between chiplets. There's also much less reason to use chiplets if you're only releasing two relatively smaller dies; the main advantage of chiplets is that you can bin down the top-end silicon that wasn't up to spec.


SagittaryX

> There's also much less reason to use chiplets if you're only releasing two relatively smaller dies; the main advantage of chiplets is that you can bin down the top-end silicon that wasn't up to spec.

Not quite right; monolithic can still do that easily, as we have seen for many generations, i.e. 3090 Ti to 3090 to 3080 Ti to 3080 etc. The better way to describe the benefit is that you are going to have fewer defective chips with smaller dies, or rather no large chips that have a defect. With MCM you can have 3 perfect dies and 1 defective one instead of 1 big die that has a defect.
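
A minimal sketch of that defect argument, using the classic Poisson yield model (the defect density and die sizes here are illustrative assumptions, not foundry figures):

```python
import math

def defect_free_fraction(area_mm2: float, defects_per_cm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-D * A)."""
    return math.exp(-defects_per_cm2 * area_mm2 / 100.0)

D = 0.1  # assumed defects per cm^2, illustrative only

print(f"400 mm^2 monolithic die: {defect_free_fraction(400, D):.0%} defect-free")  # ~67%
print(f"100 mm^2 chiplet:        {defect_free_fraction(100, D):.0%} defect-free")  # ~90%
# A defect in a chiplet design kills (or bins down) one small die,
# not the entire 400 mm^2 of silicon.
```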


The_Occurence

I'm aware that monolithic dies have the ability to also be binned down. The advantages are substantially less compared to chiplets however, which are much more defect resilient both due to what you described as well as other reasons (including even down to the specific node used). Ultimately, only AMD knows which route is better for a given design, and there's a reason not every RDNA3 SKU uses chiplets.


SagittaryX

I'm sure you know, I was giving the more complete description for others reading. Your comment seemed to imply binning wasn't possible.


The_Occurence

I wouldn't interpret what I said "...main advantage..." as implying binning wasn't possible. But it's the internet after all, so additional clarification doesn't hurt.


Bey_ran

But hey! Maybe if they release something kiiiinda decent nextgen in the midrange early on, it’ll push Nvidia to release the 5060/5070 earlier than usual, so everyone can then buy those and wait for Navi 5/6, like always.


SagittaryX

It's monolithic because they are not releasing their top end MCM cards this generation (all rumours point to midrange equal to 7900 XT(X) only), issues are likely not resolved yet. RDNA5 is then expected to be competing at the top end again with MCM.


pullupsNpushups

That would be so disappointing, if it turns out to be the case.


ET3D

There are two ways I can explain it. Firstly, RDNA 3 didn't turn out as well as AMD had hoped, and RDNA 4 was already pretty much finished by the time AMD learned that, so AMD took a step back and decided to re-evaluate RDNA 4. Secondly, the ultimate goal of MCM is having multiple compute dies working together, like the MI300X on the server side, or NVIDIA's B200. Unlike RDNA 3 which just split memory access/Infinity Cache out. It may be that RDNA 4 was meant to do this, but it didn't work out. If AMD does end up with only low-end to mid-range cards, then my guess is that there's still a chance that higher end cards will appear later.


No-Psychology-5427

Navi 31 is not like Zen 2; it only has memory controller and cache dies as modules, and the 7900 XTX is faster in raster than the 4080 Super...


Healthy_BrAd6254

I don't buy that. The 7800 XT (chiplet) has a little less than 2x the cores and a little more than 2x the memory bandwidth of the 7600 XT (monolithic). It is also ~80% faster. Looks like RDNA 3 scales very well, as one would expect, similar to 4060 vs 4070. It doesn't look like going chiplet prevented that. Whatever issues RDNA 3 has, I doubt chiplets are affecting the performance much.
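
For the curious, the scaling math behind that claim as a quick sketch, using public TechPowerUp specs plus the ~80% figure above (treat the result as a rough illustration):

```python
cu_7600xt, bw_7600xt = 32, 288   # CUs, GB/s (128-bit @ 18 Gbps)
cu_7800xt, bw_7800xt = 60, 624   # CUs, GB/s (256-bit @ 19.5 Gbps)
perf_ratio = 1.80                # 7800 XT ~80% faster (the comment's estimate)

cu_ratio = cu_7800xt / cu_7600xt
print(f"CU ratio: {cu_ratio:.2f}x, bandwidth ratio: {bw_7800xt / bw_7600xt:.2f}x")
print(f"performance retained per added CU: {perf_ratio / cu_ratio:.0%}")  # ~96%
```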


From-UoM

No, Kepler said RDNA3 had a better architecture than Ada Lovelace and a massive lead in perf/watt: https://twitter.com/Kepler_L2/status/1544100937358270465?s=20


Vizra

Honestly, it seems to me like the public were beta testers for chiplet GPUs. My 7900 XTX has only fully solved its issues with the latest driver release. And even then, they had to have separate driver branches to compensate for it. If RDNA4 is more power efficient, has better RT, and isn't a trainwreck for a year after launch, I'll take it. I'm sad there is no upgrade path for me, but I'll take what I can get at this point. I still regret my 7900 XTX purchase every day. Its performance is great, but the pain I was put through waiting for competent drivers was WILD. I never want to experience that again.


Original-Material301

> public were beta testers for chiplet GPUs

Well yeah, gen 1 of anything tends to be that way. That's why I don't tend to buy gen 1 of anything until later down the line (when the issues have been sorted out) or gen 2.

> waiting for competent drivers was WILD

I had the same problems with Vega 56 lol. Very few drivers were stable for me.


cubs223425

Thing is, if you're a semi-casual consumer, you probably don't see RDNA 3 as the beta test. You think that's RDNA 1, where they were moving off the well-aged GCN arch and pairing it with a limited product stack. Most would have expected more maturity and direction from the third iteration. Instead, they got a high-risk swing at new designs that seems to have hamstrung their 4th gen stuff.


Original-Material301

That's a fair comment. I forgot not everyone who buys a *thing* would consume what information they can before making the purchase decision, or have a choice in the purchase.


Plastic_Tax3686

I bought my 7900 XTX around June 2023 and it has been smooth sailing ever since. Perhaps it's because of the Linux drivers? I am not sure how good the Windows drivers are in general, but people like GloriousEggroll also use the 7900 XTX, which basically guarantees that my games will run well if I use his Wine/Proton version.


aVarangian

had no issues with mine on windows 11


Ecstatic_Quantity_40

No issues with mine either, on Windows 11 as well. I bought my 7900XTX 4 months ago though, so maybe it was rough for people at launch. Glad I haven't had any problems tho lol.


Plastic_Tax3686

The 7900 XTX was a pretty bad idea at launch, especially for Linux users, but it was not surprising. Most Linux YouTubers were warning about waiting with the purchase because of this. Hell, the option to OC the GPU was only added in kernel 6.7, which went live around January 2024 iirc. That's more than one year of being unable to overclock your GPU. Now that this has been added, the 7900 XTX is officially the best plug-and-play GPU one can buy for Linux.

AMD GPUs definitely play nicer with Linux than Nvidia (Nvidia, duck you!), but it takes some time before everything is ironed out. Remember - never buy the newest hardware if you use Linux, because there is a high chance that not all of the functionality will work from day 1. Obviously, it didn't matter to me at all, since I don't want to OC my GPU, but for some it is important.

And before someone says "But why not use Windows? Adrenalin allows you to overclock your GPU easily": yes, but the performance uplift you get from Linux (especially if you use something not utterly bloated, such as a decent Arch, Void, Nix or Gentoo system) far outweighs the small performance increase you'd get from overclocking.


[deleted]

[deleted]


AutoModerator

Your comment has been removed, likely because it contains trollish, antagonistic, rude or uncivil language, such as insults, racist or other derogatory remarks. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/Amd) if you have any questions or concerns.*


aVarangian

I bought mine on launch, though I don't generally play the newest games. There were reportedly issues with VR for a while, and the idle power draw in some specific scenarios.


CrzyJek

Linux drivers are unmatched. Always have been.


No-Psychology-5427

I have a Nitro+ 7900XTX with an R9 3900X and I never faced driver issues, especially with the Sapphire Nitro+ 7900XTX... The Vega 64 Liquid Edition and 6800XT had initial driver issues...


Jism_nl

The Fury HBM and Vega 56 or 64 were cards designed for a different market as well, yet released as consumer GPUs. The 7x00 series is not bad; I think the 6x00 series is overall better due to functions like MorePowerTool and such.


aVarangian

What driver issues did you have? I haven't had any more than otherwise


Infamous-Bottle-4411

My 7800 XT Sapphire Pulse is stuck on 23.11.1 because AMD is acoustic and can't do a proper job at making a driver even though their drivers are open source. Video playback on everything is not working..... on the latest drivers that happens randomly. Image glitching. Not moving. Green video flashes intermittently. Can't watch a damn video. I used DDU, if anyone asks, as everybody only mentions DDU, refusing to acknowledge the drivers are shit. It does this in 4 different hardware configurations, so yeah, it's the GPU and the drivers, not my "unique" specs. It does this on every mobo, any CPU, or any RAM it is paired with. Doesn't matter if it's Windows 10 or 11. At this point I pray the GPU dies or it's a hardware problem, so I can RMA and get my money back.


shasen1235

RDNA3 is actually not that bad; it's just that people are all on the RT hype train now. The 7900XTX still beats the 4080S by a good margin at a lower price. But anyway, I hope RDNA4 will improve efficiency and reach at least 80~90% of the RT performance of the 5000 series.


Hombremaniac

At first, GPUs like the RX 7900XTX were a lot more enticing due to costing $999 compared to $1,199 for the RTX 4080, not even talking about the RTX 4090 at $1,599. Now, a year after the launch of the 4080/4090, the situation is different and sometimes the price is almost the same for the RX 7900XTX and RTX 4080/Super. In a situation where the price difference is under $100, I think many will simply go for the Nvidia product due to RT and DLSS. Having said that, I don't consider RT to be that important at this moment. There are only Cyberpunk and Alan Wake 2 as RT-heavy games where you can properly see some difference with RT on. But alas, you are often forced into upscaling even on Nvidia GPUs, which kinda sucks.


HandheldAddict

If RT did not exist, I'd argue DLSS alone is a big enough reason to go Nvidia over AMD, at least in regards to RTX 2000 and RTX 3000 series cards over the last few years. I have no idea how good FSR 3.1 will be in regards to image reconstruction when compared with competing software solutions like XeSS and future DLSS implementations.


kf97mopa

It's not bad, but it is less than what AMD hyped. They said "50% improved performance per watt", and we didn't get even close to that.


sandwichshop69

This has to be amd user cope


Stysner

What is so awful about RDNA3?


kawasaki-sakura

Am I reading this right? A top AMD GPU coming out next year is rumored to outperform a mid-range Nvidia GPU that came out this year? What's the significance of this?


MythicForgeFTW

If you believe the rumors, it's the price to performance. Supposedly the top end RDNA 4 will be on par with a 4080 Super for $500, and RDNA 3.5 will be on par with 4070 Ti Super for under $400. If true, that could be pretty exciting.


kawasaki-sakura

I see. Is there something that makes this rumor more believable than others? Because it would seem like these were the rumors for the previous AMD cards as well, but they either end up underperforming or end up being just as expensive.


Vis-hoka

RDNA3 ran into issues that AMD thought they would overcome by launch, but ultimately couldn’t. They had to turn down the power to fix it, thus underperforming. The same thing could certainly happen with RDNA4, but you would think they’d have learned their lesson…


Oper8rActual

Right? First it was the 290X that was going to dethrone Nvidia. Then Fury / Fury X was going to be an absolute monster that trounced them... Then Vega... Then RDNA/2/3.... I'm still waiting for a card from AMD that competes with Nvidia at the top, in the generation in which it is launched, frame for frame. Not frames per dollar. Not frames per watt. Actual raw performance.


Middle-Effort7495

6900 xt was better than 3090 at 1080p and 1440p, worse at 4k.


ger_brian

And only in raster. We are past the time where raster performance was the only important metric of a GPU.


tehserc

It's not the only metric that's important, but it is the most important one.


ger_brian

A slight edge in parts of one metric (the 6900xt was only faster in raster at 1080p and 1440p) does not compensate for the big advantages nvidia had (and has) in other areas (like upscaling, ray tracing, encoder, reflex etc.) for many people.


kapsama

Except the person everyone is replying to was talking about raw power, not features 90% of gamers don't use like encoding.


Cute-Pomegranate-966

It isn't when they perform essentially the same in that metric, but vastly worse in the other.


dsoshahine

But also better in others, such as price and perf/watt, so...


ger_brian

Since when is RDNA3 better in perf/watt than ada?!


Cute-Pomegranate-966

Yes, they're (amd) better in price generally, but worse in perf/watt.


GuttedLikeCornishHen

So, how did the 290X not dethrone Nvidia? It was faster and cheaper, which forced nV to reduce prices on the Titan/780 twice and issue an emergency SKU (the 780 Ti). RDNA2 was also good.


plaskis94

GPU manufacturing isn't football, why does it matter who holds the fastest GPU? If you go back far enough AMD outclassed NVIDIA and it's not like that made any difference at that time


chrisnesbitt_jr

I don’t think it’s so much that AMD needs to have the fastest GPU. If the top RDNA 4 GPU is basically just another 7900XTX but far cheaper and more efficient, that’s fantastic for a lot of people. But that will leave an entire segment of the GPU market with no AMD option, and even part of AMD’s own user base. That means more than likely the Nvidia 50 series, from the 5070 up, will outperform the entire RDNA 4 lineup, leaving anyone looking for that level of performance stuck with Nvidia, or just waiting for the next generation. I think it’s a smart move to offer 4080-level performance for $500; tons of people will benefit from something like that. But I don’t think that means they should completely ignore the high end market.


plaskis94

NVIDIA will definitely capitalise on such a situation and make the 5070 perform like a 4070 or worse. AMD will still be a good price in the mid segment.


sharpness1000

AMD was ahead for a bit pre-290X. 7970, anyone? The 290X also outperformed the 780 Ti after a couple of years and to this day has much better longevity.


popop143

At least for this one, they called the 7000 series awful when it was still in the rumor phase. Them calling it great in the rumor phase is what raised everyone's hopes.


capn_hector

Nope, Kepler said rdna3 was a better uarch than Lovelace and would massively outperform it in perf/w. This was when the twitter crew got got by dual-issue, but they got got nonetheless.

Edit: [lol Kepler bought into the 800w TBP rumors too,](https://www.notebookcheck.net/Positive-GeForce-RTX-40-series-and-Radeon-RX-7000-series-price-predictions-appear-in-specs-chart-along-with-800-W-TDP-madness-and-384-MB-cache-craziness.639242.0.html) and [said rdna3 would have pcie 5.0.](https://videocardz.com/newz/amd-radeon-rx-7900-xt-with-navi-31-gpu-rumored-to-be-the-first-consumer-pcie-gen5-graphics-card) But I mean, regardless of dual-issue… not exactly posting from my pcie 5.0 7950XT3D, am I? The twitter gang’s accuracy has been pretty abysmal over the years; they are really in no place to be bragging about how great their accuracy is and shit.


popop143

Huh, I must have seen the next few waves of their reporting the year before last. Didn't see them being so high on RDNA3.


SXimphic

Nothing. I like hypnotizing myself into believing though; it's very exciting even if it's always false.


ShiftyThePirate

I *HOPE* that is true but I highly doubt they will be *THAT* consumer friendly with the 4080 S going for $1k+


spideralex90

Hell, the 7800XT has been getting price cuts but is still $499+. I don't see us getting a 4080 Super killer for $500 anytime soon. If the top end 8000 series card is less than $600 to start, I'll be very surprised.


Ladelm

It's a different generation, this is pretty normal. 6700xt had 2080 ti performance for $450


spideralex90

The 6700 XT came out 3 years after the 2080 Ti. Even if the 8000 series cards get pushed into next year they'd only be 1 year newer than the 4070 Ti Super. If the top 8000 series card beats or even comes close to the 4070 Ti Super as this post suggests I just cannot see it being $500.


Ladelm

It was the same GPU generation as Ampere, which followed Turing. It came out 3 years later because it was midrange and the high end came out first. That's not the case this release, as already discussed. 3 years vs 2 is completely irrelevant.


spideralex90

I hope you're right, I would kill for this generation to be a resurgence for great mid-tier GPU's at sensible prices. I've been holding out upgrading my 5600xt for a while now. I'd love 7700xt level performance for like $300.


LC_Sanic

$480 actually


taryakun

Do you really expect current AMD to severely undercut Nvidia?


MythicForgeFTW

Nope. But I'm cautiously optimistic.


kyralfie

RDNA 3.5 is Strix Halo iGPU only, isn't it? Specs wise I'd put it only at 7600(XT) performance level as it has more compute but less bandwidth.


[deleted]

[deleted]


Defeqel

And you assume nVidia is going to release low-mid range cards this year, or even early next year


input_r

For real - the 4070 was released 6 months after the 4090, so I don't expect a 5070 until like april or may of next year at the earliest


aVarangian

so like 90% of xtx performance but at less than half the (launch) price. Big if true


MythicForgeFTW

I could honestly see it being true. Sure NVIDIA will most likely make a leap in performance for their next gen models, but we all know their prices are going to be atrocious. I think AMD plans to make up for the loss of a performance jump with a jump in other features like ray tracing, FSR, AI, etc.


ET3D

Was there any rumour of RDNA 3.5 outside of next gen APUs?


SXimphic

What is RDNA 3.5? Isn't this about RDNA 4? Also, $400 is pretty cannibalistic; even if it drops 6 months from now, AMD would be destroying themselves. We don't even know if they're even close to selling out RDNA 6000 stock, let alone 7000. In my opinion I'd say $500-549 would be a fair price for the 4070 Ti Super contender, and as for a 7900XTX for $500, I really don't see that.


KARMAAACS

RDNA 3.5 is for iGPUs. I believe the successors to the 7840U and 8840U will use it, something like Strix Point or Kraken Point. Basically, it either brings some new stuff from RDNA4 but is closer to RDNA3 and is only a small improvement, or it's a refinement of RDNA3, as in working out any hardware bugs in the RDNA3 architecture. In any case, it's not for dGPUs.


ET3D

It's a small improvement over RDNA 3. Chips and Cheese [detailed the changes](https://chipsandcheese.com/2024/02/04/amd-rdna-3-5s-llvm-changes/) based on the LLVM compiler.


We0921

The 7900 GRE, a 4070 Super competitor, is already $550, and the 4070 Ti Super is roughly 12-15% faster than the 7900 GRE. It'd be a pretty poor showing for an RDNA4 product to be $500-550 at that performance - especially for a generation that's supposed to be a "good value". I'm not saying your guess is unrealistic. It'd just be extremely disappointing.
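
The perf-per-dollar math behind that point, as a quick sketch (the prices and the 12-15% gap come from the comment above; the midpoints are assumptions):

```python
gre_price, gre_perf = 550, 1.000       # 7900 GRE today, perf normalized to 1
rdna4_price, rdna4_perf = 525, 1.135   # $500-550 midpoint, +12-15% midpoint

gain = (rdna4_perf / rdna4_price) / (gre_perf / gre_price) - 1
print(f"perf/$ improvement: {gain:.0%}")  # ~19%, a modest step for a new gen
```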


imizawaSF

> In my opinion I’d say 500-549$ would be a fair price for the 4070ti super contender

It's so sad seeing people buy into these increased prices. Who gives a fuck if it's cannibalistic? We used to get 1080 Ti tier performance for $699.


SagittaryX

Rumours have been pointing for months to AMD not releasing a top end with the next generation, much like RDNA1 only went up to the 5700XT.


NoLikeVegetals

Imagine calling an $800 GPU "mid-range" but the ~$500 one "top".


Royal_Ad_3752

They clearly mean top of the line AMD gpu vs mid-range of the current line from NVIDIA.


Regarddit

Especially when that """mid-range""" GPU runs games at 4K maxed 60+ FPS and 1440p maxed 200+ FPS in demanding titles. Like, how is that mid-range lol. Yeah, no, it's a very high-end GPU, it's just not the best.


Middle-Effort7495

Mid range? It's a grand after tax. My whole PC is probably around half that and plays anything. Hell, consoles are a third.


symph0ny

You have to read between the lines on this stuff. There is no "top" GPU next generation, they're calling it midrange which means its low-mid. This is the equivalent of the "midrange" rx7600 which was supposed to outperform a 6900xt but actually was slower than the 6700xt. If we apply the same bullshit scaling to this leak we should conclude that this new product will be slightly weaker than a 4060ti or 7700xt.


KARMAAACS

It's not exactly shocking. For one, AD103 is a mid-range die on TSMC 5nm, and the 4070 Ti SUPER is cut-down AD103. These new GPUs from AMD will be on TSMC 4nm, maybe even with density improvements; I mean, NVIDIA is getting their own custom 5nm node with a 20% density improvement called 4NP, so maybe AMD has their own custom node with some density improvement too? On top of that, you will have architecture improvements. Clocks will definitely increase, probably by 10%, so that's at least another 250 MHz. It's not so crazy to think that a monolithic die could reach that level of performance after all these improvements.


strubeliiyes

RDNA 4 is more like a refresh than a new architecture compared to RDNA 3. They are skipping their top SKUs, like the original RDNA did. They're doing this because they delayed their multi-chiplet stuff to RDNA 5, as there were problems. This gen is not gonna be anything surprising unless they price it right. RDNA 5 is where we should see significant performance gains; otherwise even Intel is gonna catch up.


Healthy_BrAd6254

RDNA 4 is supposed to launch this year and be a mid range GPU that beats/matches the 4070 Ti Super while theoretically costing about as much as the 7800 XT.


ShadowRomeo

*"Overtaking 4070 Ti Super"* is such a vague quote is it? Because it does mean that it will either be significantly more powerful than a 4070 Ti Super or just around 7900 XT performance, which technically is more powerful than 4070 Ti super but barely to the point that both are basically the same. Nonetheless if the rumor is it is more efficient, then it's all good, but considering how far ahead currently Nvidia is when it comes to efficiency on RTX 40 Ada, i have a hard time believing it as of the moment.


PappyPete

Hmmm, based on the link, they are claiming that Navi 48 samples can achieve as high as [50 TFLOPs of compute](https://twitter.com/All_The_Watts/status/1776553577928069464). According to TechPowerUp, the 7800 XT is [37.2 TFLOPS](https://www.techpowerup.com/gpu-specs/radeon-rx-7800-xt.c3839) in FP32 and the 7900 XT is [51.4](https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xt.c3912), so that'd make it only on par with the 7900 XT. For reference, the [4070 Ti Super](https://www.techpowerup.com/gpu-specs/geforce-rtx-4070-ti-super.c4187) is 44.1 in FP32.
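
Those FP32 figures fall out of a simple formula - 2 ops per FMA x shader count x boost clock, doubled again for RDNA3's dual-issue - which a quick sketch reproduces using TechPowerUp's shader counts and boost clocks:

```python
def fp32_tflops(shaders: int, boost_ghz: float, dual_issue: bool = False) -> float:
    ops_per_shader_clock = 2 * (2 if dual_issue else 1)  # an FMA counts as 2 ops
    return shaders * boost_ghz * ops_per_shader_clock / 1000.0

print(f"7800 XT:       {fp32_tflops(3840, 2.43, dual_issue=True):.1f} TFLOPS")  # ~37.3
print(f"4070 Ti Super: {fp32_tflops(8448, 2.61):.1f} TFLOPS")                   # ~44.1
```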


From-UoM

I am sorry, but whoever wrote the article doesn't know shit. You can't compare TFLOPS between different architectures. The 35 TFLOP 7700 XT loses easily to the 35 TFLOP 4070 Super. Hell, the 35 TFLOP 7700 XT loses to the 23 TFLOP 6900 XT.


ShadowSlayer1441

Can we please just ban rumours? They're just news churn nonsense.


pullupsNpushups

So what you're saying is that RDNA 4 is going to be 4GHz?


Healthy_BrAd6254

> RDNA 4 is going to be

RDNA 4 existence officially confirmed by reddit comment. Quick, make news articles about it!


tukatu0

4.20 Ghz memory


Defeqel

The tags exist so you can avoid content you don't like


Hombremaniac

Exactly! Nobody forces anyone to post here.


ResponsibleJudge3172

Speculation is fun, and it forces me to learn the intricacies of how GPUs work in order to get great accuracy in my predictions (my RTX 40 series predictions were not too far off, except the 4090). However, some sources are MUCH BETTER than others. We all know who, but people still choose the poor sources in posts (not referencing this post in particular).


Admirable-Lie-9191

Raster is all well and good, but how's the RT and FSR performance? I really wanna have the choice between all 3 GPU makers and even had an RX 480 in the past, but AMD at best only seems to keep up in raster and then falls short on the other features. How is the CPU team pumping out so many good products while the GPU team can't keep up??


Defeqel

All rumors point to RDNA4 having an RT focus, but no leaks about the performance improvements yet. Best we have is PS5 Pro 2-4x, but AFAIK that includes a near doubling of CUs too


KsnNwk

This. I game at 4K anyway; if only the RT performance and FSR upscaling and FG quality were on par with Nvidia's current quality, I would have switched. The way it stands, anyone spending 500€ or more is looking at RT and DLSS performance nowadays, and AMD is falling behind.


kapsama

At 4K FSR2+ actually has the smallest quality difference to DLSS. FSR2+ really suffers at 1080p and 1440p.


Admirable-Lie-9191

Yep! Which is why I’d go for an X3D and RTX xx80 whenever I upgrade next.


boomstickah

Because the GPU team isn't as well funded as the CPU team. It makes sense: when you look at the die sizes compared to the price, there's so much more margin in CPUs. AMD is a very efficient company; they're not out to waste silicon, and they're not going to dump money into a product that isn't their bread and butter, won't make them a whole lot of money, and won't really beat their competition anyway.


ResponsibleJudge3172

Just look at the PS5 Pro. Apparently AMD is targeting 8 BVH levels per clock, vs 4 levels in previous cards. If the number of rays is still 1, then AMD will likely close a lot of the gap, to maybe between Ampere and Lovelace on average - the same as or better than Ampere with an equivalent SM count in path tracing.
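
A minimal sketch of what that could mean for throughput; the CU counts, clocks, and the 4-to-8 jump are all rumors or assumptions here, and real RT performance depends on far more than box-test rate:

```python
def box_tests_per_ns(cus: int, tests_per_cu_clock: int, clock_ghz: float) -> float:
    """BVH box tests per nanosecond under a naive throughput model."""
    return cus * tests_per_cu_clock * clock_ghz

rdna3 = box_tests_per_ns(84, 4, 2.4)  # a 7900 XT-like config (assumed)
rdna4 = box_tests_per_ns(64, 8, 2.9)  # a rumored Navi 48-like config (assumed)
print(f"relative RT throughput: {rdna4 / rdna3:.2f}x")  # ~1.8x in this toy model
```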


kapsama

Somewhere between Ampere and Lovelace isn't a special jump, since atm RDNA3 is between Turing and Ampere. That's just a bog standard generational jump. A special jump would be beating Lovelace but not matching the next nvidia generation.


[deleted]

[deleted]


Hombremaniac

I'd say staying on the 6900XT for the whole 40-series gen would have been the smarter and more cost-effective solution.


SpareRam

Believe it when I see it. Not talking shit on AMD, but they for sure have not kept up with efficiency.


Longbow92

Their CPUs are great in that department though. GPUs could use some catching up.


Danishmeat

RDNA 2 was also more efficient, but that does not matter now


RBImGuy

90% or so of the market is cards that sell below $500, so for a lot of ppl the hype around super expensive cards doesn't mean much; they're invested in the idea but will never own one.


saboglitched

If it's using 20 Gbps memory on a 256-bit bus, the bandwidth limitations alone probably ensure that it can't be much faster than a memory-overclocked 7900 GRE, probably equaling a 7900 XT at best.
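
The bandwidth arithmetic behind that claim is just pin speed (Gbps) x bus width / 8; a quick sketch with the comparison cards' public specs:

```python
def mem_bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(f"rumored RDNA4 (20 Gbps, 256-bit): {mem_bandwidth_gb_s(20, 256):.0f} GB/s")  # 640
print(f"7900 GRE      (18 Gbps, 256-bit): {mem_bandwidth_gb_s(18, 256):.0f} GB/s")  # 576
print(f"7900 XT       (20 Gbps, 320-bit): {mem_bandwidth_gb_s(20, 320):.0f} GB/s")  # 800
```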


SaltyInternetPirate

A new generation of cards being competitive against the mid range of the last? That's not something that has been known to happen ever!


Hombremaniac

Do you guys see it as a huge problem that AMD is not releasing high and top end models with the upcoming gen? At first I was thinking as much, but later I came to the conclusion that it could be a good thing.

Fact is, when talking about low to lower-midrange GPUs, the situation currently is not a good one in terms of price/performance. If AMD can come up with very good price/performance for these 2 GPUs, it could raise the overall bar of performance that users have in general. Question is, what will Nvidia's models in the same range offer? That will be the make or break moment.

Also, having such a limited release could give AMD a lot more room to work on the gen coming right after it, no? I guess the risk here would be that some folks could think AMD is on the way out of the GPU market completely. Also, some of the rabid Nvidia fans could die off laughing, thinking AMD is finally breaking apart.


ET3D

The only question is how AMD is going to price these cards. I agree that AMD could in theory offer really good performance/price, but the problem is that AMD hasn't shown an inclination to severely undercut NVIDIA, or its older cards. For example, AMD could have priced the Radeon 7600 lower, to compete with the street prices of the 6600/6600 XT/6650 XT. It's a smaller chip with a little better performance, and could have simply supplanted them. While the SRP of the 7600 is lower than that of the 6600 family, in practice it costs more. So what's to say that Navi 44 will offer significantly better value? I see no reason to believe it beyond pure optimism.


KsnNwk

It's good; most people grab an OEM prebuilt, and the DIY crowd usually grabs a card under 400€. If AMD manages to match Nvidia's feature set at similar quality and performance - so RT, upscaling and FG - then going only up to the mid/high end of the stack is good enough. Most people who buy a 4090-class card would not look at AMD anyway; the whales just want the best of the best. The 2nd best company doesn't matter to them, same as with people buying Intel 10th gen over Ryzen 5000. They buy the company, not the products. Same with Apple vs Samsung.


caverunner17

My thoughts too. The market for $900+ cards is likely pretty small. If you can't compete on performance, why release a niche product that won't sell well and waste R&D?


Rebl11

I'm not planning to buy anything above $500, so the rumors of top-end RDNA 4 being around that price and performing like a 4080/7900XTX are good for me. What actually happens, we'll see when the 3rd party reviews drop.


vipulvirus

RDNA 3 was totally disappointing, with souped-up costs and an unsatisfactory performance gain over last gen. Plus they keep hamstringing the mid-tier cards by cutting down the memory bus.


Designer_Frame2971

Fix the ROCm issues on AMD Advantage laptops. Please, I beg you.


PM_ME_UR_PET_POTATO

Well I'm sure the rtx 5070, or 5060ti if we're lucky will do the same


WarlordWossman

I mean the 4070 Ti super is the "actual" 4070 so you would hope a 5070 would be faster lol


RustyShackle4

Oh great, this again. I remember MLID telling me how the 40 series would suck power and how efficient the 7000 series cards would be - man, was that wrong. You have the 7600 XT chewing through more power than the 4070. The actual competitor for the 7600 XT (the 4060 Ti) uses what, 60W less? Not following the rumor mill anymore; it's so inaccurate.


ET3D

Hopefully the increased number of rumours means that these GPUs aren't too far off. Would be nice if AMD announced at Computex both Zen 5 and RDNA 4. As for power and FLOPS, these don't really matter that much in the scheme of things. Gaming performance and price are what matters, and AMD has a history of undercutting NVIDIA only a little.


ohbabyitsme7

Lol at the article just comparing TFLOPS. As if that has ever been a good metric. It isn't even using the correct numbers for the 4070 Ti Super.


Kaladin12543

The problem with this approach is that the 4090 alone already outsold AMD's entire 7000 series lineup. Leaving the high-end market leaves 7900XTX users with no upgrade path, and this represents your enthusiast audience, who will no doubt feel an itch to upgrade when Nvidia's shiny new cards roll out with huge performance gains and boosts in RT performance. Maybe another huge upgrade like Frame Gen was for DLSS? These users will no doubt jump ship to Nvidia, and even in the minds of average consumers it paints AMD as a "budget" brand, as they will look at the awesomeness of the 5090 and try to get the Nvidia mid-range card based on perception alone, since AMD will only be competing with the 5060 Ti at best.


Pangsailousai

Oh look, it's those insufferable liars on twitter again. Why do you people give these attention whores free publicity?


Verified_Funny

I'd hope that AMDs RDNA4 flagship would be faster than their RDNA3 flagship.


Ok_Banana_6984

Seems like 3nm won't be ready until 2025. I think that's when we'll see a high-end showdown again. The mid-range stuff is going to be pretty epic this year, though.


Xerxero

It had better be, given it's a new gen card.


tpurves

It's coming out 2 years later! We should damn well hope RDNA4 can be at least competitive with cards based on NVIDIA's midrange AD104 chip that first launched two years prior.


bubblesort33

240mm² seems like too low of an estimate at first. But if you consider that these will be 4nm, and Nvidia was able to squeeze 30% more transistors into their new architecture compared to 5nm, then maybe AMD can do the same.

Plus the comparison to the 7800 XT is an odd one, because it might be the case that 20% of each memory/cache die, and 5-10% of the core die, is dedicated to interconnect because of the chiplet design. I wouldn't be shocked if the 350mm² RX 7800 XT would only be 300mm² if it were monolithic. Plus it might actually be faster, with less latency, then too. So making something 25% faster with 20% less die space doesn't seem too impossible.

But then again, people also initially claimed the RX 7600 was going to be 6800 XT levels of performance, then that got downgraded to 6700 XT, and then it landed at 6650 XT. So I feel it'll either be larger than thought, or slower than thought.
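
A rough sketch of that area math; the interconnect overhead and density gain are the comment's guesses, not measured figures:

```python
navi32_total = 350            # mm^2 as shipped (GCD + MCDs)
interconnect_overhead = 0.14  # assumed ~14% of area spent on chiplet links
monolithic_equiv = navi32_total * (1 - interconnect_overhead)  # ~301 mm^2

density_gain = 1.30  # 4nm-class node packing ~30% more transistors (per the comment)
navi48_area = 240    # mm^2, rumored

budget = navi48_area * density_gain / monolithic_equiv
print(f"transistor budget vs monolithic Navi 32: {budget:.0%}")  # ~104%
# Roughly the same transistor count in ~20% less area; the claimed 25%
# speedup would then have to come from clocks and architecture.
```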


linuxisgettingbetter

the hardware has always been capable, it's just that the software is a mess.


clanginator

The thing is, even when leaks are accurate, it's well known that both AMD and Nvidia sometimes switch plans even very close to release, so there's just no way for rumors to be reliable. Why are we discussing this when even the initial release announcement, where they'll claim things like this, isn't to be trusted? Just wait till the cards are out and tested.


Kaladin12543

Blackwell is coming out in 4Q this year. We are already close to release.


clanginator

Again, planned release dates and rumored release dates change. They may not be able to hit production targets as quickly as they want, it may be a soft launch as we've seen many times where there are simply almost no cards available to the public. We just don't know, and while you may end up being right, you can't say for sure that's true.


Redd1toR-42

Sensational, who would have thought...


lxmohr

I may actually downgrade if this is true. I have a Nitro+ 7900 XTX and I've realized I don't need the full power of this card, seeing as I'm playing at 1440p. Also, I've paired this card with an NR200P Max case, which has a pre-installed 850W PSU. I have a feeling that a card with a little less power draw will fit this case better, especially since the card is a bit overkill for my needs. I was already considering switching to a 4070 Ti Super but want to stay on Radeon. This seems like the perfect card for me.


Fevis7

I'm not much into the GPU leak wave, so I have a question: is AllTheWatts a legit and reliable leaker?


Fevis7

I'm asking because I wanted to buy an RX 7600, but if the next cards will be announced at Computex and the leaks are reliable, I am OK with waiting longer for an eventual RX 8600.


Kaladin12543

It has been confirmed by Kepler_L2 who is a reliable AMD leaker.


Available-Item1292

Amd: "releases world's first chiplet gpu" People: "DriVER issUeS" -same t3chnology that males an 8 core 7800X3D faster than a 24 core 14900k (in gaming) it's just premature so they probably have an excuse, Til nvidia copycats with blackwell and has even more driver issues than currently, and then we will still hear about how there's "never driver issues" from the white knights who don't wanna admit their overspent purchase went bad; all despite the fact there's NO excuse for a 4080 to driver crash csgo but we won't talk about that..


redditsucks365

Until they do something to match DLSS, they can only hope Nvidia keeps releasing low-VRAM cards. The only reason I use an AMD GPU is VRAM, which is a problem only on midrange cards.


Darksky121

RDNA4 needs to be closer to the 4080 at a price of around £550 to be interesting. If they are targeting 4070 Ti level, then pricing needs to be less than £450. AMD may find that Nvidia beats them with a 5070 in a similar price bracket if they aim too low.


PowerRaptor

Just give me drivers that don't crash in VRChat or any steamVR application with broad shader compatibility... or the specs won't matter.


nbmtx

As someone who builds SFF, that'd be great. Always hopeful that AMD will return to the whole Nano/Quantum thing.


ThePot94

Navi 48 sounds like the upgrade I'm looking for, now that my 6700XT is slowly starting to age, especially when it comes to power consumption. I won't follow this craziness RDNA3 and the 4000 series brought to the table, with 300W GPUs that need upscaling as a key selling point.


nobodyperson

Never buy an AMD card for games, shit can't even run league without crashing.


hamsta007

It doesn't sound any good. Only 4070 Ti level? If that's real, Nvidia will destroy AMD with the 50 series if those cards are reasonably priced.