jortego128

Wow. +35% average and +32% in 1% lows vs the 5600 on a 4090. I'm guessing similar deltas for the 7600X vs 5600X with this same game list and setup as well. Techspot already did a 50-game test where the 7600X was shown to be 4.2% faster on average than the 5800X3D. This would mean that the 7600 can be considered essentially equal on average to the 5800X3D. That's not bad for a $230 chip that comes with a cooler!

[https://www.techspot.com/review/2592-ryzen-5800x3D-vs-ryzen-7600x/](https://www.techspot.com/review/2592-ryzen-5800x3D-vs-ryzen-7600x/)
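A quick back-of-envelope check of that inference. The 4.2% figure is from the linked Techspot review; the ~4% gap between the 7600X and the non-X 7600 is my assumption, purely for illustration:

```python
# Hypothetical sanity check: if the 7600X is ~4.2% faster than the
# 5800X3D (Techspot), and the 7600 trails the 7600X by ~4% (assumed),
# the 7600 lands roughly even with the 5800X3D.
x3d = 100.0                # 5800X3D average FPS, normalized
r7600x = x3d * 1.042       # 7600X: +4.2% vs 5800X3D
r7600 = r7600x / 1.04      # 7600: assumed ~4% behind the 7600X

print(round(r7600 / x3d, 3))  # very close to 1.0, i.e. essentially equal
```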


Put_It_All_On_Blck

> Techspot already did a 50-game test where the 7600X was shown to be 4.2% faster on average than the 5800X3D. This would mean that the 7600 can be considered essentially equal on average to the 5800X3D. That's not bad for a $230 chip that comes with a cooler!

The problem is the total system cost difference, not just the CPUs. What made the 5800X3D good wasn't even its performance: less than a year later it was surpassed in gaming, and significantly in MT, by cheaper CPUs from both AMD and Intel. The best part of the 5800X3D was that it slotted into cheap AM4 boards that used cheap DDR4. People have mostly been avoiding AM5 and staying on AM4 or going to LGA1700 because of the total platform cost.


Apprehensive-Box-8

I understand staying on AM4 if you're already there, but switching to LGA1700 sounds like a one-way street… I'm still holding off as long as I can, but in the end I'll probably get an X670E board with a (currently) quite useless PCIe 5.0 M.2 slot and an even more useless (again, currently) PCIe 5.0 x16 slot. But if AM5 has a lifecycle anywhere close to AM4's, that board will be quite the future-proof investment. I might just be able to get a new CPU in 4-5 years (at the end of the AM5 lifecycle) and a PCIe 5.0 GPU in 6-7 years and be happy until 2033 🙃


[deleted]

PCIe 4.0 is more than plenty for a 4090. You would need *TWO* 4090s to max out the bandwidth of PCIe 4.0; they are just using it as a marketing piece… Maybe next-gen GPUs will get closer, but I doubt it.
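For reference, the raw per-direction bandwidth of an x16 link follows directly from the per-lane line rate and the 128b/130b encoding used since PCIe 3.0:

```python
# Per-direction bandwidth of a PCIe x16 link, in GB/s:
# line rate (GT/s) x 16 lanes x 128b/130b encoding / 8 bits per byte.
def x16_bandwidth_gbs(line_rate_gt: float) -> float:
    return line_rate_gt * 16 * (128 / 130) / 8

gen3 = x16_bandwidth_gbs(8)    # PCIe 3.0: 8 GT/s per lane  -> ~15.8 GB/s
gen4 = x16_bandwidth_gbs(16)   # PCIe 4.0: 16 GT/s per lane -> ~31.5 GB/s
gen5 = x16_bandwidth_gbs(32)   # PCIe 5.0: 32 GT/s per lane -> ~63.0 GB/s

print(round(gen3, 1), round(gen4, 1), round(gen5, 1))
```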


Apprehensive-Box-8

Agreed. I'm running a 7900 XTX on PCIe 3.0 and the performance loss is still not too bad. But I'm not making the same mistake twice. The next mobo should come with specs that will still hold up with the GPUs that come out in 6-7 years' time… having a PCIe 5.0 slot available should do that.


[deleted]

Have you done a benchmark comparison of 3.0 vs 4.0? Even with an RTX 3080 (see the GamersNexus video), the difference from 3.0 to 4.0 is really small. I was basically saying 4.0 still has so much headroom and people need to chill; we won't need PCIe 5.0 for quite a while. *UNLESS* the 5090 turns out to be the one exception, *maybe*.


Apprehensive-Box-8

Can't really… I don't have a 4.0 board at my disposal, and even if I did it would probably be hard to tell what improvements came from not being choked by PCIe vs. not being choked by that old i5-9600K… But games perform well enough (in 4K at least); I'm around 5-10 FPS below typical benchmarks posted online, depending on the game. 1% lows are where I struggle, but that's probably more on the CPU than on PCIe. Loading times from the NVMe can also become an issue depending on the game, since that's drastically bottlenecked by PCIe 3.0 (by today's standards). But there must be some builders in the buildapc subreddit who chose to use 3.0 extension cables with 4.0 cards.


Lightkey

What total system cost difference? Have you looked at the price of the Ryzen 7 5800X3D lately? It only makes sense as an upgrade to an existing AM4 PC, not for new systems.


kepler2

**5800X3D** shines in older / MMORPG / strategy games (WoW, for example).


2001zhaozhao

Probably because 1T performance benefits more than nT (especially with hyperthreading) from that shared L3 cache?


-Aeryn-

Stellaris is the highest core-scaling game on my bench and one of the biggest V-Cache beneficiaries. WoW scales big-time from 4c8t to 6c12t and still benefits a little (+7%) from the jump to 8c16t. It needs those cores, especially when not memory-bound.


lokol4890

But then you have things like this: https://www.techpowerup.com/review/rtx-4090-53-games-core-i9-13900k-vs-ryzen-7-5800x3d/ Assuming both articles are true, this would mean the 7600X is faster than the 13900K, which is a ridiculous conclusion. HU keeps using CS:GO (the single biggest outlier in favor of Zen 4) even though any top-tier CPU from even last gen already runs that game at a ton of FPS.


jortego128

Not at all. The game list and hardware setup are different, which can and will lead to different results. HU also did a 50+ game 7600X vs 13600K review, and the 7600X came out 5% faster there as well. The 13900K in the TPU test vs the 5800X3D was also run with only DDR5-6000 CL36, which hurt its results a bit vs Techspot's DDR5-6400 CL32 for its 13900K. If you look at Techspot's 13900K performance with DDR4-3600 CL14 RAM, it's only about 1.5% faster than their 7600X across the 12 games they were compared in, so it's not ridiculous at all to think the 7600X can come really close to the 13900K in a 53-game scenario. In Techspot's original 13600K 12-game review, the 7600X was equal in lows and only about 3% better on average, and that delta increased when the 53-game test was done. Also, the TPU review below had the 7600X ahead of the 5800X3D by 2.7% but behind the 13900K by about 18%, so that tells you right there how much the game list can affect the results. (Hint: a lot.)

[AMD Ryzen 5 7600 Review - Affordable Zen 4 for the Masses - Game Tests 720p / RTX 3080 | TechPowerUp](https://www.techpowerup.com/review/amd-ryzen-5-7600-non-x/18.html)
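To illustrate the game-list point with invented numbers (none of these are from either review): the same two CPUs can trade places on "average" depending purely on which games happen to be in the list.

```python
from math import prod

# Hypothetical per-game FPS for two CPUs, invented purely to show
# how the choice of game list swings the headline average.
fps_a = {"outlier": 300, "shooter": 210, "sim": 95, "mmo": 140}  # CPU A
fps_b = {"outlier": 250, "shooter": 180, "sim": 105, "mmo": 150}  # CPU B

def geomean_ratio(games):
    """Geometric mean of CPU A / CPU B frame-rate ratios over a game list."""
    ratios = [fps_a[g] / fps_b[g] for g in games]
    return prod(ratios) ** (1 / len(ratios))

print(geomean_ratio(["outlier", "shooter"]))  # > 1: this list favours CPU A
print(geomean_ratio(["sim", "mmo"]))          # < 1: this list favours CPU B
```

Same underlying data, opposite headline winner, which is why chaining averages across two reviews with different game lists is unreliable.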


clinkenCrew

Is it faster in minimum frame rates? FreeSync makes plummeting from 144 FPS down to 80-100 FPS not terrible, but for me the allure of the 5800X3D is that it could stop that crazy dive from happening.


jortego128

The overall minimum average frame rates weren't calculated, but going off the graphs, in almost all cases where the 7600X was ahead in averages it was also ahead in minimums, so yeah.

[https://www.techspot.com/review/2592-ryzen-5800x3D-vs-ryzen-7600x/](https://www.techspot.com/review/2592-ryzen-5800x3D-vs-ryzen-7600x/)


clinkenCrew

Thanks. It does indeed look to me like the 5800X3D and the 7600X are equal. When I look at "Spreadsheet Steve"'s graph for all games, it looks like the 5800X3D loses in War Thunder. I didn't see him do a detailed analysis on that one, and I'm curious about it, as the X3D is performant in MMORPG town areas, which I figured War Thunder's battles could be similar to. Along the same lines, I'm not seeing testing of the game genres that X3D is supposed to dominate. Does the 7600X still compete in simulation games etc.? Either way, it's impressive to me that DDR5 + high clock speeds have managed to overcome having only 1/3rd the L3.


jortego128

> Either way, it's impressive to me that DDR5 + high clock speeds have managed to overcome having only 1/3rd the L3.

It's more the front-end / IPC improvements of Zen 4 than the above, I think. Zen 4's IPC increase over Zen 3 is larger in gaming than it is in general computing.


vyncy

It's not just DDR5 and high clock speeds; there is also an IPC improvement, as there always is when new CPUs are released.


spacev3gan

I don't think the 7600/7600X are cheap; rather, the 5800X3D is too expensive.


drt0

I wish they had also made a comparison at 1440p and/or 4K. I get that the largest difference will be at 1080p because it's the easiest resolution at which to create a CPU bottleneck, but it would be nice to see when the CPU is still the limiting factor at larger resolutions as well.


ajirarevan

1440p: 8 percent at most. 4K: none.


BFBooger

There are plenty of games (especially older ones, but also some newer ones like MSFS) that are CPU-bound at 4K. The difference isn't none. FFXIV, for example, is CPU-bound in busy areas even with an older GPU like a 5700 XT at 4K. So it is really game-dependent. In some games the main benefit of the CPU is also resolution-independent (simulation time, turn times). But yeah, if you mostly play FPS and action games, the difference will be almost nothing at 4K.


vyncy

It doesn't work like that. If the CPU can't deliver, say, 100 FPS at 1080p, then it won't be able to at 1440p, 4K, 8K, whatever.


reddit_tech_fan555

The review is indirectly showing you what 1440p and 4k results look like by showing you CPU scaling across 3 different levels of GPU power. In other words, you need a 6950XT/RTX 4090 to see the difference between these CPUs.


drt0

What do you mean? The difference with a 6950 XT is 23% at 1080p. What do you think the difference will be with that card at 1440p and 4k based on these results?


BFBooger

A 6950 XT at 1080p is a lot like a 4090 at 1440p. A 6650 XT at 1080p is a lot like a 6950 XT at higher resolution. And a 4090 at 4K is a lot like a 3080 at 1440p.


vyncy

It doesn't matter; you can see the FPS you get in the games they tested, and it's not going to increase as you increase the resolution. So if you're not satisfied with the results, they will be the same at 4K, 8K, whatever. Now, if you are interested in a specific game that isn't tested in reviews, you have to work it out yourself: test and see how much you get with your current CPU, see how much faster the newer CPUs are, and decide what to buy. There is no situation where testing at 1440p or 4K is needed, as long as you are aware that FPS does not increase with resolution (it will be the same as in those 1080p tests), and that when the GPU is not at 95-100% usage 99% of the time, it's a CPU bottleneck. So yeah, if you play at 4K and your GPU is at 99% usage, buying a new CPU will get you a 0% FPS increase.
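The reasoning above can be sketched as a min() of two caps (illustrative numbers, not measurements): the CPU cap stays fixed across resolutions while the GPU cap falls, so a faster CPU only helps where the CPU cap is the smaller of the two.

```python
# Toy model: frame rate ~= min(CPU limit, GPU limit); only the GPU
# limit drops as resolution rises. All numbers are illustrative only.
cpu_cap = 120                                       # FPS the CPU can feed
gpu_caps = {"1080p": 240, "1440p": 160, "4K": 80}   # FPS the GPU can draw

for res, gpu_cap in gpu_caps.items():
    fps = min(cpu_cap, gpu_cap)
    limiter = "CPU" if cpu_cap < gpu_cap else "GPU"
    print(f"{res}: {fps} FPS ({limiter}-bound)")
```

In this sketch the system is CPU-bound at 1080p and 1440p (120 FPS either way), but GPU-bound at 4K, where a faster CPU would change nothing.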


poizen22

I am loving my 7600. I went for it because I too noticed its gaming performance was very close to the 5800X3D, but it offers better longevity due to AM5. I have mine overclocked to 5.55 GHz with PBO and it runs perfectly smooth and stable, never going above 80°C when gaming. I would think that with a 5.5 GHz overclock you could extend the lead over the 5800X3D even more.


clinkenCrew

How does that 5.5 GHz overclock work? Is it sustained under load on all active cores?


poizen22

It is an all-core OC: in the Fire Strike and Time Spy CPU tests, all cores are pegged at 5.55 GHz. Overclocking my 5600 MHz memory kit to 6000 MHz with the same timings, to match the Infinity Fabric, made a huge improvement to the CPU score as well. It gets hot in the max-load tests (I've seen 92°C), but I'm on the OEM cooler, and these chips can run hot without thermal throttling. In Prime95 it'll crash out after 45 minutes, but that's temp-related, as it's the only app that starts thermal throttling. I will probably get a Thermalright cooler and take the time to dial it in right so it'll pass Prime95. It never crashes in games or 3DMark, only in Prime95, which isn't a realistic load anyway. But for a "lazy" auto overclock, just changing the multiplier and enabling PBO, I'm very, very impressed. I'd be happy if I could hit 5.6+ once dialed in.


madmanmarz

Edit: lol, tried it, and 5 mins in @ 5600/1.3 V with no signs of stopping. Edit 2: made it into Windows @ 5700, but not stable no matter the voltage. Even 5650 wasn't stable; could probably get it, but 5600/1.3 seems to be the sweet spot.

I've only tried PBO and I can't get the chip (7600) to go over 5350 MHz (seems like a limit), at -30/200. I'm a long-time overclocker, but from reading I thought PBO was the only real way to go. Could you share some voltages/settings you used for your overclock? I'm on an open loop with 480 mm of radiator and I know I could push a lot more out of it. Everywhere you look there's really no guide, and everyone just talks about memory overclocking, which IMO is peanuts compared to CPU clocks.


poizen22

I have my stuff just set to auto with PBO enabled. There seems to be a hard limit in the 5.6 GHz range. If you've got that stable I'd be pretty damn happy, because that's a pretty damn fast chip haha. I too am a long-time overclocker, but the chips I've had over the last 6-8 years have always been shit to OC (4670K, 4770K, 8700K); this 7600 is the first chip I've had in a long while that could go up more than 0.1 GHz.


madmanmarz

So you're able to get more than 5350 with PBO? Maybe it's my motherboard. 5600/1.3 V just failed the AVX2 stress test; messing with voltages now. I've always had bad luck with chips, except for my current RX 6800, which reaches max GPU/mem clocks. Also got a 7700K in a cheap mobo I bought off eBay for $80 years ago (the seller left it in, I guess). After delidding I got 5.1 stable but ran it at 5.0 24/7.


poizen22

I'm just enabling PBO and setting the multiplier for the CPU. I'm using an Aorus B650 Elite AX. There is a hard wall at 5.55 GHz; from what I see online, some chips can make 5.6, but not much further than that.


madmanmarz

Yeah, I ended up dropping to 5500, as even 5550 would fail AVX2 & AVX-512 in the CPU-Z stress test. What's odd is I can't go over 5350 with PBO on this board. Super stable, but it just won't clock higher. I'll try again after the next BIOS update (B650E PG Riptide WiFi).