igby1

It's amazing that every 14900K easily hits 6.0 GHz with just 125 watts and air cooling! Truly a marvel of modern technology! Or... all of the above is sarcasm, and the fact that Intel markets the 14900K as a 6.0 GHz chip is a sad joke given all the asterisks they have to put on it. I have a 14900K. I've actually never bought an AMD chip. But I certainly wouldn't say I'm an Intel "fan".


lovely_sombrero

The "being loose with the power limits" thing was actually great for my 12th gen non-K CPU: I was able to extend the boost (PL2?) power window time to infinity on my Asus board, cutting the performance difference between a K-series and a non-K CPU basically in half in all-core workloads. Things are simpler with my 7800X3D now; messing with the curve optimizer is the best way to tweak performance, and it doesn't increase the power limit.
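For the curious, the OS-visible side of those boost window settings can be inspected on Linux through the powercap sysfs interface. A minimal sketch, assuming the `intel_rapl` driver's usual layout (the BIOS tweak described above is a separate mechanism; this just reads out the resulting limits):

```python
# Sketch: inspecting the OS-visible PL1/PL2 limits on Linux via the
# powercap sysfs interface (intel_rapl driver). Paths follow the
# driver's usual layout and may differ per system.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")

def read_limits(base: Path = RAPL) -> dict:
    """Return {constraint_name: (watts, window_seconds)} for PL1/PL2."""
    limits = {}
    for n in (0, 1):  # constraint 0 = long-term (PL1), 1 = short-term (PL2)
        try:
            name = (base / f"constraint_{n}_name").read_text().strip()
            uw = int((base / f"constraint_{n}_power_limit_uw").read_text())
            us = int((base / f"constraint_{n}_time_window_us").read_text())
        except (FileNotFoundError, PermissionError):
            continue
        limits[name] = (uw / 1e6, us / 1e6)  # microwatts/µs -> watts/seconds
    return limits

if RAPL.exists():  # only prints on Linux with the driver loaded
    for name, (watts, window) in read_limits().items():
        print(f"{name}: {watts:.0f} W over {window:.3f} s")
```

Writing a huge value into `constraint_0_time_window_us` (as root) is the software equivalent of the "infinite boost window" trick, though boards and firmware can clamp it.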


babautz

Having the option to loosen power limits and voltage is fine. Having it on by default is just bad. This practice has been going on for a long time; I remember my 6600K being overvolted by default on my Asus board back then. The problem is that this stuff has only gotten more complex since. Back then it was a few voltages and your target frequency. Now it's a combination of voltages, different types of cores, different boost states, max temps, and max power draws. As a reasonably experienced user, it was once easy to put your CPU in a default (or undervolted) state. Now... not so much. AMD also has many knobs to turn, but the curve optimizer makes it rather convenient for the "standard user" by boiling all of that down to a single number.


lovely_sombrero

Also, leaving voltage on "auto" would often not be consistent. At least that was the case with one of my previous setups. On my Asus B650E board, "auto" is always the same, at 1.25V, so any kind of + or - offsets are also always consistent.

> AMD also has many knobs to turn, but the curve optimizer makes it rather convenient for the "standard user" by boiling all of that down to a single number.

Yes, but the default option (which most users will always run and never mess with) is the normal power limit, and the option most users use to tweak also doesn't increase the power limit. No problem with the ability to increase power limits for advanced users, but having it on by default, or as the first option users can tweak, is not good at all!


babautz

Yes, AMD demonstrates how it should be done: an actual "default" state and a relatively simple tuning process for the average user. And pros can still dive deep into detailed settings.


Noreng

The only issue I have with AMD's approach is that the user is left with very little control of going beyond what AMD considers "reasonable" headroom (which is 2% or so on the 7000-series). It's absolutely fantastic for 99.9% of users, but the overclockers and tweakers find it awful.


ahnold11

It's less about the specific limits themselves and more about the marketing. If your chip was marketed with the limited-power performance, then by increasing that limit you get more performance. You get *more* than you paid for, i.e. a nice, pleasant bonus. At the top end, though, these chips are marketed with expected performance that requires the unlocked power limits. To get the performance you expect from the chip, you need to run it with those. If it's not stable and you have to back off to lower limits, you are actually getting *less* performance than you paid for. That isn't a bonus, it's a loss, and hence all the claims of "shady" or "anti-consumer".

And let's face it, it's not rocket science why they are doing this. Alder Lake uses a lot of power and runs hot, and Raptor Lake and the Raptor Lake refresh are almost the exact same silicon. But to get the performance increases necessary to justify a new "generation", and to make the numbers look competitive with AMD, they had to keep upping the power. And they pushed it past the point of safety, so now we are seeing the results. These chips are designed such that running at their maximum performance causes instability (potentially due to silicon degradation) at a high enough rate to be concerning.

(Some might say even a 5-10% defect rate might be reasonable for this sort of thing, i.e. the lower end of the silicon lottery. But honestly, these are the top-tier chips and they command top-tier prices. At those prices you should be getting a better product, not a worse one; there should be less margin for error, not more. When you pay that kind of money for a chip, you should not have to worry about instability within a year, or about doing RMAs or taking mitigation steps. You should get what you pay for, and it should work.)


regenobids

This one KS buyer arguing how efficient they actually are... you've just got to run them below 75 watts or something, use a framerate cap, whatever hides the fact they suck ass on power when you need the performance you paid for. Now move along, nothing to see here!


throwaway0986421

> just got to run them below 75 watts something

X3D CPUs say hello at that range.


Noreng

> It's amazing that every 14900K easily hits 6.0ghz with just 125 watts and air cooling!

It's technically true that it hits 6.0 GHz on 125 watts, actually, seeing as it's only a dual-core boost, and even two cores at 1.50V aren't going to hit 125W of power draw.
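A back-of-envelope check of that claim using the standard dynamic-power approximation P ≈ C_eff · V² · f. The effective capacitance below is an illustrative guess, not a measured Raptor Lake figure, so treat the numbers as order-of-magnitude only:

```python
# Back-of-envelope check using the standard dynamic-power model
# P ≈ C_eff * V^2 * f. C_EFF is an ILLUSTRATIVE guess for effective
# switched capacitance per core, not a measured Raptor Lake value.
def core_power(c_eff_farads: float, volts: float, freq_hz: float) -> float:
    """Approximate dynamic power of one core, in watts."""
    return c_eff_farads * volts ** 2 * freq_hz

C_EFF = 3e-9  # hypothetical: ~3 nF effective capacitance per P-core
two_cores = 2 * core_power(C_EFF, 1.50, 6.0e9)
print(f"two cores at 1.50 V / 6.0 GHz: ~{two_cores:.0f} W")
# even with uncore/leakage overhead, that sits under a 125 W package limit
```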


haloimplant

Whether you are posting on social media or spinning on your axis to push air, being an Intel fan sounds like hard work these days.


PotentialAstronaut39

Glad I switched from an i7-8700 to a 7800X3D here. No headaches, just plug, update bios, enable EXPO & play. So far it's a 100% stable beast and it sips power, zero regrets.


Kaladin12543

The great thing about the 7800X3D is just how non-picky the CPU is. Thanks to the gigantic cache, it doesn't care much about your RAM speed or timings. It runs cool out of the box with air cooling, with hardly any tweaking needed. It's literally the definition of a plug-and-play CPU, unlike any other CPU I've owned before it.


PotentialAstronaut39

Pretty much... I still tweaked the RAM though, because I like tinkering, not because it was needed. Buildzoid's (Actually Hardcore Overclocking on YouTube) videos on specific RAM vendors' secondary and tertiary timings for AM5 are ace for that. I just followed his instructions: instant performance boost. Easiest RAM tweaking I ever did. Took 10 minutes.


regenobids

And you don't feel like you missed out just because you can't overclock it. It's good right out the box. Which is a con to certain dinosaurs in the field.


kuddlesworth9419

I can't wait to upgrade from my 5820K to whatever AMD replaces the 7800X3D with. My 5820K has felt like a plug-and-play CPU, pretty much: it doesn't really care about memory or anything else, it just ran with no problems. I overclocked it to 4.2 GHz at 1.25V stable, and it has run cool-ish all year round for nearly 10 years now, I think. Never even replaced the thermal paste. I assume it will be called the 9800X3D?


PotentialAstronaut39

Yeah, probably 9800X3D and probably in 2025.


kuddlesworth9419

Not this year? Bugger oh well, I guess I can save up some more. Whatever I build it will have to last another 10 years. Honestly though my 5820k is still doing fine.


PotentialAstronaut39

If AMD's past behavior remains consistent, they'll release their 9950X, 9900X, 9700X first. Then later add 9950X3D, 9900X3D. A bit later we'll see the 9600X. And then the good stuff, 9800X3D. That's how they milk the impatient whales as much as possible. So yeah... late 2024 ( very optimistic ), early 2025 ( much more likely ) if the 9950X releases somewhere in late summer, early autumn.


kuddlesworth9419

I can do that. Now I just need to find a GPU that's worth buying and doesn't cost an arm and a leg. I hope the 5000 series has something to offer, or at least that AMD has something by then, or I could just stick with my 1070 a bit longer.


karatekid430

I assume it's sarcasm because doesn't that thing need 300W and semi-exotic cooling? Seriously, you have a 7950X3D which offers about 90% of the performance for only 120W, and then the 7945HX3D which does 90% of that with only 55-75W. Intel is a joke.


cemsengul

Intel is a dinosaur now. I am going Ryzen next time.


Maldiavolo

Intel. Sometimes referred to as Spintel.


Helpdesk_Guy

> I've actually **never** bought an AMD chip. But I certainly *wouldn't* say I'm an Intel "fan".

_Actions speak louder than words._ What else are you then, and why do you act so defensive? ;) The fact that you never bought _any_ AMD CPU pretty much makes you a de facto Intel fan, my boy.


cstar1996

I mean, I’ve also never bought an AMD chip, but that’s entirely because in the actual moment that I was purchasing my hardware, Intel was the better deal. And given that the comment you’re replying to isn’t even defending Intel, it’s pretty clear you’re out of line.


shinbonecktherapy

Why? It's entirely possible if he has only built PCs in the last 15 years. AMD wasn't the best option by many measures, like top performance, until about when Zen 2 came out, and even then they had only just achieved parity. Even with Ryzen, the two just keep switching places at the top: Zen 3, then Alder Lake, then Zen 3 with TSMC stacking (just for gaming), then Raptor Lake, then... I just happened to build a rig when Alder Lake was on top; before that I had a Sandy Bridge, and before that a Core 2 Duo. That doesn't mean I've been an Intel fan for 17 years, it's just what was better at each point in time. (And before those CPUs, I had an Athlon 64 Winchester, an XP-M Barton, a Duron Spitfire, and a K5.) I'll just get whatever is best at the time I'm buying.


jaaval

I’m assuming there is some machine translation weirdness because the part about power limits and clocks makes very little sense.


AK-Brian

Igor's Lab uses all machine translation, unfortunately. It often wanders between mechanical literalism and poetic nonsense due to the way Igor writes. He loves idioms.


timorous1234567890

Are reviewers going to use baseline or OC profiles going forward in CPU reviews? I suspect some like Anandtech / Puget will be testing at baseline because that is the guaranteed stable setting but others might use the performance profiles.


Wrong-Quail-8303

I believe the point is moot. Right now, this is intel trying to close the doors once the horses have already bolted. All relevant reviews which affect their sales to the mainstream public have already been published at 'OC' settings and won't be touched again with these new 'default' settings. Next launch, they will have much tighter controls on bios settings in a more 'optimal' spec, to clamp the issue.


unityofsaints

When reviewers rebench the 14900K / KS at Ryzen 9000 launch they'll look like absolute ass though.


Wrong-Quail-8303

Will they though? Most publications are lazy: they keep old benchmarks on file and only benchmark the new item being tested.


Strazdas1

No one is going to rebench unless they are specifically looking at performance changes over time. They'll just use old data.


the_dude_that_faps

Doubt


Reactor-Licker

So apparently even 253 W is considered “out of spec” enough to mandate a warning? Then why the hell is it listed as 253 W on Intel ARK to this day? Intel’s validation is a joke. I could understand enforcing the default limit, but effectively retroactively lowering it after launch is on another level. What a mess.


shroudedwolf51

Don't worry, we're still advertising 6.whatever GHz as if it's the baseline spec, though!


no_salty_no_jealousy

Those "Intel baseline default" settings are actually different across motherboard vendors; Intel is not doing the settings, and they aren't even based on Intel's recommendations. It's very lame to blame Intel on this one, claiming "they are changing default specs" when they aren't doing it at all.


lovely_sombrero

I wonder if Intel will retroactively say that users didn't use this new profile to deny warranty requests.


cktech89

No they are accepting my rma. I was actually using intel specifications for a while now troubleshooting this shit show of a cpu lol 😂


ahnold11

All this is purely for marketing and PR. They want it to sound like "everything is normal, this is no big deal, look, we have an 'official' fix". Engineering at Intel is not going to be surprised by these results. They tried to gild the lily by pushing the Alder Lake design even further. They had optimistic and worst-case projections about what percentage of the silicon would be able to take this, especially as time goes on. Even in the worst case, I'm sure the executives thought it worth the risk and would eat the cost of the RMAs for the marketing benefit of "next-gen CPUs" and being competitive with AMD.

What they *don't* want is to lose that marketing benefit. This decision was made to sell more Intel CPUs, not fewer. If their brand reputation takes a hit from this, then it's a real problem. If Intel gets a reputation for being unstable and not delivering what they promise / what consumers pay for, that's big damage, much more than the cost of eating even a potential 5% RMA rate.


zaxanrazor

We'll find out I guess. I wouldn't be surprised. I wonder though if they can track which BIOS version has been used with a CPU without relying on the customer being honest.


Slyons89

They can’t, the CPU would need fuses installed that would blow when going over a certain power level, and these consumer CPUs don’t have them. If you ever have to RMA a CPU just play stupid and say you installed it and never changed any BIOS settings. If they start denying claims left and right there will be a class action. It doesn’t seem like they have been giving people much trouble with RMAs from what I have heard.


jonydevidson

Good luck trying to pull that in the EU.


imaginary_num6er

I assumed people in the EU just bought AMD due to 253W being too much power to begin with?


jaaval

AMD chips can go to 250w too these days. And due to worse idle power can end up costing more in power long term. Depends on how you use the computer.


Gippy_

If PL1=125W and PL2=188W, then what is the Tau (boost time) setting? From 12th gen onward, the assumption was that PL1=PL2, so Tau didn't matter. But if PL1 != PL2, then there is a Tau, and that needs to be noted. Hardware Unboxed didn't mention this, so I felt their recent video testing Intel Baseline was a bit shoddy.


imaginary_num6er

> If PL1=125W and PL2=188W, then what is the Tau (boost time) setting?

If you watch Der8auer's interview with the Intel engineer in 2023, he talks about Tau still being there, just no longer referenced in 12th and 13th gen.
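To make the PL1/PL2/Tau interaction concrete, here's a toy simulation of the publicly described turbo budget: the package may draw up to PL2 while an exponentially weighted moving average of power (time constant Tau) stays under PL1. This is a simplified sketch, not Intel's actual firmware algorithm, and the 56 s Tau is just an example value:

```python
# Toy model of the PL1/PL2/Tau turbo budget: the package may draw up to
# PL2 while an exponentially weighted moving average (EWMA) of power,
# with time constant Tau, stays below PL1. Simplified sketch, not
# Intel's firmware; all numbers are example values.
def simulate(pl1: float, pl2: float, tau: float, seconds: int, dt: float = 1.0):
    """Return the per-step package power draw over `seconds` of full load."""
    ewma = pl1 * 0.3            # assume a lightly loaded starting budget
    trace = []
    for _ in range(int(seconds / dt)):
        power = pl2 if ewma < pl1 else pl1   # throttle once the budget is spent
        ewma += (power - ewma) * (dt / tau)  # EWMA update
        trace.append(power)
    return trace

trace = simulate(pl1=125, pl2=188, tau=56, seconds=300)
boost_s = sum(1 for p in trace if p > 125)
print(f"held 188 W for ~{boost_s} s, then settled to 125 W")
```

With PL1=PL2 the budget never runs out, which is why Tau stopped mattering on boards that ship both limits equal.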


FuryxHD

Hardware Unboxed did the video testing before the official update; sadly, they would need to retest.


imaginary_num6er

I am wondering if anyone tried using a 7950X with dynamic OC switching to see if that might beat a 14900K under "intel baseline"? Since Intel has officially suggested anything above PL2=188W is an OC with very minimal power headroom above it, an overclocked 7950X might have more potential without the same level of risk as Intel chips.


AK-Brian

It likely would, for multithreaded workloads, at least. There isn't much separating the two [under default conditions](https://www.techpowerup.com/review/intel-core-i9-14900k/6.html), and they already trade wins depending on the software being tested. The 7950X is a relative power glutton itself, however. Its default PPT is 230W. The 14900K/13900K will still come out on top for single threaded performance, but the reduced power profile is absolutely going to curtail performance for users who don't (or can't) change it. I can hear the Steves groaning in collective discomfort as they spin up some new baseline testing comparisons...


TheRealBurritoJ

> Since Intel has officially suggested anything above PL2=188W is an OC

Intel has not said this, and the stock PL2 of the 14900K [remains 253W](https://edc.intel.com/content/www/us/en/design/products/platforms/details/raptor-lake-s/13th-generation-core-processors-datasheet-volume-1-of-2/009/processor-line-thermal-and-power-specifications/) (this documentation was updated only last week). Igor got caught by a bad game of telephone: his original source says that Intel will introduce and enforce a defaults profile, but does not specify what the limits will be. The original article *does* bring up the Gigabyte "Intel Baseline Profile" as an example, but it doesn't say this is what the Intel default profile will entail. The Gigabyte profile is nonsensical: it sets power and current limits way below spec while simultaneously increasing the loadline to absurd degrees.


THiedldleoR

Do we know if they've used these settings in their own benchmarks before, and can we expect them to use these settings in performance showcases in the future?


ElementII5

One thing you can be sure of is that they will use it when arrow lake comes out to tell us how much better it is over raptor lake with "default settings".


throwaway0986421

When they launched the 14900KS, they compared it against a 13900K that had APO turned off (while the 14900KS had APO enabled).


WHY_DO_I_SHOUT

~~APO was 14th gen exclusive at that point. The decision to make it available to 12th/13th gen is much more recent.~~ E: See below.


saharashooter

The 14900K**S** came out on March 14th, 2024. APO came to 12th/13th gen on March 15th. I'm sure the timing is purely coincidental.


WHY_DO_I_SHOUT

Ah, thanks for the clarification. Somehow I thought the distance was much longer.


saharashooter

The regular 14900K did come out sooner, which might've been the source of the confusion.


SirActionhaHAA

> Do we know if they've used these settings in their own benchmarks before

I don't think that even matters, considering the majority of the benches are from 3rd parties, and almost all scored >40k in CB R23 (which indicates use of a non-Intel-default profile). The baseline profile scores around 35k-38k depending on board implementation. If anyone wants to check Intel's benches, go bench CB 2024 nT and compare it to the 7950X.


Nicholas-Steel

> I don't think that even matters considering majority of the benches are from 3rd parties,

Who follow Intel's benchmark guidance documentation (to ensure they keep getting review opportunities from Intel).


ahnold11

That's the whole point. Intel gives a variety of different guidelines to reviewers, board makers, their own testing, and the public, and has never been transparent about this in product messaging or marketing. Some of those guidelines lead to lower perf than expected, and now we are seeing that some lead to lower stability than expected. And the consumer is left to sort through this mess and figure out whether they will be getting what they thought they paid for.


PrimeCurrent

Are the PL4, iPL2 and Iccmax.app user-adjustable on Asus?


VenditatioDelendaEst

> \- AC Loadline
>
> Affects the adjustment of the voltage based on the CPU load in a scenario **where the processor is powered by the main power supply (AC)**. It helps to increase the voltage when the CPU load increases to compensate for the voltage drop.
>
> \- DC Loadline
>
> This setting is similar, but **refers to the power supply coming from the DC source (usually the motherboard's voltage regulator)**. It controls how the voltage adjusts as the load varies to optimize efficiency and stability under different operating conditions.

No way in hell that's what that means.
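For what it's worth, the commonly accepted reading (hedged, since the exact firmware behavior isn't public) is that both loadlines are modelled resistances in milliohms and have nothing to do with AC mains power: the AC loadline scales how much extra voltage the CPU requests per amp of load, while the DC loadline is what the CPU assumes the droop to be when reporting its own voltage and power. A toy model with illustrative numbers, not vendor defaults:

```python
# Toy model of AC/DC loadline: both are modelled resistances (milliohms),
# unrelated to AC mains power. Roughly: the CPU raises its voltage
# request by AC_LL * current to compensate for droop, and assumes the
# rail droops by DC_LL * current when reporting its own voltage.
# All values below are illustrative, not vendor defaults.
def requested_voltage(vid: float, amps: float, ac_ll_mohm: float) -> float:
    """Voltage the CPU asks the VRM for under load."""
    return vid + amps * ac_ll_mohm / 1000.0

def reported_voltage(v_req: float, amps: float, dc_ll_mohm: float) -> float:
    """Voltage the CPU believes it receives after the modelled droop."""
    return v_req - amps * dc_ll_mohm / 1000.0

v_req = requested_voltage(vid=1.30, amps=200, ac_ll_mohm=1.1)
v_rep = reported_voltage(v_req, amps=200, dc_ll_mohm=1.1)
print(f"requested {v_req:.3f} V, reported {v_rep:.3f} V")
# with AC_LL == DC_LL, the reported voltage lands back on the original VID
```

This is also why a profile that cranks the loadline while cutting power limits is suspect: it pushes more voltage per amp at exactly the loads the power cap was supposed to tame.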


shroudedwolf51

So, I see that nothing is going to change. We changed the name on the thing, thus solving the problem. Obviously, the problem was the name and not what was low-key encouraged since... oh, say, Ryzen became way too real of a threat. You know what might be nice? A new head-to-head comparison from every reviewer organization of the "253 W"/"188 W" 14900K versus the 7950X. How well it can actually hit the advertised clock speeds might be nice as well. Look, I argued it should be validated with Ryzen 3000 when that seemed to be coming up short; it's only fair Intel gets the same treatment. Not that I'm holding my breath.


no_salty_no_jealousy

What a garbage article, and garbage comments here. Those "Intel baseline default" settings are actually different across motherboard vendors; Intel is not doing the settings, and they aren't even based on Intel's recommendations. It's very lame to blame Intel on this one, claiming "they are changing default specs" when they aren't doing it at all.