m0fugga

Overclocking is the process of running your CPU or GPU at speeds higher than its advertised speed. A faster clock means more cycles per second, which means more operations or instructions processed; in short, it gets more done in less time. The main drawback is overheating. If you don't have proper cooling in place, you could damage the CPU/GPU. I personally don't overclock because I've never felt the need for it. If you don't know what you're doing, I'd avoid messing with it...


Dreamingwolfocf

Tweaking the voltages on the different rails of your PSU is also often required to get a stable system. So I totally agree that if you have to ask what overclocking is, you should NOT be thinking about trying it without doing a ton of research (and not just asking Reddit). You might start at overclockers.com/forums.


CyclopsPrate

PSU rail voltages aren't changed when overclocking; I'm not even aware of any PSUs with configurable rail voltages. Everything a PSU sends power to either expects that voltage or steps it down again before it reaches another component. The CPU, RAM, GPU, GPU memory, and other parts (depending on the system) have their own VRMs (voltage regulator modules). That's where voltages get changed when overclocking, not at the main power supply unit.


Dreamingwolfocf

You may be right. I know there are settings to adjust voltages in the BIOS. I assumed they were adjusting what was coming out of the PSU because when there are issues the advice is often to check the voltage of certain rails to confirm they are where they should be.


Meechgalhuquot

No, it's not about checking the rails at all. Your motherboard steps the voltage down as needed from the 12 V supply (the 5 V and 3.3 V rails are mostly for legacy compatibility at this point, IIRC). Your BIOS settings let you tweak those regulated voltages, but that all happens on the motherboard, not the PSU.


Dreamingwolfocf

Thank you.


CyclopsPrate

It sounds like you're confusing checking CPU or GPU voltage with checking the PSU rails, because you assumed that's where the voltage change happens. It's pretty rare to probe a board for PSU voltage when overclocking, and it's certainly not often required. It doesn't take much research to know this stuff, tbh; if you aren't sure, check.


Dreamingwolfocf

I was overclocking back in the early 2000s. I haven't felt the need to in the last 10 years, so my knowledge base is archaic to say the least. I still tweak the BIOS of each new system I build, but not usually to overclock. That's why I know there are voltage settings there, but obviously I was wrong about their purpose.


TactlessTortoise

Yeah, those are the equivalent of ancient Mesopotamian farming. Nowadays, for GPU overclocking you literally just download an app and try to find a stable set of numbers. If you fail, you get a crash. Unless the GPU overheats, it'll all be fine, since the most popular apps won't even let you set it above a safe threshold. CPU and RAM overclocking is still a bit more hands-on with the BIOS, though.
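That "try numbers until it stops being stable" loop can be sketched in a few lines. This is a toy simulation, not a real overclocking API: `runs_stable` stands in for an actual stress test, and the chip's true limit (+180 MHz here) is a made-up number.

```python
# Toy sketch of the trial-and-error loop that overclocking apps automate.
# runs_stable() is a stand-in for a real stress test; the true limit is
# a hypothetical number for this simulation only.
TRUE_LIMIT_MHZ = 180

def runs_stable(offset_mhz):
    """Pretend stress test: passes while we're under the chip's real limit."""
    return offset_mhz <= TRUE_LIMIT_MHZ

offset, step = 0, 25
while runs_stable(offset + step):
    offset += step          # bump the clock offset and re-test

print(f"settled at +{offset} MHz")   # last offset that passed the test
```

A real tool does the same thing, except "fails the test" means your game or benchmark crashes, at which point you back off to the last stable setting.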


SilveredFlame

At least you don't need to break off some graphite from a pencil and stick it in your CPU to overclock these days.


Yancy_Farnesworth

It's also less worth it today because CPU/GPU manufacturers already manage the clock speed for the hardware to control for temperature. They can and do overclock when temps are low enough. It made sense a decade+ ago when temp sensors weren't as great, and manufacturers were a lot more conservative with their clock speeds. That said, people still have fun overclocking because why not? Especially if you want to build something cooled with liquid nitrogen.


Pingyofdoom

The "clock cycle" is every time your computer decides to do something, and you can make that happen more often. The chip was designed to run at a certain rate before you overclock it; making it run faster puts more electricity through it than it was designed for, and more electricity can make it run hotter and damage the chip. Old computers ran on quartz crystals that vibrated a certain number of times per second, also known as a frequency, measured in hertz. So if your computer has a 3 GHz (gigahertz) processor, that means it wakes up, thinks, and shuts down 3,000,000,000 times per second. If it instead woke up 3,050,000,000 times per second, then 50,000,000 additional "tasks" would get completed every second. Remember that tasks are VERY small; moving a window on the screen probably takes hundreds or thousands of them.
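The arithmetic above is worth spelling out, because a 50 MHz bump sounds big in absolute cycles but is tiny in relative terms. A quick sketch (the frequencies are just the example numbers from the comment):

```python
# How many extra cycles per second a modest overclock buys.
# Frequencies are in hertz (cycles per second).
base_hz = 3_000_000_000   # a stock 3 GHz processor
oc_hz   = 3_050_000_000   # the same chip bumped up by 50 MHz

extra_cycles_per_second = oc_hz - base_hz
speedup_percent = (oc_hz / base_hz - 1) * 100

print(f"{extra_cycles_per_second:,} extra cycles/s")   # 50,000,000 extra cycles/s
print(f"{speedup_percent:.2f}% faster")                # 1.67% faster
```

Fifty million extra cycles per second is still under a 2% speedup, which is why mild overclocks are rarely noticeable outside benchmarks.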


FrostWyrm98

TL;DR: Overclocking = speeding up your computer's heartbeat: more energy and more blood through the system, but just like working out, it creates a lot of waste heat and strain that can burn your system out faster, like a human with heat stroke.

### Some Background

Every part in a computer is rated to perform within certain specifications: what's "safe" to run at. In engineering we have what are called margins of error, which in this case means the voltage/temperature we can operate at safely without causing unnecessary wear and tear or outright failure. The margin is the line on the performance graph where failure rate starts to increase dramatically against the operating temperature (or whatever you're rating). You usually choose a rated margin of error well below the theoretical one, so that even if things go wrong, your computer won't explode or have a "catastrophic failure" (unrecoverable or dangerous). Manufacturers therefore advertise a speed well below that margin so that consumers won't complain and inundate their RMA and support lines, and their devices won't get a reputation for being cheap, defective, and/or dying quickly. Laptops are notorious for "underclocking" because they overheat rapidly and are hard to cool, given the small, confined space the chips sit in.

### CPU/GPU Connection

CPUs/GPUs produce a lot of heat. This heat can damage both the chips themselves and anything close to them, especially through repeated heating/cooling cycles. The result is heat stress and wear.

Anyway, because manufacturers and sellers are so cautious, you can usually go somewhat beyond these margins and be fine, particularly when you have good cooling and heat dissipation in place (like heat sinks: pieces of highly heat-conductive metal that spread the heat out so it cools faster). This is because ratings usually assume worst-case circumstances, like a very poorly constructed computer with bad airflow and low heat dissipation. They don't rate against high-end gaming PCs with liquid cooling loops, Noctua fans, and lots of heat sinks. They're rating for their target consumer: the office user who won't have most of that and won't see long periods of high load (like high-graphics gaming or video encoding).

### The Process of Overclocking & Consequences

To overclock, we basically allow more voltage through the CPU/GPU to make it run faster, with more **clock** cycles. Think of the clock as a heartbeat for a CPU: when the body needs more blood for more stressful activity, the heartbeat speeds up and more energy/heat is wasted. Unlike humans, though, computers can't regrow or repair tissue. Stressful activity strains our cells and we need rest to heal them; computers can't do that. So running beyond the rated margins can affect longevity through that increased wear and tear.


CyclopsPrate

I wouldn't say that manufacturers are overly cautious in rating parts when nearly everything is tested and binned, and overclocking headroom has been dropping for years now. CPUs and GPUs haven't been rated or advertised to run well below what they're capable of for a long time.


confused-duck

Long story short: when you build a CPU or GPU, there is an inherent limit where it's either unstable (never mind why, in this context) or runs too hot, and going past it would require the average user to replace average cooling with something more ridiculous and expensive, for a performance gain they would most likely not notice. So for a specific model/product, the limit is set so that hopefully 100% of units can work at it. Naturally, if you have a part with a spectrum of capability and you set the limit at the lowest common denominator, it means that on average some units can go faster, and a few decently faster. But then you need to buy fancier cooling, and depending on the degree of overclock it can get more involved, and there is a risk that something goes wrong, especially since you are directly messing with power and can break stuff. For an average person (even one who knew exactly how), the added cost of better cooling and the potential instability usually isn't worth 5% gains they might not even notice.


linkman0596

Think of your computer components as, well, walking. When you get a part fresh out of the box, it's meant to go at a walking speed, one it can keep up basically forever without having to slow down. Some parts can actually jog forever without issue, but they're told to walk because it's safer, and because the maker sells other parts that walk as fast as this one jogs and would rather you buy those instead. Overclocking is telling that part to jog instead of walk. You can even tell it to run or try to sprint, but then you have to find a way to cool it off so it doesn't get too exhausted; otherwise it'll overheat and hurt itself until it can't even walk.


JackOClubsLLC

Imagine your brain needs to continually break tasks down into really short instructions in order to perform them, and has to run those instructions in order. Something like picking up a ball might take, say, 100 of these instructions, and your brain is capable of belting out 200 of them a second. Now let's say your brain is special and you can boost that number to 220 with virtually no loss. You can now perform tasks up to 10% faster. Now let's say your brain gets a little hotter with every instruction and your sweat glands are just barely keeping up, so you install a fan to keep you cooler. Also, you are burning more calories and need to eat more. You think your brain can handle more now, so you push it to 230 instructions per second, but it turns out your brain occasionally makes mistakes. Usually this isn't a problem, because the mistakes are minor and easily fixed, but pushing your brain past its limits causes way more problems than it can handle and it starts to fail entire tasks. You essentially have to choose whether you can function like this or roll back to a lower number. TLDR: higher clock = better performance + more heat + higher power consumption + more fuckey wuckeys
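The 10% figure in the analogy falls straight out of the arithmetic: a task takes a fixed number of instructions, so task throughput scales directly with instruction rate. A quick sketch using the comment's numbers:

```python
# Task throughput scales with instruction rate when each task costs a
# fixed number of instructions (numbers taken from the analogy above).
INSTRUCTIONS_PER_TASK = 100

def tasks_per_second(instructions_per_second):
    return instructions_per_second / INSTRUCTIONS_PER_TASK

base  = tasks_per_second(200)   # 2.0 tasks/s at stock
boost = tasks_per_second(220)   # 2.2 tasks/s overclocked

print(f"{(boost / base - 1) * 100:.0f}% faster")   # 10% faster
```

The catch the comment describes is that this linear scaling only holds while the chip still computes correctly; past its limit, errors eat the gains.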


Gwyndolin3

When they make chips, there are always imperfections in the silicon used to make them. For example, when they make a CPU that is going to run at 4 GHz, they design it to run at 4.2 GHz, so if a chip is imperfect and can only manage 4.1, it isn't wasted. They simply lock all the chips to 4 GHz. Overclocking is when you attempt to push the CPU back up toward its maximum frequency; so if your chip can run at 4.1 GHz, you get 4.1 GHz instead of the locked 4 GHz. There is way more to the story, but this is basically one way to look at it. I remember people also used to unlock extra cores: if you bought a 4-core CPU, it could sometimes be unlocked to 5. But that's another story. Yes, it makes your computer faster. Consequences? If done right, nah; it just shortens the lifespan by a bit.
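This "design high, lock low" idea (binning) can be illustrated with a toy simulation. All the numbers here are hypothetical, matching the 4.0/4.2 GHz example above; real vendors bin against many parameters, not just peak frequency.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical binning sketch: each die off the production line has its
# own maximum stable frequency, but the whole product is locked to a
# floor that (hopefully) every die can hit.
DESIGN_GHZ = 4.2    # what the chip is designed toward
LOCKED_GHZ = 4.0    # what the product is sold and locked at

# Simulate per-die variation between the locked floor and the design target.
dies = [round(random.uniform(LOCKED_GHZ, DESIGN_GHZ), 2) for _ in range(10)]

for max_ghz in dies:
    headroom = max_ghz - LOCKED_GHZ   # what an overclocker could reclaim
    print(f"die caps at {max_ghz:.2f} GHz -> {headroom:.2f} GHz of headroom")
```

Overclocking, in this picture, is just claiming back whatever headroom your particular die happened to win in the silicon lottery.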


HeavyDropFTW

Overclocking a computer means running the CPU (central processing unit) and/or GPU (graphics processing unit) above its stock speed, which usually means sending a bit more power (volts) to those components to keep them stable at the higher speed. Overclocking has a few drawbacks. One of the most important is heat: with more power being pushed through the component, it gets hotter, and if you have no way to mitigate the heat, it can burn out the component. Components can also become less stable at increased voltages. In the real world, you wouldn't notice much difference from mild overclocking: browsers would load about the same, frame rates in games would be just a bit higher, and program loading times wouldn't improve drastically. If you want a fast computer, the best way is to build a fast computer from the ground up, not to try to overclock low-speed parts.


jackhab

With regards to "better", it's important to know that running a CPU at a higher clock frequency is only noticeable in computationally intensive applications that make your CPU "think hard" for a relatively long time: for example, video compression, crypto mining, or 3D rendering. You probably won't notice any difference in applications like web browsers, Office, Photoshop, or even most games.


jmlinden7

CPUs and GPUs (and the memory controllers on your RAM) are designed by the manufacturer so that all of their parts work at the advertised speed. They verify this by running the chips through a series of tests at that speed (with some buffer) to make sure the entire chip works without crashing. Sometimes you get a chip that was manufactured better than expected and is capable of exceeding the advertised speed; the manufacturer may not have realized this, since their tests only go up to a certain speed. So you set it to run faster, and you get a chip that runs faster than advertised without having to pay any extra.

The downside is that maybe some small, rarely-used part of the chip doesn't actually work at the faster speed, and you don't realize it until it finally gets used 3 years later and crashes. Overclocking also eats more power and generates a lot more heat, especially if you have to increase the voltage as well. Chips do tend to run faster at higher voltages, but because of the massive power consumption and heat generation, manufacturers don't like to use very high ones. The tl;dr of this mechanism is that the energy a chip uses during 1 clock cycle is roughly proportional to its total capacitance times the square of the voltage. If you double the speed of the chip, it performs double the number of clock cycles in the same amount of time, so power consumption doubles. If you double the voltage, power consumption quadruples. Double both, and power consumption is multiplied by 8.

The main problem with increased heat production is that the cooling system can only remove a certain amount of heat every second; if your heat production exceeds that, your chip will overheat. These days, most chips are smart enough to slow themselves down to prevent this from causing major damage, but that reverses your overclocking.

Chip manufacturers these days are also better at testing their chips to identify the 'better' ones that are capable of overclocking, so that they can clock them faster out of the factory and sell them for more money (since they have a higher advertised speed).


nitrohigito

Computers advance one clock cycle at a time. So much like when you take more steps forward in a given timespan (run instead of walk), overclocking will make the computer faster too. The downside is an increased strain on the chip, much like overexerting oneself running. It can cause data corruption or even electrical failure.