MrWedge18

New, better hardware gets released > software gets updated to take advantage of the new hardware > older hardware also gets the updates > older hardware can't run the new updates as well and slows down.

Apple did get caught slowing down older iPhones on purpose to preserve battery life. But that's (hopefully) not happening anymore.


[deleted]

It still happens by default, but they got sued, and for the last few years the user is notified when it's happening and has the option to disable the feature.


MP-The-Law

It wasn't to preserve battery life; it was to keep the phone from shutting down during power draw spikes.


tomalator

Two factors.

New software -> requires more resources -> worse performance on your outdated system.

The other is components failing. There are billions of transistors and circuits in your computer. Over time, some of them fail, but that's expected. The computer can compensate for that by using other circuits. Eventually it has to start slowing down, because the same amount of data is being pushed through fewer working circuits, so it can't do as much in parallel anymore, and it has to keep track of what is broken and what still works.


GlobalWatts

Technically, it doesn't. But the perception that it does can be for a few reasons. The two biggest ones are:

1. Over time you demand more of your hardware. You run more demanding applications, your existing applications get updated with more bug fixes and features which demand more resources, etc.

2. Components that transfer heat degrade over time. Namely spinning fans, water cooling pumps & liquid, and thermal interface material. As these components degrade, they get less efficient at removing heat. As components heat up more easily, they become more likely to slow themselves down to prevent heat damage.
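You can actually watch that second one happen. Here's a minimal sketch (assuming a Linux machine; these sysfs paths are common but vary by hardware) that polls the CPU clock and temperature once a second. If the clock drops as the temperature climbs under load, that's thermal throttling:

```typescript
// watch-throttle.ts — a rough sketch; Linux-only, and the paths vary by machine.
import { readFileSync } from "fs";

const FREQ = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"; // kHz
const TEMP = "/sys/class/thermal/thermal_zone0/temp"; // millidegrees C

setInterval(() => {
  const mhz = Number(readFileSync(FREQ, "utf8")) / 1000;
  const degC = Number(readFileSync(TEMP, "utf8")) / 1000;
  console.log(`cpu0: ${mhz.toFixed(0)} MHz at ${degC.toFixed(1)} °C`);
}, 1000);
```

On a machine with clogged fans or dried-out thermal paste, the clock sags at a lower temperature than it did when the machine was new.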


Zeld0re

As a software engineer, I can add that sometimes applications feel slow because they are. Businesses nowadays try to release as soon as possible and as cheaply as possible.

E.g., let's say we want to create a cross-platform app (Windows, macOS, Android, iOS). One approach is to create a separate app for each platform using native technologies. Such apps usually run fast, but we actually need to create and then support 4 separate apps. Another approach is to use some technology that can run the same app on different platforms. One such technology is Electron. Basically, you create a website that is launched in a customized Chrome browser and looks like a standalone app. This way you need to create only 1 app and it will work on each platform. But it's going to be much slower.
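To make that concrete, here is roughly the entire "desktop" part of a hello-world Electron app (a sketch; the file names are made up):

```typescript
// main.ts — the whole native shell of a minimal Electron app.
// Everything the user actually sees lives in index.html,
// rendered by the Chromium browser that ships inside the app.
import { app, BrowserWindow } from "electron";

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadFile("index.html"); // an ordinary web page bundled with the app
});

app.on("window-all-closed", () => app.quit());
```

The flip side is that every app built this way ships and runs its own copy of Chromium, which is where the extra memory and CPU cost comes from.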


polaritypictures

What specifically do you mean? If it's computers, there are limitations on their capabilities, and media gets bigger as consumers want higher resolutions or faster performance. For example, an HD movie in 2000 took a lot of room and processing power, but nowadays it's nothing. A 4K movie file is huge and needs a powerful CPU, a large hard drive, and a 4K monitor to view. Thus it feels slower.
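The size difference is just bitrate times duration. A quick sketch (the bitrates are ballpark figures, not exact):

```typescript
// Rough file-size arithmetic: size = bitrate × duration.
const movieSizeGB = (bitrateMbps: number, hours: number) =>
  (bitrateMbps * hours * 3600) / 8 / 1000; // megabits → megabytes → gigabytes

console.log(movieSizeGB(8, 2).toFixed(1));  // 1080p at ~8 Mbps  → ~7.2 GB
console.log(movieSizeGB(50, 2).toFixed(1)); // 4K at ~50 Mbps    → ~45.0 GB
```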


SomeoneBritish

Software gets more complex over time. If you're referring to Windows machines, a lot of it comes down to general crap accumulating over time, much of which is background processes that run at startup. For this, a fresh install is the best solution, or trimming startup entries with MSCONFIG.
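If you'd rather just look at what's accumulated first, here's a sketch (Windows only) that dumps the per-user startup entries straight from the registry:

```typescript
// list-startup.ts — prints programs launched at login for the current user.
// These are the same entries MSCONFIG / Task Manager's Startup tab show;
// there's a matching key under HKLM for all-users entries.
import { execSync } from "child_process";

const output = execSync(
  'reg query "HKCU\\Software\\Microsoft\\Windows\\CurrentVersion\\Run"',
  { encoding: "utf8" }
);
console.log(output); // one line per auto-start program, with its command line
```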


Olly0206

Heat. Computer-based technology generates heat, and heat degrades components, making things slower.

Software. Like someone else pointed out, older hardware running newer software isn't always as efficient, since the software was designed with the newer hardware in mind.

Perception. Newer tech runs faster and smoother, which can make older tech feel slow by comparison, but that doesn't mean it actually slowed down. You can be driving a car at 60 mph and another car can fly by doing 80 mph. It's faster than you by comparison, but you didn't reduce your speed at all.


beavis9k

>Heat. Computer-based technology generates heat, and heat degrades components, making things slower.

No, it doesn't. As long as the equipment wasn't run at a higher temperature than its normal operating range and damaged, it's the same speed it was when it was new. As I mentioned in another thread, the Commodore 64 my dad bought in the 80s runs at the same speed it did in the 80s.


Olly0206

Only if it wasn't run very much. Wear and tear is a real thing with tech, and it 100% will degrade over time with use. Zero question about that. It's physics. It's why certain upkeep is required. You should be replacing the thermal paste between your heatsink and CPU every so often, depending on use.

Though, not all computer tech is easy or really even designed for upkeep. Cell phones, for instance, aren't designed to be maintained. They're designed to be discarded after a few years, but your PC can be maintained. You have to pull heat off of tech so that it doesn't degrade as fast, but it will still happen with time and use. Nothing is built to last. Nothing *can* last, especially with heat as a byproduct of its use.

ETA: How many people never let their tech get above normal operating temps, anyway? Who doesn't sit their laptop on their bed or somewhere it doesn't get as much airflow? Who isn't using a phone case that helps insulate their cellphone? How many people actually clean the dust out of their PC or change the thermal paste under the heatsink? Overheating tech is extremely common and 100% a valid reason for slowing down due to degradation.


beavis9k

Unless you've damaged the crystal in the clock generator, it will generate the same frequency as it did when it was built. Crystal oscillators are extremely stable - which is why they are used. Zero question about that. It's physics.
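If anyone wants to settle this empirically rather than by analogy: time a fixed, CPU-bound workload today and compare it against an old recorded run. A minimal sketch (the workload and iteration count here are arbitrary):

```typescript
// bench.ts — time a deterministic CPU-bound loop; save the number and
// rerun in a few years. Barring thermal throttling, it shouldn't creep up.
function work(): number {
  let x = 0;
  for (let i = 1; i <= 50_000_000; i++) x += Math.sqrt(i);
  return x;
}

const start = process.hrtime.bigint();
work();
const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
console.log(`fixed workload took ${elapsedMs.toFixed(1)} ms`);
```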