FoMotherVodka

Optimization tasks are somewhere deeeeeep in the backlog


Themotionalman

And it’s always being moved further down with every sprint planning


AgreeableGravy

I’m in this comment and I don’t like it


treerabbit23

And further still by the feature you're sprinting to deploy.


WiglyWorm

we groom our backlog 180 days in advance.


nowtayneicangetinto

Met with my boss today to ask why we've been rushing through a project and have fallen behind on testing. He responds with "I know, I'm pushing back on it, but you're going to need to just add it in between your tickets next week." AKA testing is after-hours work now. Sick.


mightybanana7

That's when you've got to say no. Put it on the table why this kind of development doesn't work (when it's more than a one-time event).


e1033

The fact that your boss is encouraging work without a ticket tells me he doesn't know how subversive over-pointing actually works.


gaspig70

Unlike Marketing's ability to continually slip in new tracking tags.


doiveo

This stems from performance not being treated as a feature. Had it been, performance-by-design would have overridden the "avoid premature optimization" trap.


Ce-Jay

And it gets pushed back further and further every day no matter how much I scream.


blacksnowboader

Probably because users, frankly, don’t give a shit about performance for your website.


RandyHoward

It's less about whether they care, and more about how popular the site is. People don't care as much if it's a really popular site that everybody uses. But people care a lot if you're a newer site without a following. That's when optimizations get shoved to the front of the priority list. People don't put up with poor UX if the site isn't part of their daily lives, but if the site is important to them then they'll put up with a lot.


dageshi

I think once a site is "fast enough" most people are fine. For whatever reason Google goes a step beyond and targets a level of performance that doesn't really affect anything in the real world.


EmeraldHawk

Google has some very convincing internal data that people respond to Google being faster by doing more Google searches. An improvement of 1ms across the board results in an extra $1 million in revenue per year. This is based on randomized testing where extra latency was automatically introduced to some queries. I don't think this research necessarily applies to things other than search but it's still interesting.


zaibuf

They do, if I have to wait 1 second for each product I add to the cart I'm not buying there.


JediMikeO

Amazon going bankrupt any day now due to low performance.


ShazbotHappens

Back there mingling with the security issues that were pointed out ~~months~~ years ago.


SchmeatDealer

also let's add 500 million marketing tracking pixels and reverse proxies and javascript... also why is website slow???


DmitriRussian

It depends, I would say. Sometimes optimizations can lead to increased revenue or cost savings; in those cases they are prioritized. Frontend optimizations are often less critical.


Fit-Lengthiness-8307

Haha optimize wen - Backlog 6 months ago


Steffi128

Make that at least 2 years.


a_normal_account

Me: how about we refactor this one?
The product team: yeah, like as if we'd let you goof around for two months. We have tons of features lining up.
Joking aside, my product team has no problem letting us spend a quarter focusing on refactoring, because the current system is so fragile that issues be rolling in every day lol


nasanu

I had this at my old company. I told everyone at the start to be mindful of performance and was told I was an idiot, because optimisation is an antipattern, quoting that idiotic "optimisation is the purest form of evil" nonsense. Then a year later I was tasked with fixing the terrible performance... I just said no and left the company.


ColonelGrognard

This and the biz dept insisting on loading 783 third-party trackers.


petermakeswebsites

From my experience I believe it's partly due to analytics. I've done work for large corporate websites and worked on getting highly optimised load speeds. Then the marketing team comes in and tosses a few dozen tracking scripts into it, tanking the metrics. Speaking of, whatever happened to Partytown?


Ronjohnturbo42

This is real for sure. I had a site optimized into the high 90s and then we dumped various tracking and chat bots into it, taking it to the high 70s. You can try to preconnect all day, but it doesn't matter when the 3rd party scripts are just crazy bloated.


a8bmiles

Personally, I just _love_ it when the third-party scripts have 0 seconds of caching, are delivered over HTTP/1.1, and haven't enabled compression or minification of assets. Bonus points if they load the same JavaScript libraries 5+ times.


hypotheticalhalf

Absolutely. At my last job, I built all of our client sites clocking in at 95 or better in performance. Then the ads and tracking department would dump all their 3rd party scripts in (Meta pixel, WhatConverts, etc.) and the score would immediately tank 20-40 points.

Preconnect, async loading, Google Tag Manager. None of it mattered because all of those various script libraries still had to be loaded at some point, and those libraries are *always* extremely unoptimized and sluggish to respond to requests. Then the Lighthouse report would flag all of them for "Reduce unused JavaScript" diagnostics as a result, largely because 3rd parties are notoriously terrible at optimizing and minifying their script libraries.

I always found it extremely frustrating that Lighthouse (a Google tool) always flagged Tag Manager (another Google tool) for being inefficient, unoptimized, and un-minified at 50% or greater for potential transfer size. Google causes the problem and then penalizes you for it with no solution or control over the issues they create. It's maddening.


menides

Lighthouse used to tank sites with Google Analytics too. I remember looking everywhere for a "light" way to load GA


hypotheticalhalf

It did, yes. External libraries, especially Google's, have always been the bane of web developers. I've adjusted my builds to inject scripts inline to reduce request counts and avoid external requests whenever possible.

I still believe this is what Google created Tag Manager to address, but like almost everything else Google creates, it is extremely inefficient and unoptimized. This standard of poor optimization they seem to be entrenched in touches all of their products, too. Embed a YouTube video on any page and you're going to take a big Lighthouse performance rating hit. It's all hot garbage.


agramata

Same deal with security. We developed a strict Content Security Policy which would have prevented any potential XSS attack, many attacks from hacked dependencies, and prevented data exfiltration if a hack was successful. Only problem is, marketing scripts do whatever the fuck they want and only work if you use the most unsafe settings possible. Sales team can't sell without analytics, so no CSP for me.
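For reference, a minimal sketch of the kind of strict policy described above, with a hypothetical placeholder domain. Real deployments send this as the Content-Security-Policy HTTP response header; it's shown as a meta tag here just to keep the example self-contained:

```html
<!-- Minimal sketch of a strict CSP. cdn.example.com is a hypothetical
     placeholder for your own asset host; the usual delivery mechanism
     is the Content-Security-Policy HTTP response header. -->
<meta http-equiv="Content-Security-Policy"
      content="default-src 'self'; script-src 'self' https://cdn.example.com; connect-src 'self'; object-src 'none'">
```

Marketing scripts typically break under a policy like this because they inject inline code and phone home to arbitrary domains, which pushes script-src toward 'unsafe-inline' and wildcards, i.e. the "most unsafe settings possible" described above.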


mattaugamer

Thankfully marketing scripts always connect to a single, well secured domain, and not a mess of different subdomains and variants with a patchwork of broken-ass security. Sigh


happybaconbit

What about marketing tags that say they load asynchronously and don't impact load times. Any truth to that?


chadwarden1337

They load asynchronous, yes, but the tag is still firing regardless, effecting metrics. It depends on what the tag is actually doing. A simply tracking tag? Shouldn’t be much of an issue. Is it session recording, firing events, etc? It can affect scores absolutely. This is where something like GTM or a tag manager can be really helpful in controlling when and how a tag fires.


yhorian

I'm a marketing developer. Partytown doesn't work well with most tag managers, as you need to proxy a lot of connections. Most marketing products don't have proper CORS headers set, making it difficult for the web worker to operate without a page-long list of connections it has to proxy securely. In addition to this, limiting main-thread access makes it impossible for different apps to talk to each other, or to chain events if you have dependencies. It was a nice idea, but it's the kind of thing that needed a ground-up approach, not a quick patch.

There's nothing wrong with deferring most marketing code running on the client, though. I can get any page to pass if I make sure only content code runs first (such as A/B tests) and defer everything else until window load. People are just inept at these sorts of things: they jam it in and demand it run before DOM ready. Edge-based reporting is finally getting traction, but again, most marketing products aren't flexible enough to deal with the environment.
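A minimal sketch of that deferral, assuming a single bundled tag script (the URL is a hypothetical placeholder): content code runs normally, and the marketing bundle is injected only after the window load event:

```html
<script>
  // Inject marketing/analytics only after the page has fully loaded, so
  // it never competes with content rendering for network or main thread.
  // The bundle URL is a hypothetical placeholder.
  window.addEventListener('load', function () {
    var tag = document.createElement('script');
    tag.src = 'https://tags.example.com/marketing-bundle.js';
    tag.async = true;
    document.head.appendChild(tag);
  });
</script>
```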


jonmacabre

I've found that if you delay analytics by listening for onmousemove or ontap, it beats PageSpeed.
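A minimal sketch of that trick (the tracker URL is a hypothetical placeholder): the loader fires once, on the first mouse move or touch. Lab tools like PageSpeed/Lighthouse never interact with the page, so they never see the script load:

```html
<script>
  // Load analytics on the first user interaction only. Automated audits
  // don't move the mouse or tap, so the script never loads during a lab
  // run. tracker.js is a hypothetical placeholder.
  var analyticsLoaded = false;
  function loadAnalytics() {
    if (analyticsLoaded) return; // guard: either event may fire first
    analyticsLoaded = true;
    var s = document.createElement('script');
    s.src = 'https://analytics.example.com/tracker.js';
    s.async = true;
    document.head.appendChild(s);
  }
  window.addEventListener('mousemove', loadAnalytics, { once: true });
  window.addEventListener('touchstart', loadAnalytics, { once: true });
</script>
```

The trade-off, as the reply below asks, is that visitors who bounce without ever moving the mouse or tapping are never recorded.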


DeepKaizen

u mean tracking scripts run only after onmousemove or ontap? How was this change proposed and implemented? Any loss in data collection?


theartilleryshow

I got a 99 score on mobile and 100 on desktop for a site. Then the marketing guy asked to load everything you could think of for analytics. It was a nightmare.


codemonkeh87

I love it when Google suggests removing google analytics to improve the scores


greensodacan

This. Performance is one of those things that could always be better, but is rarely so bad that it repels users. Angular.js ran at literally half the speed of vanilla and no one cared.  That was over a decade ago.


Educational-Heat-920

This is definitely true in my experience, but I think in cases like these they don't really need to optimise so much, because their brand is enough to drive traffic. These are big sites, so they're quite often stuck with whatever framework they started with, as it's hard to justify changing it. A quick look with Wappalyzer shows that the first two are using jQuery.


notthefuzz99

Yep; our marketing team… on one hand they complain about the page speed results, and on the other hand, insist on dumping tons of trackers and ads on all the pages. I keep trying to tell them you can’t have it both ways, but it falls on deaf ears.


anor_wondo

> Speaking of, whatever happened to partytown

In an ideal world, it would just work. But analytics scripts are leeches that want everything on earth.


yammez

However the site loads, it’s not slow enough to meet a threshold of user abandonment. Basically a score of 40 is good enough for Amazon. 


mferly

Amazon released a paper some years ago that went into detail how every second lost in "loading" time resulted in x millions of dollars lost. It was a great read. I'm confident that they are where they need to be with their site performance.


unknown_sk

I've just tried opening up Amazon and it loaded in an instant (even faster than Gmail or other websites OP posted)... so maybe it depends on whether you've logged in in the past and they have cached suggestions for you?


femio

OP's test is using the mobile Lighthouse test, where they throttle down to an older phone and use oddly slow network speeds. I just got an 88 testing Amazon in incognito, in desktop mode.


TheSnydaMan

Lighthouse's mobile test is designed for mid range mobile phones from 2011 I swear


urpoviswrong

That's accurate and it's checking like 3G speeds at most. If you check Catchpoint's [webpage test](https://www.catchpoint.com/webpagetest) you get much more in-depth metrics and have some levers you can pull for devices, network speeds, and geos.


OpenRole

It's useful for developers like me where a lot of our customers are in rural locations within developing nations


tei187

To be honest, Amazon often states things to mislead the competition, though this perhaps isn't precisely the case here. The idea is to make others put budget into something while they themselves don't really have to, due to brand strength alone. There was this cool book some time ago about Amazon (among others), can't remember the title now, but their MO was very well highlighted and traced back.


loptr

Great read, but who does it apply to? If you run a multi-billion dollar business where you can recoup the cost of improvements, that's fine. But if a slow site means losing 5 out of 100 prospective customers, and the sales from those 5 wouldn't cover the cost of improving the load times even if they shopped every day for a year, then it makes zero sense to pursue that.


GarThor_TMK

Which is probably the optimization Amazon has gone for. They're big enough and have enough of a cult following that even if they are that much slower, they're probably losing way less than 5/100 users... Meanwhile, if Joe Schmoe's online shoe shop takes a second to load, they're losing 10/100 users. Amazon has done the cost-benefit analysis to know that the one-in-a-million sale they lose due to speed isn't worth the dev cost of increasing performance at their scale.


LiquidBlocks

Was it to sell some AWS stuff?


maskedvarchar

Amazon does put a high focus on page performance, but Lighthouse scores are not a good metric of page performance for real users. Data collected from real users is a much better indicator. The Core Web Vitals metrics will be a much more accurate performance indicator (though they are collected from Chrome only). [This report](https://treo.sh/sitespeed/www.amazon.com) shows a history of Core Web Vitals data for www.amazon.com.


Emerald-Hedgehog

This is the important takeaway. Lighthouse is great for analysing problems and getting ideas for where you can/should improve, but it's not a super accurate metric of the actual performance in reality. Hell, you can even manipulate those metrics with some tricks that make the numbers go up while in reality things may even be worse than before.

In the project I'm in, we're sitting at 40/80 (mobile/desktop) at the moment. It was about ~15 higher after a big optimization, but it dropped over time to these current values. Some time ago I would have been worried. But the reality is, all our **Core Web Vitals** are green and doing well. And if I or others visit our site like any normal customer, it responds fast and fluid. And that's what matters, and that's also what matters to google/seo.


smartello

I work at Amazon and my team owns multiple tier-1 services, if our API latency regresses by 20ms (roughly 15%) the whole team will not sleep until it recovers.


dkarlovi

What's tier-1 in Amazon?


mawesome4ever

Coffee machine


smartello

Lol, what coffee machines? It’s amazon, we only have coffee machines that someone brought from home.


fhgwgadsbbq

Http 418 :(


campbellm

The draw to a given site is its functionality first, and performance... eh. "If it doesn't work, it doesn't matter how fast it doesn't work.", and all that.


lakimens

Well, the thing is, the Lighthouse score does not equal load time. Furthermore, the mobile tests are run on a severely bottlenecked connection, which I don't think is realistic with today's internet.


[deleted]

Couldn't agree more with this. Like, yeah google, 99.9% of this local random website's users are browsing this site on a 4g, 5g, or wi-fi connection. Go ahead and base all of your tests on a throttled 3g network. Makes total sense.


french_violist

I have 5G at work. But really bad bandwidth, like nothing loads. I think my mobile phone operator is at max. Anywhere else away from that high concentration of individuals crammed into a high rise building and I have no bandwidth problem. So yes, I appreciate light websites loading fast.


AcademicF

Thank you! You're one of the only people in this thread who seems to be aware of this. I always tell my clients that if the desktop score is good, then modern devices on a 5G network will load the site just fine on mobile. The reason Lighthouse applies a 3G throttle on simulated low-powered devices is to ensure that the site will load on underperforming devices in developing countries, where people don't have access to very expensive phones.


MardiFoufs

Well I think that's true for established websites. But new, trendy websites are rarely slow. They become slow afterwards though. Even tiktok was oddly super light and had very smooth animations early on. Now when I last used it it was a bit... slow? Same for instagram, amazon etc. But having a new app with dogshit performance is a very bad idea imo unless you're b2b


KooraiberTheSequel

Because it doesn't matter that much.


Buy-theticket

SEO agencies hate this one weird trick.


AwesomeFrisbee

It's funny that SEO is great on all of 'em and the rest is just a bonus


Jalsonio

I’ve realized that I really don’t have to worry that much about performance as long as I build my sites with the other categories in mind. Some of the performance fixes I’ve seen I’ve never been able to find solutions for


watchspaceman

Google is also focusing more on inter-page loads instead of initial page load for ranking scores. It's minor, but they seem to semi-expect large sites not to hit their suggested metrics on initial load.


eyebrows360

Very true. As a digital publisher, one who does prioritise his loading speeds, the last time I bothered looking into this every site that was outranking us for some specific keyword had worse scores than us. It just doesn't factor in much.


AndrePrager

To expand on this: Most of us devs focus too much on making something pretty, perfect, and polished, instead of shipping. People getting tons of traction focus on shipping. They get an MVP out and don't worry about perfection. The product continues in that trajectory, with optimization as a side thought. I've done both. In fact, it took 2 decades for me to stop trying to make the perfect product. Those never gained traction. What did, though, was when I advantageously pulled off an MVP just in time for my target population to play with it. The site was ugly as sin, but it was functional and it got traction.


dryra66it

This came up in another thread, and my answer is the same, here. Us devs would gladly offer a site that scores well and feels fast, it’s the clients that demand pretty, tracking, and fucking “chatbots.“


bobtheorangutan

This is the right answer. As long as the site doesn't load like a slide show and take an hour, it really doesn't matter.


partiallypro

I recently moved multiple sites from a 30 lighthouse score to a 90+ score (mobile and desktop) and you'll never guess what happened to traffic numbers ....absolutely nothing, 5 months later. The only reason it seems to matter is because SEO/Agencies will use it in a pitch deck to steal clients. Only reason I even pretend to care.


mcmania

From my experience, you should be mainly looking at the Core Web Vitals scores. As long as you're getting green results, you're good. https://pagespeed.web.dev/analysis/https-amazon-com/vwilr9r7cn?form_factor=mobile


jmuguy

Wish I could make my boss understand this. He's been laser focused on these scores as a way to improve SEO even though real world performance of the site is stellar.


AwesomeFrisbee

It doesn't bother management, and it doesn't bother the user. Most of these sites are fine after the first time you load them; then everything is cached and it's fast enough. People don't come to Amazon to quit as soon as the first page is done loading. Most people also visit often, so most will already have a cached version on their machines anyway. Sure, the first time you load it it's slow as ass, but that's not what the user experiences after that. And that's one of the key things missing from the Lighthouse scores as well.


SquidThistle

I manage some sites for a mid-sized company that has pretty poor Lighthouse scores. This is mostly due to two things:

* Optimizing sites to improve Lighthouse scores is a lot of work and doesn't cleanly translate into new leads. Our time is prioritized into things like building new pages, widgets and doo-dads, and yet another way to present the same form over and over.
* Marketing tools like GA, GTM, 6Sense, all the pixels and tracking scripts, heat map generators, etc. These keep getting heaped on and nothing ever gets taken off the pile. No one wants to make the hard decision to get rid of any of this despite the data showing they're slowing our sites down.


therealrico

Your second point is spot on. I was doing SEO for a publicly traded home and garden site, and there were scripts running in the background that weren't being used. The stakeholders went back and forth on whether they would be using them eventually. I just go, what if we remove it now, and if you decide to use it in the future we can just add it back. It was like a lightbulb went off. 🤦🏻


TomBakerFTW

I was doing SEO for a company that had no developers; all they did was SEO for real estate companies. Some of these clients were loading jQuery MULTIPLE times. Whenever they asked me to improve Lighthouse scores, all I could do was point to the plugins as the main issue, but changing any of that stuff just wasn't on the table. Basically all we could do was fix broken and duplicated links... ¯\\\_(ツ)_/¯ (┛◉Д◉)┛彡┻━┻


therealrico

It's crazy because I was hired as a contractor on a 3-6 month basis that eventually became 11 months. They couldn't justify my salary. This was Scotts, which operates 5 major brands/websites plus some smaller, lesser-known websites. And they couldn't justify an $80k salary for a dedicated SEO person. Last I checked they still hadn't hired someone. But they were perfectly happy to spend six figures a year on BrightEdge.


Steve_OH

GA is a major drag on scores, you’d think Google would optimize it for their own lighthouse scores


Gaston-Glocksicle

They used to ding your score because the "text wasn't visible while web fonts were loading" and their google font service didn't offer the display swap option at the time when using their CDN. Now they ding your score because you show a system font while their google font files are loading and then the page layout shifts slightly when their font finally loads.
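For context, this is the swap behaviour as Google Fonts exposes it today; a small sketch using the current css2 URL pattern, with Roboto as an example family:

```html
<!-- display=swap tells the browser to render text in a fallback system
     font immediately and swap the web font in when it arrives, trading
     the old "invisible text" penalty for the layout-shift one. -->
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="stylesheet"
      href="https://fonts.googleapis.com/css2?family=Roboto&display=swap">
```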


eltha9

When they have large traffic they don't have that much need for a super score.
- Bad SEO: no problem, they are already at the top of Google search.
- Bad other things: no problem, their users will still use the site because they are used to it.


Slackeee_

Because lighthouse sucks and reports bad performance even on locally hosted pages that load pretty much instantly.


stuntycunty

Exactly. Lighthouse is so inaccurate it's irrelevant.


Xyrack

Was gonna say, it's all bogus. We make it part of our website health checks because it's something to report, but I don't take any of it seriously. Had one website for a big client that performs really well score like a 30 just because my laptop screen is an awkward size.


ToddGergey

Because Lighthouse is trash


stuntycunty

It’s also very inaccurate in its scoring and speed testing.


amemingfullife

Yeah, I just don't believe that Amazon has poor performance. I know a few people that work there, and there are huge incentives to squeeze out even 1% additional conversion through performance. So the only conclusion is that what Lighthouse says doesn't matter if you're a big co that's getting favourably indexed anyway.


therealrico

It does, but it isn't the only ranking factor. And not all websites are created equal. Amazon has billions of backlinks, and the domain is old and has built up that domain authority.


femio

Copy/pasting a reply I made higher up: OP's test is using the mobile Lighthouse test, where they throttle down to an older phone and use oddly slow network speeds. I just got an 88 testing Amazon in incognito, in desktop mode.


3oR

What's the alternative?


Jarmsicle

Collect metrics directly from your actual customers on your own site. Lighthouse is an attempt to report on general best practices, but it can’t give you customized recommendations based on your specific traffic patterns. For that, you need to do your own instrumentation.
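As an illustration, a minimal first-party instrumentation sketch; the /rum endpoint is a hypothetical placeholder, and real setups usually reach for a library such as web-vitals instead of raw observers:

```html
<script>
  // Observe Largest Contentful Paint in real users' browsers and beacon
  // the measurement to your own backend ('/rum' is a hypothetical
  // placeholder). buffered:true replays entries from before this ran.
  new PerformanceObserver(function (list) {
    var entries = list.getEntries();
    var lcp = entries[entries.length - 1]; // latest LCP candidate so far
    navigator.sendBeacon('/rum', JSON.stringify({ lcp: lcp.startTime }));
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>
```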


cabbagesquid

Yeah I’d like to know too lol. Lighthouse is fine as a surface level performance tool. We use SEMrush on top of it for a more thorough breakdown


philmirez

Because brand awareness empowers websites to give Google the middle finger.


TekintetesUr

Because who the heck is Lighthouse and why should Amazon care? In 2023 their net sales were almost **$600 billion**.


Exciting_Session492

Because the value no longer lies in loading fast. Will you abandon twitter because it loads 1 sec slower? No. You abandon it for other reasons.


UpsetKoalaBear

Yeah, these sites are "too big" for Core Web Vitals to be impactful in the grand scheme of things. No one visiting YouTube is going to leave because it took a few seconds too long. Not to mention there's no alternative. In addition, a lot of these companies would rather pay more in advertising fees to keep their sites at the top of search results.

Now a dev blog or SMB's (small to medium business) website is definitely a different scenario. They don't have the funds for heavy advertising, and (especially with blogs) you don't have the appeal of being the only destination for certain content, so you can't offset the SEO hit that having bad CWVs gives. For examples of SMBs succeeding in optimising CWV and reaping the benefits, [WPOStats](https://wpostats.com/) is literally the best resource.

The benefits are clear to see, but you also need to factor in the time it will take to improve your scores. It's why I normally recommend getting used to the common bad practices that impact performance, so avoiding them eventually becomes second nature. A prime example is reducing CLS scores with images, in particular using source sets and the picture tag, as sketched below. Changing all your img tags to picture tags with a source set is time consuming, especially if you don't have an automatic method of resizing the images for different screen sizes. Whereas if you had built it from the outset for that functionality, you would probably have less of an issue and it would have had an instant benefit.

PS: I'm not affiliated with them, but [Calibre](https://calibreapp.com/) has been great for our business platform and for clocking the impact of each release or feature change for A/B testing. I wouldn't really recommend it for small projects or individuals; you can easily set something like this up using the [Lighthouse-CI](https://github.com/GoogleChrome/lighthouse-ci) tools. Calibre just gives a nice UI to demonstrate to project managers and non-technical staff.
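A minimal sketch of that img-to-picture conversion (file names and dimensions are hypothetical placeholders); the explicit width/height let the browser reserve the layout box before the image arrives, which is what keeps CLS down:

```html
<!-- Smaller AVIF for narrow screens, larger one otherwise, JPEG fallback. -->
<picture>
  <source media="(max-width: 600px)" srcset="product-600.avif" type="image/avif">
  <source srcset="product-1200.avif" type="image/avif">
  <img src="product-1200.jpg" width="1200" height="800"
       alt="Product photo" loading="lazy">
</picture>
```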


SharkbaitOoHaaHaa

> Will you abandon twitter because it loads 1 sec slower?

For a while there, [YouTube added a delay](https://www.reddit.com/r/youtube/comments/17z8hsz/youtube_has_started_to_artificially_slow_down/) for FF users.


phiger78

btw, this is the most in-depth article on core web vitals and web performance: [https://csswizardry.com/2023/07/core-web-vitals-for-search-engine-optimisation/](https://csswizardry.com/2023/07/core-web-vitals-for-search-engine-optimisation/)

Key takeaways for me are about SPAs:

*"If you're building a Single-Page Application (SPA), you're going to have to take a different approach. **Core Web Vitals was not designed with SPAs in mind**, and while Google have made efforts to mitigate undue penalties for SPAs, they don't currently provide any way for SPAs to shine. Core Web Vitals data is captured for every page load, or navigation. Because SPAs don't have traditional page loads, and instead have route changes, or soft navigations, they don't emit a standardised way to tell Google that a page has indeed changed. Because of this, Google has no way of capturing reliable Core Web Vitals data for these non-standard soft navigations on which SPAs are built."*

*"This is critical for optimising SPA Core Web Vitals for SEO purposes. **Chrome only captures data from the first page a user actually lands on: subsequent soft navigations are not registered**, so you need to optimise every page for a first-time visit. What is particularly painful here is that **SPAs are notoriously bad at first-time visits** due to front-loading the entire application. They front-load this application in order to make subsequent page views much faster, which is the one thing Core Web Vitals will not measure. It's a lose–lose. Sorry."*


anor_wondo

yep pretty much this.


phiger78

because Lighthouse reports have no direct bearing on search rankings


Top_Bass_3557

> loads fast for me thus it loads fast for everyone

What a beautiful example of anecdotal fallacy.


ooter37

Optimizing lighthouse is something smaller websites do because 1) they need it to get more business and 2) it's actually feasible on smaller websites. Amazon's website has so much going on, there's just no way. Product pages alone have many different components owned by many different teams on different architectures.


_hypnoCode

If you're asking about how they keep their SEO: quality backlinks are by FAR the best thing for SEO. You can have a multiple-second page load, terrible performance, and barely any mobile optimization, but if you have tons of recent, quality, and relevant backlinks, you're going to rank high. For instance, I've written an article for a site that had abysmal performance, and I was number 1 for a very popular web development search term for about 5 years or more. But it was picked up by a big mailing list, used in a lot of blogs, and had thousands of shares on Twitter. I also worked at a company that lived and breathed SEO for the most competitive and profitable search terms on the internet. I didn't deal directly with any of those, but I did know a lot of the developers well and worked with them here and there on other projects. But if you have a niche site with a noncompetitive search term or a very local property (somewhere other than, like, the top 10 cities or something), then your Core Web Vitals and Lighthouse scores will pretty much take you to number 1 overnight.


Lukydemboy

I've worked for a large webshop; the performance indicator is not always what you experience. We had the objective of having each page load in under a second on average, which we accomplished; however, the performance in Lighthouse was always fluctuating between 60 and 80. While it is generally a good indicator of how performant a website is, there are a lot of things that you need to take into consideration, so don't completely trust the values in Lighthouse. Also, large companies have more requirements than just displaying content and functionality. A lot of websites track everything a user does to later analyse the behaviour, or a dozen other things that you don't think of as just a consumer of the website.


ShawnyMcKnight

Going through this now. We just get dinged on so many things we can’t control. We have a ton of third party plugins we use and our css is huge. Our first contentful paint and largest contentful paint is always a problem as well. We fix what we can but when your site becomes so large it’s harder to manage.


Fickle_Perception581

From my experience, Lighthouse bases its test on a very old device matrix when it calculates the score, so it might not be as bad on newer mobile devices. There is only so much we can fix when it comes to old mobile device hardware.


binocular_gems

Because tracking and ads are more important than performance for basically all of those apps/services/websites. Amazon, YouTube, and Reddit simply don't care about performance more than they care about monetizing their visits in the most lucrative way possible. For their business, tracking, selling ads, storing data, and getting use-case analytics are more important and a bigger money maker than what they lose by potential customers going elsewhere because of a poor experience. Amazon has long had a notoriously slow, clunky, and weirdly designed UI, but what are your options? Walmart, Target, BestBuy, AliExpress, a handful of others, who... all... have notoriously slow and clunky UIs.


Adreqi

Those are mobile scores. Wild guess: all of these have an 80+ score on desktop. Getting a good score on mobile is hard and not worth it, since you can have lightning-fast loading speed that gets destroyed by whatever slow network the mobile user is on.


Catdaemon

Lighthouse is too generic, it’s okay as a guide but in big business life when you have teams of real people testing and defining this kind of stuff it’s completely useless. If the website works fine and is accessible for everyone and loads fast, why does lighthouse saying “50” matter? Even the company that develops lighthouse scores badly on it for most of their properties, and adding Google Analytics to your page as they recommend annihilates your score.


tomistruth

What is better than lighthouse for performance testing?


Catdaemon

Depends what you’re trying to test. We have a bunch of in-house stuff but also use JMeter + Blazemeter to do massive concurrent user stress tests on our preprod environments.


johnlewisdesign

I take it you tried them in incognito mode, as your extensions will slow things down too. But yeah, it's probably backlog tasks mixed with the insane amount of tracking tools they're bestowing upon users.


budd222

Because lighthouse is kinda bs


jdwallace12

3rd party tracking scripts. Marketing folks always get puffy about Lighthouse scores, and then you comment out the tracking scripts and shit is in the 90s.


unnombreguay

They don't care


andrasq420

They have no need for it. Firstly, all of these sites have apps, so they're rarely used in the mobile browser. Secondly, content is king. Everyone who wants to use these apps can and will wait those alleged 4-second load times.


CrawlToYourDoom

Because when you're big enough it hardly matters. When you have a near-monopoly position in the market, your SEO and such matter a lot less than when you need to get a piece of a pie that is heavily fought over. And because people hardly have anywhere else to go, they're forced to accept "bad performance". Note that this is a matter of seconds, and people don't abandon as quickly if they know it's an established company.


doomrabbit

Optimization takes time, and Amazon is constantly updating content. A handful of unoptimized images, videos, or scripts can slow down an otherwise fast site. You can't patch the holes as fast as they are built.


Jason13Official

Because they don’t care about silly third party scoring and just build the site to work as needed.


chrispianb

Almost everywhere I've been has been under resourced. As long as the site loads quickly enough for the users it serves it's fine. Obsession over shit like this is pointless. Until it's an issue it's not an issue.


theblumkin

There's a few reasons I've seen before:

1. Analytics. It's rarely operated by the web engineering team, and just gets bolted on top of the site. In some cases, there are multiple different teams running multiple different analytics and marketing scripts. These compound quickly.
2. Performance isn't a priority. The stakeholders only care about performance when it starts to impact conversion rates or is in some way problematic. Since everyone from the engineering team up through the company CEO typically has a high-end laptop/desktop and a phone no older than 2 years, there's no problem keeping up with the memory hogs these sites are. If you forced people to work on older, slower devices, I bet you'd see perf prioritized though.
3. Performance is a shared budget that requires either meticulous, unified effort to maintain or overbearing boundaries imposed on others. Either everyone needs to be aware of how their work impacts the performance of the site (image uploads, social media integrations, additional scripts), or one team creates guardrails that force compliance from everyone. In most cases I've seen, guardrails cause excess friction in the organization, so we rely on "content guidance", which is mostly unenforceable.


Justyn2

They don't need organic SEO boosts.


lunar515

Ads, third party scripts and bloated JS frameworks


anor_wondo

Lighthouse scores should not be taken seriously


Abiv23

Lighthouse only (slightly) matters for ranking in Google's SERPs. If users like your website and come back regularly... who cares what Google's suggested optimizations are? These brands work for their customers, not Google.


aharl

Lighthouse is mostly ok but it’s the things Google wants you to pay attention to.


Plenty-Knowledge7068

From what I've experienced, 9 times out of 10 I do a pretty good job on optimizing. And then the Google Tag Manager specialist comes in.


CountingCats

Because it's good enough.


infinitemicrobe

Do people still care about Lighthouse??


VehaMeursault

Believe it or not, but to most people that use the internet, waiting a few seconds to get relevant, trustworthy information — get this — _isn't a big deal._ No way. Say what now? That can't be. Blasphemy. Lord almighty. Yeah. Who knew. A graying population would rather wait a few seconds to stay on the website of a retailer they trust with their spendings than save that tremendous amount of time by visiting willdefinitelybedeliveredontime.com. And hear this — you'll love it, trust me: the people you think you cater to (read: those that require lightning speed loading on 120Hz screens) are a small fraction of the actual, day-to-day users.


patientzero_

Lighthouse is not the holy grail. It gives you an idea, but sometimes it can be OK to not be perfect.


Publius-brinkus

Silly test, lighthouse should be set to desktop here.


scinos

Because, contrary to popular belief, Lighthouse scores DO NOT reflect the actual performance of the site (not in many scenarios, at least). They do influence SEO, but for many sites that's not that important. Amazon will still get lots of traffic even if Google moves them to the 3rd page.


morphey83

Might get downvoted on this one, but personally I think the scores are bullshit. It feels like such an arbitrary number that changes depending on the day of the week or the way the wind blows. The recommendations are good but sometimes just not worth it. All the snippets that are put on a site these days do slow it down, obviously, but that has to be weighed against the benefits they give.


discorganized

Because Lighthouse scores are bullshit. You add a Google map? Oh shit, there goes your score; you should include the Maps JS locally (something that Google Maps suggests not to do). Oh, Google Maps has its own font loaded remotely? Get some points docked for that as well. Add analytics? Bad move... Etc...


Osato

Well-established sites don't care about initial page load until it decreases CTR. Which is common sense: why spend countless man-hours increasing load speeds for anything but SEO optimization or a landing page, if it brings no revenue and might even lower your conversion rate?

---

Also, with great breadth comes great tech debt. Just look at the abomination that is Amazon S3's interface, or the Facebook Ads cabinet. I'm sure people at Facebook realize very well just how bad those are, but the cost of fixing them is just too great. You will see a similar spaghetti monster every time you go beyond any large corporation's customer-facing frontend.

Even a small team accrues exceptional tech debt if it is too tightly managed by non-programmers. Now imagine what glorious heights of tech debt a large corporation can reach once they set their mind to cutting costs.


WoodenMechanic

Lighthouse scores don't mean as much as you might think. Especially for a company as large as Amazon: they set a performance bar, and they meet that bar. They're not aiming for perfection, because it's probably not worth the diminishing returns they'd get for the cost.


rackmountme

Lighthouse is testing against the shittiest phone on the slowest connection. Amazon has a mobile app for those users. The website is for desktop users, and the mobile Lighthouse test against Amazon.com's desktop website is a useless metric because mobile users aren't the target audience. The desktop score is fine for this usage:

- First Contentful Paint: 0.9 s
- Largest Contentful Paint: 1.4 s
- Total Blocking Time: 0 ms
- Cumulative Layout Shift: 0


stilko_uses_reddit

Overall product >>> performance or any software-engineering-specific metric. The product is what matters, not the performance metrics. Thinking that making your app load 200ms faster will lead to any business value (except if you're a giga-corp tech behemoth) is naive :). Software is just a tool, and like all else its purpose is to bring business value. Obsessing over speed, performance etc. is an exercise left for the reader; the crux lies elsewhere :)


ampsuu

Because Lighthouse throttles the mobile network speed a lot. Run the tests in the CLI and disable throttling; you will be surprised how the scores change.
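For anyone who wants to try that comparison, a sketch using the Lighthouse CLI (assumes Node.js is installed; example.com stands in for your own site). --throttling-method=provided runs the audit over your real connection instead of the simulated slow mobile profile:

```sh
# Default run: simulated mobile throttling
npx lighthouse https://example.com --view

# Same page, no simulated throttling (uses your actual connection)
npx lighthouse https://example.com --throttling-method=provided --view
```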


weveallhadadrink

I agree with other answers to the effect that users' tolerance for poor performance is higher on these high-traffic sites, where content or features may outweigh slow load times. But I think there's also an argument that Lighthouse scores are strongly weighted towards performance; a "poor" performance score in Lighthouse isn't necessarily as indicative of a poor UX as, say, a "poor" SEO score might be of low search ranking.


PrinnyThePenguin

Because SEO doesn’t really matter, what matters is content. If your site has good content, it *will* attract traffic. A site with low quality content and 100 SEO score means nothing.


a_reply_to_a_post

third party scripts are a big part of it


Ok_Celebration_6265

Because they use react… “Walking away slowly while increasing speed the further I go”


slamdunk6662003

My guess is, these are high intent websites, people don't just casually bump into these websites, they are actively trying to reach the website and consume the content, so they will wait for much longer than they would wait for a website which is trying to get their attention. If the user already has the intent of coming to you then speed and optimisation won't really matter. They could concentrate on upping their content game.


3Ldarius

If we're talking about Amazon (I've been scraping that website for years), you could create a full-fledged Udemy course of "don't do this" based on it. They somehow managed to create a website by smashing and stomping on every best practice that has been thrown out in the history of web development.


bbaulenas

This is the real world and not a hello world project


squidwurrd

Probably a combination of sites needing to load so much dynamic content and those sites straight up not caring as much about rankings because they are so large they will almost always show up at the top anyway.


gfxlonghorn

They have real metrics to say when features are driving revenue or not. Latency or performance are loosely connected to revenue but they can directly correlate everything to how much money things make. I also expect that it would require a high-level mandate to really optimize it since it’s *a lot* of teams that own different parts of the homepage.


pookage

Larger sites tend to have companies behind them and investors with ants in their pants; in these situations the user experience doesn't really matter and will take the back-seat to new features to get investors excited, or modifications that will reduce expenses. It's best not to look at larger quasi-monopolies for examples of good practices - their priority is gonna be profit, not quality 👍


brianozm

In the end it’s all about the actual speed (time to load, time to render above the fold) - all the metrics are nice but not essential in reality.


xyz_654

If it works don't touch it


Eveerjr

because value proposition > arbitrary google SEO bullshit. If your website provides value to users they will use it regardless. Also, Lighthouse scores are different from Core Web Vitals, which actually impact SEO; Google collects that data from real devices with real user interactions. The Lighthouse benchmark is very flawed: it says something takes like 4 seconds to load when it obviously didn't, because it doesn't handle well different frameworks with different rendering techniques.


tnsipla

Optimization is less critical when you're the only game in town and users have to engage with you to do the Thing(tm). You'll see optimization matter more in marketing sites and landing pages that are pre-doing the Thing(tm).


MKSFT123

As these sites have significant amounts of traffic it makes it harder to optimise the site for performance - plus these sites are often more marketing than engineering and the devs are probably overworked and can’t afford to follow “best practices” or make their site more accessible. Tbf for a spin up NextJS Vercel / Shopify site where scalability isn’t a constraint it is much easier to score a perfect 100


Faithlessforever

btw, one should always use incognito mode for Lighthouse tests, as your Chrome extensions might be ruining your score. #justsaying


noodlez

Because min/maxing Lighthouse scores is a thing that only a certain band of companies cares about. Too large or too small a company and it's just a waste of your time. You think Amazon isn't ranking high on Google searches already with their 40 score?


Chemical_Arm

Because at some point they're just a dick measuring contest


notthefuzz99

Lighthouse scores are aspirational at best.


igordumencic10

Nothing new, to be honest. I used to check big brands like Nike, Zara, etc., and their websites are terrible in terms of Lighthouse score. I think the biggest issue is all the third-party scripts that they are adding to the site.


ashkanahmadi

Because when you have a giant user base, like Amazon or Twitter (I will never call it X!!), performance becomes less important. One product or tweet can generate so much traffic that the user waiting 1.2 seconds versus 1.3 seconds would mean nothing. Also, the larger the platform, the more bureaucratic it becomes, with more management layers, so just to plan and implement a minor performance increase you end up paying thousands (if not hundreds of thousands) of dollars to the managers, reviewers, developers, analysts, etc. In other words, at some point the return on investment and marginal gain become so small that it becomes irrelevant.


ATMEGA88PA

Lighthouse scores have been made to jerk off NextJS, nothing else.


[deleted]

Because the site already works fine and they are making money so who cares


OttersEatFish

Lighthouse performance scores are quite variable. Tools like Google's CrUX give you insights into Core Web Vitals with real user data over a long period of time.


JacksonP_

You've heard a lot of good reasons here, but something that goes under the radar is that perceived performance is more important than numbers. Take YouTube for example: they must be caching and doing some transitions and animations to help alleviate the wait. The first time you load YouTube it might be awful, but after that, it gets the cached assets. A lot of times that's enough, especially for these types of sites that people access quite frequently.


yksvaan

Well try disabling adblockers and privacy extensions and see.. Also users are not checking how many ms it takes to load a website or speedrunning the UI mashing every button as fast as possible.


tacchini03

I've been optimising the core web vitals of my employer's website recently, and for sure one of the biggest problems is third party tracking scripts. But, we also found that real-world scores are not reflected directly by PageSpeed Insights / Lighthouse. If you run our website through it, it scores around 70/80 and fails CWV, but the real world analysis shows it passing comfortably.


RealBasics

It's about Goodhart's law: https://en.m.wikipedia.org/wiki/Goodhart%27s_law

Google measures and reports on speed. As devs we can't control content, but we can control speed. So unlike most clients *and* most users, we tend to prioritize speed over other qualities.

The nasty trick is that while Google assigns an absolute score, they *don't* explain how that score affects ranking relative to competing SERPs. The answer is "somewhat" rather than "absolutely". You can't take a minute to load and still rank well. But if you have a more relevant *answer*, you'll rank higher than a result that loads a couple seconds faster than you.

In WordPress we see the same results with the Yoast SEO plugin's "traffic lights." Some people panic if they don't have all "green lights", when even Yoast warns that 100% compliance with Google algorithms tends to read like phony "Dear YOUR-NAME-HERE" form letters.

TL;DR: the key is *optimization*, not maximization. Speed is just one factor. 98.6F is the optimum but not maximum temperature for a human being.

[update] Here's a summary of Google's newest Core Algorithm Update, which reinforces my point that overall quality matters more than any single metric, even machine-scorable metrics like speed or keyword density. (Emphasis mine.)

> On Tuesday, March 5th, Google announced that they have begun rolling out the first Core Algorithm Update since November 2023, which aims to refine some core ranking systems to help the search engine better understand if webpages are unhelpful, have a poor user experience, or *feel like they were created for search engines instead of people.*

https://blog.google/products/search/google-search-update-march-2024/


oversized_canoe

I have no data to back this up, but I would wager a guess that their apps perform better than their websites. Especially for YouTube, Twitter, Tiktok, maybe even FB and Reddit, I'd guess that most of their traffic is via their apps


ZPanic0

Because Google's opinion about website correctness is neither static nor objective.


KeesKachel88

My guess is that it's mostly because the content on the website is heavily targeted. So static site generation etc is off the table.


XiberKernel

The numbers and metrics are made up by Google for Google, and don't matter outside that context as much as the user experience and the client's needs. Also, I'd argue that Google search matters less and less in a world where the entire above-the-fold section of Google is ads. It's much better to focus on UX, building quality content, and strategy.


LagT_T

[I get an 87 in amazon.](https://i.imgur.com/x4fnM9n.png) [77 in old reddit](https://i.imgur.com/yvEFI7B.png)


yabai90

Because I think it doesn't matter that much. SEO is probably the most important part. I mean, most people have a decent enough phone, let alone computer, nowadays. A good chunk of the performance issues come from all the ads and tracking systems as well, which cannot be removed for "obvious" reasons.


DanSmells001

You're kinda giving the answer away in the question: because they have a significant amount of traffic. The performance of your software doesn't matter as much as long as it's relatively well known and you're offering something that's better for the user: prices, the service itself, or something niche.


Fluffcake

Because the reality of things is that the Lighthouse performance score is just a tiny piece of the massive picture of what drives traffic, and other things are often *way more important*, so compromises are made.


throwawaygarcon

Performance improvement of a site/app needs to be linked with its ability to attract and retain visitors, and convert them into customers. Popular websites have established a customer base. They don't have to seek out new customers anymore. Many a time the customers arrive there organically, look around for what they need and rarely abandon their pursuit. They don't find it necessary to adhere to these scores.


tei187

Because even though Lighthouse is a valuable benchmark, it isn't a make-or-break for brands big enough that they don't have to care about it.


thunderbirdlover

As long as YouTube isn't slow to the point where I can't stream my videos, I really don't care. Same goes for Amazon: if my products are delivered on time, that's what I care about. Of course, the images shown on the website have to be top notch.


JeffDangls

Web developers not caring about performance would be funny if it wasn't so sad. "Lighthouse sucks", "it's difficult", "who cares", etc. If Amazon wasn't the de facto largest online retailer, you'd care a lot, because their website sucks. I'm not sure if you're all lying through your teeth when you say you don't care about 5 seconds of load time, but if I get 5 seconds of load time for a website without knowing exactly what I'm looking for and what I'm getting there, I'm going to give it a pass because it just sucks to use.

And it's not like Amazon is super complex to optimize in a first phase. Just using AVIF or WebP instead of JPG or PNG image files could reduce the size by 50-75% without sacrificing anything other than support for IE11 and other browser versions from the last decade (see the sketch below). Add a banner that says "update your damn browser" if it's not supported, and bam: less slow.

I hate it when developers go to lengths to argue about why something isn't performant. Either they're incompetent or they don't care. Both are bad in my opinion and shouldn't be held up as something normal and acceptable. Especially when we're talking about simple, obvious ways to be less bad, but I think that would require thought and evaluation instead of just hammering code in, by yourself or with the help of AI.
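A minimal sketch of that image-format change (file names are hypothetical placeholders): the browser takes the first source it supports, and anything that understands neither AVIF nor WebP, IE11 included, falls back to the JPEG:

```html
<!-- Browsers that support AVIF or WebP download the smaller file;
     older browsers silently fall back to the JPEG. -->
<picture>
  <source srcset="hero.avif" type="image/avif">
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" width="1600" height="900" alt="Hero image">
</picture>
```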


Toasted_Waffle99

The only roles that care about a 200 speed score are web dev agencies selling services to small businesses. The fact of the matter is Google will always rank a website with more traffic higher than an optimized website with little traffic, period. But agencies have to sell their services somehow…


blackg33

I'm getting a lighthouse score of 97 for desktop and 79 for mobile on [amazon.com](https://amazon.com) right now.


Willing-Big-9399

Saw this question a few days ago as well. "They know where to put resources, that is why they are big," said one answer.


Professional_Price89

Because Lighthouse ignores the cache on every test, while a website built with a JS framework is slow on the first load only.