Why Optimizing Your Lighthouse Score Is Not Enough For A Fast Website
This article is sponsored by DebugBear.

We've all had that moment. You're optimizing the performance of some website, scrutinizing every millisecond it takes for the current page to load. You've fired up Google Lighthouse from Chrome's DevTools because everyone and their uncle uses it to evaluate performance.

After running your 151st report and completing all of the recommended improvements, you experience nirvana: a perfect 100% performance score!

Time to pat yourself on the back for a job well done. Maybe you can use this to get that pay raise you've been wanting! Except, don't. At least, not with Google Lighthouse as your sole proof. I know a perfect score produces all kinds of good feelings. That's what we're aiming for, after all!

Google Lighthouse is merely one tool in a complete performance toolkit. What it's not is a complete picture of how your website performs in the real world. Sure, we can glean plenty of insights about a site's performance and even spot issues that ought to be addressed to speed things up. But again, it's an incomplete picture.

What Google Lighthouse Is Great At

I hear other developers boasting about perfect Lighthouse scores and see the screenshots published all over socials. Hey, I just did that myself in the introduction of this article!

Lighthouse might be the most widely used web performance reporting tool. I'd wager its ubiquity is due to convenience more than the quality of its reports.

Open DevTools, click the Lighthouse tab, and generate the report! There are even many ways we can configure Lighthouse to measure performance in simulated situations, such as slow internet connection speeds, or to create separate reports for mobile and desktop. It's a very powerful tool for something that comes baked into a free browser. It's also baked right into Google's PageSpeed Insights tool!

And it's fast. Run a report in Lighthouse, and you'll get something back in about 10-15 seconds. Try running reports with other tools, and you'll find yourself refilling your coffee, hitting the bathroom, and maybe checking your email (in varying order) while waiting for the results. There's a good reason for that, but all I want to call out is that Google Lighthouse is lightning fast as far as performance reporting goes.

To recap, Lighthouse is great at many things:

- It's convenient to access,
- It provides a good deal of configuration for different levels of troubleshooting,
- And it spits out reports in record time.

And what about that bright and lovely animated green score? Who doesn't love that?!

OK, that's the rosy side of Lighthouse reports. It's only fair to highlight its limitations as well. This isn't to dissuade you or anyone else from using Lighthouse, but more of a heads-up that your score may not perfectly reflect reality, or even match the scores you'd get in other tools, including Google's own PageSpeed Insights.

It Doesn't Match Real Users

Not all data is created equal in capital-W Web Performance. It's important to know this because data represents assumptions that reporting tools make when evaluating performance metrics.

The data Lighthouse relies on for its reporting is called simulated data. You might already have a solid guess at what that means: it's synthetic data. Now, before kicking simulated data in the knees for not being real data, know that it's the reason Lighthouse is super fast.

You know how there's a setting to throttle the internet connection speed? That simulates different conditions that either slow down or speed up the connection speed, something that you configure directly in Lighthouse.
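If you prefer scripting your audits over clicking through DevTools, the same throttling setting is exposed by the Lighthouse Node module. Here's a minimal sketch, assuming the `lighthouse` and `chrome-launcher` npm packages are installed, the script runs as an ES module, and `https://example.com/` stands in for your own URL:

```js
// Minimal Lighthouse audit from Node, with the throttling method
// made explicit. Assumes `npm install lighthouse chrome-launcher`
// and an ESM context (e.g., "type": "module" in package.json).
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

const result = await lighthouse('https://example.com/', {
  port: chrome.port,
  onlyCategories: ['performance'],
  // 'simulate' (the default) estimates load under throttled conditions;
  // 'devtools' throttles for real and takes much longer to run.
  throttlingMethod: 'simulate',
});

console.log('Performance score:', result.lhr.categories.performance.score * 100);

await chrome.kill();
```

That `throttlingMethod` option is exactly the tradeoff this section is about: `simulate` is the fast default that estimates, while `devtools` actually slows the run down to measure.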
By default, Lighthouse collects data on a fast connection, but we can configure it to something slower to gain insights into slow page loads. But beware! Lighthouse then estimates how quickly the page would have loaded on a different connection.

DebugBear founder Matt Zeunert outlines how data runs in a simulated throttling environment, explaining how Lighthouse uses optimistic and pessimistic averages for making conclusions:

"[Simulated throttling] reduces variability between tests. But if there's a single slow render-blocking request that shares an origin with several fast responses, then Lighthouse will underestimate page load time. Lighthouse averages optimistic and pessimistic estimates when it's unsure exactly which nodes block rendering. In practice, metrics may be closer to either one of these, depending on which dependency graph is more correct."

And again, the environment is a configuration, not reality. It's unlikely that your throttled conditions match the connection speeds of an average real user on the website, as they may have a faster network connection or run on a slower CPU. What Lighthouse provides is more like on-demand testing that's immediately available.

That makes simulated data great for running tests quickly and under certain artificially sweetened conditions. However, it sacrifices accuracy by making assumptions about the connection speeds of site visitors and averages things in a way that divorces it from reality.

While simulated throttling is the default in Lighthouse, it also supports more realistic throttling methods. Running those tests will take more time but give you more accurate data. The easiest way to run Lighthouse with more realistic settings is using an online tool like the DebugBear website speed test or WebPageTest.

It Doesn't Impact Core Web Vitals Scores

These Core Web Vitals everyone talks about are Google's standard metrics for measuring performance. They go beyond simple "your page loaded in X seconds" reports by looking at a slew of more pertinent details that are diagnostic of how the page loads: resources that might be blocking other resources, slow user interactions, and how much the page shifts around from loading resources and content. Zeunert has another great post here on Smashing Magazine that discusses each metric in detail.

The main point here is that the simulated data Lighthouse produces may (and often does) differ from performance metrics from other tools. I spent a good deal of time explaining this in another article. The gist of it is that Lighthouse scores do not impact Core Web Vitals data. The reason for that is that Core Web Vitals relies on data about real users pulled from the monthly-updated Chrome User Experience (CrUX) report. While CrUX data may be limited by how recently the data was pulled, it is a more accurate reflection of user behaviors and browsing conditions than the simulated data in Lighthouse.
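As an aside, that CrUX field data is publicly queryable, so you can see for yourself how it differs from a Lighthouse run. Here's a minimal sketch against the Chrome UX Report API; the endpoint and metric names are the API's published ones, while the API key and origin are placeholders you'd swap in:

```js
// Query the Chrome UX Report (CrUX) API for an origin's field data.
// YOUR_API_KEY and the origin below are placeholders.
const response = await fetch(
  'https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=YOUR_API_KEY',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      origin: 'https://example.com',
      metrics: [
        'largest_contentful_paint',
        'interaction_to_next_paint',
        'cumulative_layout_shift',
      ],
    }),
  },
);

const { record } = await response.json();

// The 75th percentile is the value Google uses when judging a site
// against the Core Web Vitals thresholds.
for (const [name, data] of Object.entries(record.metrics)) {
  console.log(name, 'p75:', data.percentiles.p75);
}
```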
The ultimate point I'm getting at is that Lighthouse is simply ineffective at measuring Core Web Vitals performance metrics. Here's how I explain it in my bespoke article:

"[Synthetic] data is fundamentally limited by the fact that it only looks at a single experience in a pre-defined environment. This environment often doesn't even match the average real user on the website, who may have a faster network connection or a slower CPU."

I emphasized the important part. In real life, users are likely to have more than one experience on a particular page. It's not as though you navigate to a site, let it load, sit there, and then close the page; you're more likely to do something on that page. And for a Core Web Vitals metric that looks for slow paint in response to user input, namely Interaction to Next Paint (INP), there's no way for Lighthouse to measure that at all!

It's the same deal for a metric like Cumulative Layout Shift (CLS) that measures the visible stability of a page layout, because layout shifts often happen lower on the page after a user has scrolled down. If Lighthouse relied on CrUX data (which it doesn't), then it would be able to make assumptions based on real users who interact with the page and can experience CLS. Instead, Lighthouse waits patiently for the full page load and never interacts with parts of the page, thus having no way of knowing anything about CLS.

But It's Still a Good Start

That's what I want you to walk away with at the end of the day. A Lighthouse report is incredibly good at producing reports quickly, thanks to the simulated data it uses. In that sense, I'd say that Lighthouse is a handy gut check and maybe even a first step to identifying opportunities to optimize performance.

But a complete picture, it's not. For that, what we'd want is a tool that leans on real user data. Tools that integrate CrUX data are pretty good there. But again, that data is collected over a trailing window (28 days, to be exact), so it may not reflect the most recent user behaviors and interactions, although it is updated daily on a rolling basis, and it is indeed possible to query historical records for larger sample sizes.

Even better is using a tool that monitors users in real time. Data pulled directly from the site of origin is truly the gold standard data we want because it comes from the source of truth. That makes tools that integrate with your site the best way to gain insights and diagnose issues because they tell you exactly how your visitors are experiencing your site.

I've written about using the Performance API in JavaScript to evaluate custom and Core Web Vitals metrics, so it's possible to roll that on your own (there's a minimal sketch at the end of this article). But there are plenty of existing services out there that do this for you, complete with visualizations, historical records, and true real-user monitoring (often abbreviated as RUM). What services? Well, DebugBear is a great place to start. I cited Matt Zeunert earlier, and DebugBear is his product.

So, if what you want is a complete picture of your site's performance, go ahead and start with Lighthouse. But don't stop there, because you're only seeing part of the picture. You'll want to augment your findings and diagnose performance with real-user monitoring for the most complete, accurate picture.
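And here's that promised sketch of the roll-your-own approach. It's a simplified illustration using the standard PerformanceObserver interface: the `/rum` endpoint is a placeholder you'd collect beacons on, and the interaction tracking is only a rough stand-in for the full INP definition (which libraries like Google's web-vitals package implement properly):

```js
// Runs in the page itself, so it reflects what real visitors experience.

// Accumulate CLS from layout-shift entries, ignoring shifts that
// follow recent user input (per the metric's definition).
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
}).observe({ type: 'layout-shift', buffered: true });

// Track the slowest interaction as a rough stand-in for INP.
// (The real INP calculation is more nuanced; see the web-vitals library.)
let slowestInteraction = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    slowestInteraction = Math.max(slowestInteraction, entry.duration);
  }
}).observe({ type: 'event', durationThreshold: 40, buffered: true });

// Send whatever we have when the page is hidden; '/rum' is a
// placeholder endpoint for collecting these beacons.
addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden') {
    navigator.sendBeacon('/rum', JSON.stringify({ cls, slowestInteraction }));
  }
});
```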