With the sudden switch to Lighthouse, the once-revered optimization platform has become unrecognizable and unusable.
Today I went to complete an ongoing optimization project for a client of mine, only to discover that the entire GTMetrix platform has been converted to use Lighthouse rather than PageSpeed and YSlow. While I am aware of the “advantages” that Google feels Lighthouse offers (they think all of their software is the best), Lighthouse has never provided useful optimization reports for developers like myself, who are generally tasked with updating a site until it scores well in a trusted rating system. For years, that trusted rating system has been PageSpeed and YSlow through GTMetrix.
I have been an ardent supporter of GTMetrix for years, and have consistently found the reports offered through PageSpeed and YSlow to be vastly superior to the gross generalizations made by Google’s optimization platform. Google’s PageSpeed testing platform, for example, pitches the need to switch to their proprietary image format (WebP), presenting this as a major issue with website performance when we all know what it really is – another effort by Google to use their market power to make more of the internet reliant on their tech. Many browsers don’t even support that image format! After over a decade of working as a developer, I can comfortably say that Google uses their software to pitch their own services at every juncture, even when those services won’t actually serve their customers’ best interests.
Now let’s talk results.
Many sites that objectively perform well, and that have had every available optimization applied short of stripping them down to nothing, score terribly on the new GTMetrix rating system. This makes it nearly impossible for developers to explain to their clients that their site isn’t actually performing at 30% – that there are a few small things that could be changed, but that it is largely third-party scripts added to the site (and completely out of our hands as developers) that make it appear as though the site is performing about as poorly as possible. Irrational scores like this mean I constantly have to explain that a single image that can be compressed to save a whopping 1kb of data shouldn’t drag a site’s image optimization score down to 60%, but that’s exactly what happens. What’s worse is that Google can then use these asinine scores to determine the page ranking of your website. Say that again – if you don’t use Google’s proprietary image format, they may be pushing your page down in search results.
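To put that 1kb in perspective, here is a quick back-of-the-envelope calculation. The 2 MB total page weight below is an assumed, typical figure for illustration, not taken from any specific audit:

```python
# How much does a 1 KB image saving actually matter?
# Assumed values for illustration; the 2 MB page weight is hypothetical.
potential_saving_kb = 1          # compressible image data flagged by the scanner
total_page_weight_kb = 2048      # assumed total page weight (2 MB)

saving_pct = potential_saving_kb / total_page_weight_kb * 100
print(f"Potential saving: {saving_pct:.3f}% of total page weight")
# Roughly 0.05% of the page - yet the image score drops to 60%.
```

A fraction of a tenth of a percent of the page weight, presented as a failing grade.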
In GTMetrix’s own words, Lighthouse is designed to rate how a page actually loads for the user, rather than how it is built. But that couldn’t be further from the truth. Take a look at this real-world example of two sites I manage:
Site A – scoring 39
Site B – scoring 95
I am using a relatively slow internet connection right now (~12 Mbps) at my country home, and the user experience in loading these two pages is roughly the same. Neither appears particularly slow or fast. But the reality is that Site B has a large 33.1 MB HTML5 video loading in the masthead area that I know for a fact was not optimized properly. It is the slower-loading site, and its performance issues are obvious, measurable, and solvable. Yet according to the new GTMetrix Lighthouse rating system – a system that is supposed to be more accurate – Site B is getting a solid “A” while Site A is getting a very low “E,” which is a special rating for sites that are worse than an “F.” It would be one thing if the sites were in the same ballpark; that at least would be reasonable. But they aren’t even on the same planet in terms of their scores, so what is the value of GTMetrix to developers now that it doesn’t provide any objective measure of a site’s performance? In their own blog post, they explain that PageSpeed only showed how well a site was built, not how well it actually performs. But it did both, and their own screenshot shows that it did both in very clear terms: three percentage-based ratings for structure and three objective measures for load time, size, and requests. Just look at the vast disparity between these scores, many of which come from sites using the same optimization techniques, all of which are getting good structure scores:
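For context, the raw transfer cost of Site B’s masthead video alone is easy to estimate. This is a rough sketch that ignores latency, TCP ramp-up, and everything else loading on the page, so the real cost is even higher:

```python
# Rough estimate: time to transfer Site B's 33.1 MB masthead video
# over a ~12 Mbps connection. Ignores latency, TCP slow start, and
# all other assets on the page, so this is a lower bound.
video_size_mb = 33.1        # megabytes, from the page audit
connection_mbps = 12        # megabits per second (my rural connection)

transfer_seconds = video_size_mb * 8 / connection_mbps  # bytes -> bits
print(f"~{transfer_seconds:.0f} s just to pull the video down")
# Roughly 22 seconds for one asset - and this is the site scoring 95.
```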
With the move to Lighthouse, it is highly unlikely that I will continue to use GTMetrix for anything, and I will assuredly end up cancelling my account. But that’s not the real issue. This change affects the profitability of all active optimization projects for every one of GTMetrix’s loyal customers, and may force them to rework large parts of their business and edit their marketing materials. Companies that focus solely on optimization now have to purchase credits to run more than three scans, rendering the test-and-contact business model dead in the water. And if you deliver monthly optimization reports, your clients are likely seeing suddenly horrible scores without any explanation as to why they have plummeted.
The bottom line
Developers need to be able to show that they did their job well in today’s development world. But if GTMetrix’s service can’t recognize the difference between a site with a 33 MB video loading at the top of the page and a fairly efficient website – or worse, if it thinks the efficient website is trash (what else would you call a score of 39?) and the unoptimized website is nigh perfection – how can you possibly rely on their service to represent your work?
I would rather not have to write such a scathing review, but the truth is that I am far from the only developer who will be negatively affected by this change. My hope is that this article serves as a wake-up call for GTMetrix and, potentially, Google, and that our voices as developers are heard.
We are the ones who are ultimately tasked with optimizing websites. As paid subscribers, why didn’t we receive notifications in our dashboard area that this change was coming? Why weren’t we given the option of using the legacy software while we transition to another service? (Edit: this option was added and announced in late November following backlash from customers, but like most major updates, customers should have been given the option to switch to Lighthouse, not the other way around.) Both would have been thoughtful moves for their customers, but with this change it is clear that we are not a priority. Just look at the comments they received on their Facebook page announcement – do these look like happy customers?