Website Optimization

RIP GTMetrix

With the sudden switch to Lighthouse, the once-revered optimization platform has become unrecognizable and unusable.

Editor’s note: It would seem that the team at GTMetrix has listened to the developers who were impacted by the sudden website optimization platform revision, and has been working tirelessly to improve the proprietary parts of its testing platform to offer more relevant scores. They also worked closely with Valier to address optimization issues on our managed sites, and as a result our sites improved and GTMetrix refined its scoring system. While this post remains relevant as a record of the process that took place, we now hope it also serves as a testament to GTMetrix’s commitment to its customers. We will continue to monitor GTMetrix’s platform updates and to discuss these changes here at valiermedia.com.

Today I went to complete an ongoing optimization project for a client of mine, only to discover that the entire GTMetrix platform has been converted to Lighthouse rather than PageSpeed and YSlow. While I am aware of the “advantages” Google feels Lighthouse offers (they think all of their software is the best), the service has never provided useful optimization reports for developers like myself, who are generally tasked with updating a site until it scores well in a trusted rating system. For years, that trusted rating system has been PageSpeed and YSlow through GTMetrix.

I have been an ardent supporter of GTMetrix for years, and have continuously found the reports offered through PageSpeed and YSlow to be vastly superior to the gross generalizations made by Google’s optimization platform. Google’s PageSpeed testing platform, for example, pitches the need to switch to their proprietary image format, WebP, presenting it as a major issue with website performance when we all know what it is – another effort by Google to use their market power to make more of the internet reliant on their tech. Many browsers don’t even support that image format! After over a decade of working as a developer, I can comfortably say that Google uses its software to pitch its own services at every juncture, even when those services won’t actually serve its customers’ best interests.
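To be fair, if you do decide to serve WebP, it doesn’t have to break browsers that lack support. Here is a minimal content-negotiation sketch (the file names and port are hypothetical placeholders, not anything from a real project) that sends WebP only to browsers that advertise support in the Accept header:

```typescript
// Minimal content-negotiation sketch: send WebP only to browsers that
// include "image/webp" in their Accept header; otherwise fall back to JPEG.
// "hero.webp" / "hero.jpg" are placeholder file names for illustration.
import { createServer } from "http";
import { readFile } from "fs/promises";

createServer(async (req, res) => {
  const wantsWebp = (req.headers.accept ?? "").includes("image/webp");
  const file = wantsWebp ? "hero.webp" : "hero.jpg";
  const type = wantsWebp ? "image/webp" : "image/jpeg";
  try {
    const body = await readFile(file);
    // "Vary: Accept" keeps caches from serving WebP to a JPEG-only browser.
    res.writeHead(200, { "Content-Type": type, "Vary": "Accept" });
    res.end(body);
  } catch {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);
```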

Now let’s talk results.

Many sites that objectively perform well – sites where every available optimization has been made short of stripping them down to nothing – perform terribly on the new GTMetrix rating system. This prevents developers from being able to explain to their clients that their site isn’t actually performing at 30%; that there are a few small things that could be changed, but that it is largely the third-party scripts added to the site (which are completely out of our hands as developers) that make the site appear to be performing about as poorly as possible. Irrational scores like this mean I constantly have to explain that a single image that could be compressed to save a whopping 1 KB of data shouldn’t reduce a site’s image optimization score to 60% – but that’s exactly what it does. What’s worse, Google can then use these asinine scores to determine the page ranking of your website. Say that again – if you don’t use Google’s proprietary image format, they may be pushing your page down in search results.

In GTMetrix’s own words, Lighthouse is designed to rate how a page actually loads for the user, rather than how it is built. That couldn’t be further from the truth. Take a look at this real-world example of two sites I manage:

Site A – scoring 39
Site B – scoring 95

I am on a relatively slow internet connection right now (~12 Mbps) at my country home, and the user experience in loading these two pages is roughly the same. Neither appears particularly slow or fast. But the reality is that Site B has a large 33.1 MB HTML5 video loading in the masthead area that I know for a fact was not optimized properly. It is the slower-loading site, and its performance issues are obvious, measurable, and solvable. Yet according to the new GTMetrix Lighthouse rating system – a system that is supposed to be more accurate – Site B is getting a solid “A” while Site A is getting a very low “E,” a special rating for sites that are worse than an “F.” It would be one thing if the sites were in the same ballpark; that at least would be reasonable. But they aren’t even on the same planet in terms of their scores, so what is the value of GTMetrix to developers now that it doesn’t provide any objective measure of a site’s performance? In their own blog post, they explain how PageSpeed only showed how well a site was built, not how well it actually performs – but it did both, and their own screenshot shows that it did both in very clear terms: three percentage-based ratings for structure and three objective scores for load time, size, and requests. Just look at the vast disparity between the scores of sites that use many of the same optimization techniques and all earn good structure scores.
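And “measurable” is not an exaggeration – you don’t need Lighthouse to spot a 33 MB masthead video. A quick sketch using the standard Resource Timing API, pasted into the browser console on a loaded page, lists the heaviest assets directly:

```typescript
// List the ten largest resources on the current page by transfer size.
// A 33 MB masthead video will sit right at the top of this list.
const resources = performance.getEntriesByType(
  "resource",
) as PerformanceResourceTiming[];

resources
  // Cross-origin resources without a Timing-Allow-Origin header report 0.
  .filter((r) => r.transferSize > 0)
  .sort((a, b) => b.transferSize - a.transferSize)
  .slice(0, 10)
  .forEach((r) => {
    console.log(`${(r.transferSize / 1048576).toFixed(2)} MB  ${r.name}`);
  });
```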

With the move to Lighthouse, it is highly unlikely that I will continue to use GTMetrix for anything, and I will assuredly end up cancelling my account. But that’s not the real issue. This change affects the profitability of every active optimization project for every one of GTMetrix’s loyal customers, and may force them to rework large parts of their business and rewrite marketing materials. Companies that focus solely on optimization now have to purchase credits to run more than three scans, rendering the test-and-contact business model dead in the water. And if you deliver monthly optimization reports, your clients are likely seeing suddenly horrible scores with no explanation as to why they have plummeted.

The bottom line

In today’s development world, developers need to be able to show that they did their job well. But if GTMetrix’s service can’t recognize the difference between a site loading a 33 MB video at the top of the page and a fairly efficient website – or worse, thinks the efficient website is trash (what else would you call a score of 39?) and the unoptimized website is nigh perfection – how can you possibly rely on it to represent your work?

I would rather not have to write such a scathing review, but the truth is that I am far from the only developer who will be negatively affected by this change. My hope is that this article serves as a wake-up call for GTMetrix and, potentially, Google, and that our voices as developers are heard.

We are the ones who are ultimately tasked with optimizing websites. As paid subscribers, why didn’t we receive notifications in our dashboards that this change was coming? Why weren’t we given the option of using the legacy software while we transitioned to another service? (Edit: this was added and announced in late November following backlash from customers, but as with most major updates, customers should have been given the option to switch to Lighthouse, not the other way around.) Both would have been thoughtful moves for their customers, but with this move it is clear that we are not a priority. Just look at the comments they received on their Facebook page announcement – do these look like happy customers?

🎉 Welcome to the new GTmetrix – powered by Lighthouse. Learn more about our new performance metrics, new test…

Posted by GTmetrix on Tuesday, November 17, 2020

Anti-lytics: Let Data Inform Decisions, Not Make Them

The rise of analytics has ushered in a new era of business management. Once only obtainable through expensive market research, data is now available to, and used by, businesses of all shapes and sizes. Google Analytics, when used properly, is one of the most powerful and commonly used analytics tools on the market. It’s also fucking free. Everyone is, or should be, using it. Sadly, many website owners misinterpret their website analytics data and rush off to hire a web developer to fix things that aren’t actually broken.

Many website owners look for improvements in two categories: pageviews and conversions. After all, a store that gets busier and sells more product is growing, right? Perhaps, but if pageviews or conversions are not increasing, does that mean the business is dying? This is where you want to dig deeper into the data and run experiments to discover what is working and what isn’t. Maybe your product is missing a key feature. Maybe the prices are too high. Maybe your traffic is coming from off-topic search terms or bad advertising. All of this can be researched through a careful examination of the existing numbers.
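One simple first step is to put the two numbers side by side as a rate, so growth in one doesn’t hide stagnation in the other. A quick sketch, with invented figures just for illustration:

```typescript
// Conversion rate gives pageviews and conversions context: if traffic grows
// while the rate falls, suspect traffic quality before blaming the product.
// All figures below are invented for illustration.
interface MonthStats {
  label: string;
  pageviews: number;
  conversions: number;
}

const months: MonthStats[] = [
  { label: "Jan", pageviews: 4200, conversions: 63 },
  { label: "Feb", pageviews: 5100, conversions: 61 },
  { label: "Mar", pageviews: 6800, conversions: 59 },
];

for (const m of months) {
  const rate = (m.conversions / m.pageviews) * 100;
  console.log(`${m.label}: ${m.pageviews} views, ${rate.toFixed(2)}% conversion`);
}
// Traffic is up ~60% while conversions are flat, so the conversion rate is
// falling – a hint that the new visitors are the wrong visitors.
```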

Seasons Change, Data Does Too.

First, compare your data trends to previous quarters, then to one year ago, and then to two years ago. Most websites go through natural fluctuations on a seasonal basis. For me, November and December are usually big months, while April and May are usually pretty slow by comparison, likely due to tax season fluctuations. Having been through a few years of this, I now know what to expect, but initially I was caught off guard!
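The mechanics of a seasonal comparison are simple; the discipline is comparing like with like. A hedged sketch, with invented numbers, that judges each month against the same month a year earlier rather than against the month before:

```typescript
// Year-over-year comparison strips out seasonality: April is judged against
// last April, not against a naturally busy November. Figures are invented.
const monthly: Record<string, number[]> = {
  // Jan through Dec
  "2019": [3100, 2900, 3000, 2400, 2300, 2800, 2900, 3000, 3200, 3600, 4800, 5100],
  "2020": [3400, 3200, 3300, 2600, 2500, 3100, 3200, 3300, 3500, 3900, 5200, 5600],
};

const names = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
               "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];

names.forEach((name, i) => {
  const change =
    ((monthly["2020"][i] - monthly["2019"][i]) / monthly["2019"][i]) * 100;
  console.log(`${name}: ${change >= 0 ? "+" : ""}${change.toFixed(1)}% YoY`);
});
```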

New websites haven’t been around long enough to demonstrate how the seasons will impact your analytics data. Your first year, your first quarter, and your first month are all experiments. You don’t yet have a large enough sample size to fully understand the data, so you will have to rely on your professional instincts and short-term experimentation to draw your conclusions.

Where’s Your Traffic Coming From?

Where your traffic is coming from makes a big difference, and this is often reflected in your bounce rate. Think about it this way: if you ran a radio ad for a sale on cars but you actually run a billiards store, you would likely get a bunch of visitors who immediately turn around and leave once they see that you don’t have what they want. Your bounce rate is just that: visitors who bail right away.

One of the best ways to analyze your bounce rate is to break it down by individual traffic source. Start with your highest-traffic pages and highest-bounce-rate pages, and go down the list. Are the visitors arriving through organic searches? Paid advertising? What search terms are attracting them? For example, a user searching for “restaurants near me” who lands on a page about restaurants in a different state is going to leave. These people are hungry. Don’t do that to them, and don’t do that to your bounce rate, even if it means your pageviews are improving.
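If you would rather do this programmatically than click through reports, the grouping is straightforward. A sketch over illustrative session records (in practice you would export these from your analytics tool), where a “bounce” is a single-page session:

```typescript
// Compute a bounce rate per traffic source from raw session records.
// The Session shape here is illustrative, not any particular tool's export.
interface Session {
  source: string;      // e.g. "google / organic", "radio-ad / referral"
  pagesViewed: number; // a single-page session counts as a bounce
}

function bounceRateBySource(sessions: Session[]): Map<string, number> {
  const tally = new Map<string, { total: number; bounces: number }>();
  for (const s of sessions) {
    const t = tally.get(s.source) ?? { total: 0, bounces: 0 };
    t.total += 1;
    if (s.pagesViewed === 1) t.bounces += 1;
    tally.set(s.source, t);
  }
  const rates = new Map<string, number>();
  for (const [source, t] of tally) {
    rates.set(source, (t.bounces / t.total) * 100);
  }
  return rates;
}
```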

Run Experiments Through A/B Testing

If you feel confident that your product is priced properly and your traffic is consistently coming from good sources, then it’s time to analyze the design of your site. To do this effectively, run an A/B test using Google Experiments to compare page style and content variations so that you aren’t left guessing. The results will show you which page variation performed better, helping you understand which parts of your website design are working and which aren’t.
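Whatever tool runs the test, the deciding step is the same: check whether the gap between variants is bigger than chance. A minimal sketch of a two-proportion z-test (this is just the underlying statistic, not Google’s tooling, and the numbers are invented):

```typescript
// Two-proportion z-test: is variant B's conversion rate genuinely different
// from variant A's, or is the gap just noise?
function abTestZScore(
  convA: number, visitorsA: number,
  convB: number, visitorsB: number,
): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// |z| > 1.96 is roughly 95% confidence that the variants really differ.
const z = abTestZScore(48, 1000, 74, 1000); // ≈ 2.43 for these invented counts
console.log(Math.abs(z) > 1.96 ? "B beats A (significant)" : "keep testing");
```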

One good way to see if your website design is impacting your sales is to run a short A/B test where you put your product on a very simplified page. I’m not talking about a page that sucks, just a page that is ultra-minimalist – like this page. A page like this leaves little for the user to be turned off by in terms of design, so if it performs as poorly as your main product page, you can bet that your website design isn’t impacting your conversion rate as much as your product is. Here’s the thing – a good web designer or developer is going to push you to isolate variables and take a hard look at your entire online presence rather than jumping straight into redesigning the site. That isn’t because they are proud of the site and unwilling to change it; it’s because they are willing to tell you not to spend your hard-earned money on their services if those services won’t solve the problem.

…and cut yourself some slack!

The internet is an extremely competitive and confusing place. In 2019, there are 1.8 billion websites, and 51.8% of all internet traffic comes from bots. All Cyberdyne jokes aside, this means that much of the traffic coming to your site could be completely irrelevant. In the next article, I will show you how to configure Google Analytics to filter out bot traffic and get closer to the truth!

– Merritt

Valier (pronounced “Va-leer”) is a boutique graphic design and website development studio focused on creating unique projects for unique clients. We work with companies and individuals that are pushing the boundaries of their industries and are looking for a partner in media development who can inject life and creativity into their marketing presence. With over 10 years of experience in the graphic design and website development industries, Merritt Lentz (Founder) has a proven track record of producing successful and innovative projects for a wide variety of clients, ranging from artists and ski companies to government agencies and payment processors. Regardless of the size or complexity of your vision, we will help you home in on a digital actualization of that vision and deliver a product that is rich and captivating.