Mike’s Tuesday Tips:
Last week was about identifying CLS, and I also mentioned how to identify LCP (Largest Contentful Paint) on your pages. This week, we are going to identify common issues that cause a slow LCP.
LCP is probably the simplest of the Core Web Vitals to deal with.
Having addressed this on a few hundred pages over the past year now, I can tell you that the most common cause of a slow LCP loading time is a slow server response.
If you are on a shared hosting environment, especially ones that have been oversold (and often overhyped) - I’m looking at you Siteground - you can tweak things all you want, but there is only going to be so much speed you can squeeze out of the server.
Want to figure out if your server has been oversold? Take a look at how many domains are hosted on your same server using this tool: https://viewdns.info/reverseip/
There are tons of other tools out there like this. If you really want to investigate, you can run all the sites through something like Semrush and take a look at their traffic estimates. You may have one or two sites on the same server getting tons of traffic and hogging up a huge amount of resources.
If you really care about pagespeed, one of the best things you can do is get away from shared web hosting.
Before someone comments about how they got good scores with shared hosting on Core Web Vitals, PageSpeed Insights, GTmetrix, or any other speed test you want to mention… sure, I believe you. However, think of just how much better your pages would load if you moved to a decent VPS or dedicated hosting solution.
There is an example below from a new client I just started working with. This site is hosted on WPEngine, which is supposed to be one of the better hosts out there. We are getting server response times ranging from 1.5-2.2 seconds. Fixing that alone, without any other tweaks, will bring their LCP score in line with Google’s standards.
Before we go crucifying WPEngine, there is also a possibility that the problem is on the development side of the site design. There might be some processes being called that have to complete server side before anything loads. With a lot of dynamic content, that can happen sometimes.
A database cleanup may also solve some of that response time.
If you do not want to switch hosts, another solution that may work is to use a content delivery network (CDN). Your mileage and experience may vary with these. For example, I have had some sites I have put on Cloudflare and saw drastic improvements. I have used it for other sites and it has actually slowed them down.
For any styles that are critical to your LCP and/or above the fold content, you can inline them, which means you place the style elements directly in the <head> of the page.
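Here is a rough sketch of what that looks like. The file path and the hero styles are hypothetical placeholders; the inline `<style>` block holds just the critical above-the-fold rules, while the full stylesheet loads without blocking the first paint (the preload/onload pattern):

```html
<head>
  <!-- Critical, above-the-fold styles inlined: no extra request blocks rendering -->
  <style>
    /* Hypothetical hero styles -- swap in your own critical CSS */
    .hero { min-height: 60vh; background: #1a1a2e; color: #fff; }
    .hero h1 { font-size: 2.5rem; margin: 0; }
  </style>
  <!-- Full stylesheet loads in the background instead of blocking first paint -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

The `<noscript>` fallback is there so visitors with JavaScript disabled still get the full stylesheet.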
The next thing I see frequently slowing down LCP times are images and/or videos. Make sure that you optimize and compress your images.
Browsers load full images before adjusting them to the proper size to be viewed. If you are using an image that is 2000 x 1330 pixels, but it is only viewed at 600 x 400 in your page design, the browser is going to load that full 2000 x 1330 sized image. Before you bother with compressing anything, make sure you are using appropriate image sizes. Resize the image and then upload it back to your server.
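If you would rather let the browser pick the right file for each screen, `srcset` is one way to do it. This is a sketch with hypothetical file names; the explicit `width` and `height` attributes also reserve space for the image, which helps your CLS score at the same time:

```html
<!-- Hypothetical file names; serve a size close to what the layout actually displays -->
<img
  src="/images/hero-600.jpg"
  srcset="/images/hero-600.jpg 600w,
          /images/hero-1200.jpg 1200w,
          /images/hero-2000.jpg 2000w"
  sizes="(max-width: 640px) 100vw, 600px"
  width="600" height="400"
  alt="Hero image">
```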
You can also lower the image quality. Load it up in Photoshop or something like GIMP and export it at a lower quality setting. (One note: the pixels/inch setting only matters for print. For the web, file size comes down to the pixel dimensions and the compression level, so focus on those.) Many times you will find that a lower-quality export still looks great on your web page, and it will be a much smaller file.
One little trick I sometimes will use if I notice that changing resolutions on an image causes the quality to take a noticeable dip is I will make it a part of the design. I will toss a dark overlay over it and/or make it slightly blurry. I’ll do this if the image is being used as the background for a section. It helps the text over it to pop out anyhow.
If you are using WordPress, there are a lot of plugins and options out there for compressing images. There are also options for serving next-gen image formats, mainly WebP. WebP is not supported in some older browsers, so make sure you have JPG or some other format as a backup to display.
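If your plugin or CDN is not handling the fallback for you, the `<picture>` element does it in plain HTML. A minimal sketch, with hypothetical file names:

```html
<picture>
  <!-- Browsers that understand WebP use this source... -->
  <source srcset="/images/team-photo.webp" type="image/webp">
  <!-- ...everything else falls back to the JPG -->
  <img src="/images/team-photo.jpg" alt="Our team" width="800" height="533">
</picture>
```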
If you are using a video or slideshow, stop it. They are not a great user experience on mobile devices.
Lastly, use a tool like GTMetrix to investigate the loading order of elements on your page. I hate GTMetrix scores and the fact that they default to desktop loading. GTMetrix is pretty useless for everything other than its waterfall display. There are other tools that have waterfall displays, but I find GTMetrix the easiest to work with.
Take a look at what is loading before your LCP element. Are those things necessary? Is there anything that can be deferred or just eliminated?
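Two of the easiest wins here are hinting the browser to grab your LCP image early and pushing non-critical scripts out of the way. A sketch, assuming a hero image and a couple of scripts that do not need to run before the page renders (paths are hypothetical):

```html
<head>
  <!-- Fetch the (hypothetical) LCP hero image as early as possible -->
  <link rel="preload" as="image" href="/images/hero-1200.jpg">
  <!-- defer: download in parallel, but execute only after HTML parsing finishes -->
  <script src="/js/analytics.js" defer></script>
  <script src="/js/slider.js" defer></script>
</head>
```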
I’ve shaved significant times off of LCP scores just by getting rid of Google Fonts. Google Fonts are great, but they have to load from Google’s servers. Then if you use different font weights that’s an extra library to load.
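If you want to keep the typeface without the trip to Google’s servers, self-hosting the font file is an option. A hedged sketch, assuming you have downloaded a WOFF2 copy of the font (file path is hypothetical):

```html
<style>
  /* Hypothetical self-hosted font file: one request to your own server,
     no round trip to Google's */
  @font-face {
    font-family: "Open Sans";
    src: url("/fonts/open-sans-regular.woff2") format("woff2");
    font-weight: 400;
    font-display: swap; /* show fallback text immediately instead of waiting */
  }
  body { font-family: "Open Sans", Arial, sans-serif; }
</style>
```

Note that every extra weight or style is another file, so only include the ones you actually use.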
Another common one that slows down pages are things like Font Awesome icon libraries. A lot of page builders like Elementor will give you the option to use icons from Font Awesome, Themify, or Ionicons.
The problem is that in order to use just one icon, the entire library is loaded. Use a single image instead. Some builders, like Oxygen and Bricks, will let you use your own SVG files as icons. I think Elementor just added that option recently too.
The advantage of using your own is that the browser only has to load what you are using and not an entire library of icons.
I see this happen a lot with local business websites. They often like to use one of those phone icons beside their phone number in the header. Sometimes an email icon beside an email address too or the Google Places pin beside an address.
Because it loads in the header, this usually will slow down the LCP time. Use your own icons instead and speed it up.
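For something as small as a phone icon, you can even inline the SVG right in the markup: zero extra requests, no icon-font library. A sketch (the phone number and the icon path data are placeholders; grab a real icon from any free SVG set):

```html
<!-- A single inline SVG phone icon: no icon-font library, no extra request -->
<a href="tel:+15555550123">
  <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"
       width="16" height="16" fill="currentColor" aria-hidden="true">
    <!-- Placeholder path data: substitute the icon of your choice -->
    <path d="M6.6 10.8a15.1 15.1 0 0 0 6.6 6.6l2.2-2.2a1 1 0 0 1 1-.25
             c1.1.37 2.3.57 3.6.57a1 1 0 0 1 1 1V20a1 1 0 0 1-1 1
             C10.6 21 3 13.4 3 4a1 1 0 0 1 1-1h3.5a1 1 0 0 1 1 1
             c0 1.3.2 2.5.57 3.6a1 1 0 0 1-.25 1l-2.22 2.2z"/>
  </svg>
  (555) 555-0123
</a>
```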
How to Identify Cumulative Layout Shifts
With Core Web Vitals upon us, people are scrambling to optimize their sites. Mostly a waste of time, but it is what it is.
Out of the 3 Core Web Vitals, cumulative layout shift (CLS) is the one I have seen people having the most trouble identifying. Obviously, seeing your score is easy in PageSpeed Insights, Lighthouse, or web.dev, but how do you identify what is actually causing the shifts on your pages?
It’s pretty simple actually. To do so, you are going to want to use Chrome’s Web Developer Tools.
- Open the page you want to check in Chrome.
- Click the dropdown menu on Chrome.
- Go to More tools >> Developer tools
This should open a panel that shows Lighthouse and a bunch of other options along a menu at the top.
- Click on the Performance tab at the top.
- Then click the reload icon ("Start profiling and reload page").
- Let it load, and you will end up with something like the attached screenshot.
If you have cumulative layout shifts happening, they will appear as red bars under the Experience row. You can zoom in on them a little bit by hovering your mouse over that area and using the mouse scroll wheel. (At least on PCs. I have no idea how to do it on inferior Mac machines.)
You can also click and drag the whole thing around if things start moving off the screen as you zoom in.
If you hover over the red bars, in the website pane on the left it will highlight where that shift is happening.
By the way, while you are here, you can also identify your Largest Contentful Paint (LCP) element as well. In the Timings row, you will see a black box labeled LCP. Hover over it and it will highlight your LCP element.
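If you prefer to see this in the console instead of the Performance panel, the same data is exposed through `PerformanceObserver`. A sketch you could drop into a page (or paste the script body into the DevTools console); the `layout-shift` and `largest-contentful-paint` entry types are Chromium-only:

```html
<script>
  // Log each layout shift and its source elements (Chromium-only API)
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // Shifts right after user input don't count against CLS
      if (!entry.hadRecentInput) {
        console.log("Layout shift:", entry.value, entry.sources);
      }
    }
  }).observe({ type: "layout-shift", buffered: true });

  // Log the current LCP candidate; the last entry reported is the final one
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    const lcp = entries[entries.length - 1];
    console.log("LCP element:", lcp.element, "at", lcp.startTime, "ms");
  }).observe({ type: "largest-contentful-paint", buffered: true });
</script>
```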
Mike’s Tuesday Tips:
When should you disavow links?
Back in 2012, Google shook up the link building market with two massive actions. First, there was an enormous push to take down popular public networks. Anyone remember Build My Rank or Authority Link Network?
Then in April, they unleashed the Penguin algorithm filter and sent many SEOs running around with their hair on fire.
The Penguin algorithm was harsh. Probably too harsh to be honest. It weaponized link building. It was risky, but you could potentially get competitors penalized by throwing spammy links at them. I say it was risky because Google’s algorithm was far from perfect. You could just as easily strengthen their position as harm it.
While the Penguin algorithm did a great job in many cases of punishing sites using low-quality links, there were also a lot of innocent sites caught in the mix, along with sites that had hired an SEO without realizing the SEO was building spammy links.
As a result, Google released its Disavow Tool in October of that year.
Fun fact: Did you know that Bing actually released a Disavow Tool before Google? Yep. Bing’s came out in September of 2012.
Since its release, people have debated its use. Early on, many of us cautioned against using it. Google generally does not tell you which links they have flagged as bad, except in some cases of manual penalties where they may give you a few examples.
Overuse can actually hurt your rankings.
(Some of us also suggested caution because we saw it as Google crowdsourcing to fix a problem they couldn’t figure out on their own. Basically they were saying, “Hey, why don’t you tell us which links are bad that you have been building? In exchange, here is a get out of jail free card.”
I think our concerns were valid. A couple of years ago Google announced that they can pretty well identify bad links on their own now and just ignore them. Where do you think the data came from to train their AI and machine learning algorithms to do that?)
Matt Cutts made a great analogy for how to use the tool. I’m paraphrasing, but he said you should use it like a scalpel, not a machete.
There are only two cases where you should use the Disavow Tool.
The first case is when you have received a manual penalty from Google related to link building. If this happens, you should try to actually have the offending links removed by those websites and fall back on the Disavow Tool for the ones you cannot get removed.
The second case where you should use the Disavow Tool is when you see a massive drop in rankings AND you have seen some low quality links starting to pile up or maybe there was a recent influx of low quality links.
If you have a page or pages hit by the Penguin filter because of bad links, you won’t see slight ranking drops. If you see drops of just a few spots, it’s not your links. You won’t drop from #1 to #3. You will see something more like drops of 50 spots or more. Sometimes you will drop out of the top 100 completely.
In these cases, again, the best solution is to try to get links removed, but in cases involving hundreds or thousands of spammy links coming in, that will probably not work.
You can use the Disavow Tool.
How do you know which links to disavow?
Well, Semrush has a great Toxicity filter you can look at, but do not just disavow all links it identifies as ‘toxic’. Use this filter as an indicator for links you should take a look at yourself.
Only disavow links you have manually inspected yourself.
Do not use 3rd party metrics like DA to identify low quality links. DA has nothing to do with the quality of a link (nor does any other 3rd party metric). If anything, those metrics are trying to give you a gauge for the potential strength of a link. Strength and quality are not the same thing.
How do you recognize low quality or spammy links? Well, it’s a lot like United States Supreme Court Justice Potter Stewart famously said about pornography, “I know it when I see it.”
Is the content a jumbled mess? Is the link and content at all relevant to what you do? Does the page even load? In short, if a prospect who had never heard of you came across the page and saw your link, would it hurt your brand image? Would you be embarrassed to be mentioned on that page?
I consider links like blog comments, forum posts, social bookmarks, document sharing sites, and all insignificant wikis to be spam worth disavowing too.
Lastly, if you do decide to disavow links, remember that your disavow file is kind of a living document. When you upload a file, it replaces the old one. The Disavow Tool does not store the old data. If you decide to disavow additional links, you should keep adding on to the same document and upload that file.
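For reference, the disavow file itself is just a UTF-8 plain-text file, one entry per line. A sketch with made-up domains:

```text
# Hypothetical disavow.txt -- lines starting with # are comments Google ignores

# Disavow a single URL
http://spammy-bookmarks.example/page/123

# Disavow every link from an entire domain
domain:link-farm.example
```

The `domain:` prefix is usually the safer bet for genuinely spammy sites, since it covers every page on them at once.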