Showing results for tags 'mikes tuesday tips'.

Found 19 results

  1. Mike’s Tuesday Tips: Last week was about identifying CLS, and I mentioned how to identify LCP (Largest Contentful Paint) on your pages. This week, we are going to identify common issues that cause a slow LCP. LCP is probably the simplest of the Core Web Vitals to deal with.

Having addressed this on a few hundred pages over the past year, I can tell you that the most common cause of a slow LCP loading time is a slow server response. If you are on a shared hosting environment, especially one that has been oversold (and often overhyped) - I’m looking at you, Siteground - you can tweak things all you want, but there is only so much speed you can squeeze out of the server.

Want to figure out if your server has been oversold? Take a look at how many domains are hosted on your same server using this tool: https://viewdns.info/reverseip/ There are tons of other tools like this. If you really want to investigate, you can run all the sites through something like Semrush and look at their traffic estimates. You may have one or two sites on the same server getting tons of traffic and hogging a huge amount of resources.

If you really care about pagespeed, one of the best things you can do is get away from shared web hosting. Before someone comments about how they got good scores with shared hosting on Core Web Vitals, Pagespeed Insights, GTMetrix, or any other speed test you want to mention... sure, I believe you. However, think of just how much better your pages would load if you moved to a decent VPS or dedicated hosting solution.

There is an example below for a new client I just started working with. This site is hosted on WPEngine, which is supposed to be one of the better hosts out there. We are getting server response times ranging from 1.5-2.2 seconds. Fixing that alone, without any other tweaks, will bring their LCP score in line with Google’s standards.

Before we go crucifying WPEngine, there is also a possibility that the problem is on the development side of the site design. There might be some processes being called that have to complete server side before anything loads. With a lot of dynamic content, that can happen sometimes. A database cleanup may also improve some of that response time.

If you do not want to switch hosts, another solution that may work is a content delivery network (CDN). Your mileage and experience may vary with these. For example, I have put some sites on Cloudflare and seen drastic improvements; on other sites it has actually slowed things down.

The second big issue causing slow LCP load times is render-blocking JavaScript and CSS. A browser will pause parsing HTML when it encounters external stylesheets and synchronous JavaScript tags. To speed up the loading of your LCP, you want to defer any non-critical JavaScript and CSS files. You should also minify and compress your CSS and JavaScript files. For any styles that are critical to your LCP and/or above-the-fold content, you can inline them, which means you place the style elements directly in the <head> of the page.

The next things I see frequently slowing down LCP times are images and videos. Make sure that you optimize and compress your images. Browsers load full images before adjusting them to the proper size to be viewed. If you are using an image that is 2000 x 1330 pixels, but it is only viewed at 600 x 400 in your page design, the browser is going to load that full 2000 x 1330 sized image.
Before you bother with compressing anything, make sure you are using appropriate image sizes. Resize the image and then upload it back to your server. You can also lower the image quality. Load it up in Photoshop or something like GIMP and change the resolution by adjusting the pixels/inch. Many times you will find that a lower resolution still looks great on your web page, and it will be a much smaller file.

One little trick I sometimes use, if changing the resolution on an image causes the quality to take a noticeable dip, is to make it part of the design. I will toss a dark overlay over it and/or make it slightly blurry. I’ll do this if the image is being used as the background for a section. It helps the text over it pop anyhow.

If you are using WordPress, there are a lot of plugins and options out there for compressing images. There are also options for serving next-gen formats, mainly WebP images. WebP images are not supported in all browsers, so make sure you have JPG or some other format as a fallback to display.

If you are using a video or slideshow, stop it. They are not a great user experience on mobile devices.

Lastly, use a tool like GTMetrix to investigate the loading order of elements on your page. I hate GTMetrix scores and the fact that they default to desktop loading. GTMetrix is pretty useless for everything other than its waterfall display. There are other tools that have waterfall displays, but I find GTMetrix the easiest to work with. Take a look at what is loading before your LCP element. Are those things necessary? Is there anything that can be deferred or just eliminated?

I’ve shaved significant time off of LCP scores just by getting rid of Google Fonts. Google Fonts are great, but they have to load from Google’s servers. Then, if you use different font weights, each weight is an extra file to load.

Another common culprit that slows down pages is icon libraries like Font Awesome. A lot of page builders like Elementor will give you the option to use icons from Font Awesome, Themify, or Ionicons. The problem is that in order to use just one icon, the entire library is loaded. Use a single image instead. Some builders, like Oxygen and Bricks, will let you use your own SVG files as icons. I think Elementor just added that option recently too. The advantage of using your own is that the browser only has to load what you are using and not an entire library of icons.

I see this happen a lot with local business websites. They often like to use one of those phone icons beside their phone number in the header, sometimes an email icon beside an email address too, or the Google Places pin beside an address. Because it loads in the header, this usually will slow down the LCP time. Use your own icons instead and speed it up.
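To make the render-blocking and image advice above concrete, here is a minimal sketch of the relevant markup (the file names and class names are made up for illustration): critical styles inlined in the <head>, non-critical JavaScript deferred so it does not block parsing, and a WebP image served with a JPG fallback at the size it is actually displayed.

    <head>
      <style>
        /* critical, above-the-fold styles inlined so the browser doesn't wait on a stylesheet */
        .hero { background: #111; color: #fff; }
      </style>
      <!-- non-critical script: deferred so it doesn't block HTML parsing -->
      <script src="/js/analytics.js" defer></script>
    </head>
    <body>
      <!-- WebP with a JPG fallback for browsers that don't support it, sized to match the design -->
      <picture class="hero">
        <source srcset="/images/hero-600x400.webp" type="image/webp">
        <img src="/images/hero-600x400.jpg" width="600" height="400" alt="Hero image">
      </picture>
    </body>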
  2. How to Identify Cumulative Layout Shifts

With Core Web Vitals upon us, people are scrambling to optimize their sites. Mostly a waste of time, but it is what it is. Out of the 3 Core Web Vitals, cumulative layout shift (CLS) is the one I have seen people having the most trouble identifying. Obviously, seeing your score is easy in Pagespeed Insights, Lighthouse, or web.dev, but how do you identify what is actually causing any shifts on your pages?

It’s pretty simple actually. To do so, you are going to want to use Chrome’s Developer Tools.

-Open the page you want to check in Chrome.
-Click the dropdown menu on Chrome.
-Go to More tools >> Developer tools

This should open a screen that shows Lighthouse and a bunch of other options along a menu at the top.

-Click on the Performance tab at the top.
-Then click on what looks like a refresh icon.
-Let it load, and you will end up with something like the attached screenshot.

If you have cumulative layout shifts happening, they will appear as red bars under the Experience row. You can zoom in on them a little bit by hovering your mouse over that area and using the mouse scroll wheel. (At least on PCs. I have no idea how to do it on inferior Mac machines.) You can also click and drag the whole thing around if things start moving off the screen as you zoom in. If you hover over the red bars, the website pane on the left will highlight where that shift is happening.

By the way, while you are here, you can also identify your Largest Contentful Paint (LCP) element as well. In the Timings row, you will see a black box labeled LCP. Hover over it and it will highlight your LCP element.
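If you prefer a programmatic check to the Performance panel, you can paste a small snippet like this into the DevTools console. It is just a sketch using the standard PerformanceObserver API: it logs each layout shift (with the elements that caused it) and the current LCP element.

    // log every layout shift that was not caused by recent user input
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        if (!entry.hadRecentInput) {
          console.log('Layout shift, score:', entry.value, entry.sources);
        }
      }
    }).observe({ type: 'layout-shift', buffered: true });

    // log the current LCP candidate element and when it rendered
    new PerformanceObserver((list) => {
      const entries = list.getEntries();
      const last = entries[entries.length - 1];
      console.log('LCP element:', last.element, 'at', last.startTime, 'ms');
    }).observe({ type: 'largest-contentful-paint', buffered: true });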
  3. Mike’s Tuesday Tips: When should you disavow links?

Back in 2012, Google shook up the link building market with two massive actions. First, there was an enormous push to take down popular public networks. Anyone remember Build My Rank or Authority Link Network? Then in April, they unleashed the Penguin algorithm filter and sent many SEOs running around with their hair on fire.

The Penguin algorithm was harsh. Probably too harsh, to be honest. It weaponized link building. It was risky, but you could potentially get competitors penalized by throwing spammy links at them. I say it was risky because Google’s algorithm was far from perfect. You could just as easily strengthen their position as harm it. While the Penguin algorithm did a great job in many cases of punishing sites using low quality links, there were also a lot of innocent sites caught in the mix, or sites whose owners had hired an SEO without understanding that the SEO used spammy link building. As a result, Google released its Disavow Tool in October of that year.

Fun fact: Did you know that Bing actually released a Disavow Tool before Google? Yep. Bing’s came out in September of 2012.

Since its release, people have debated its use. Early on, many of us cautioned against using it. Google generally does not tell you which links they have flagged as bad, except in some cases of manual penalties where they may give you a few examples. Overuse can actually hurt your rankings. (Some of us also suggested caution because we saw it as Google crowdsourcing to fix a problem they couldn’t figure out on their own. Basically they were saying, “Hey, why don’t you tell us which links are bad that you have been building? In exchange, here is a get out of jail free card.” I think our concerns were valid. A couple of years ago Google announced that they can pretty well identify bad links on their own now and just ignore them. Where do you think the data came from to train their AI and machine learning algorithms to do that?)

Matt Cutts made a great analogy for how to use the tool. I’m paraphrasing, but he said you should use it like a scalpel, not a machete. There are only two cases where you should use the Disavow Tool.

The first case is when you have received a manual penalty from Google related to link building. If this happens, you should try to actually have the offending links removed by those websites and fall back on the Disavow Tool for the ones you cannot get removed.

The second case is when you see a massive drop in rankings AND you have seen low quality links piling up or a recent influx of them. If you have a page or pages hit by the Penguin filter because of bad links, you won’t see slight ranking drops. If you see drops of just a few spots, it’s not your links. You won’t drop from #1 to #3. You will see something more like drops of 50 spots or more. Sometimes you will drop out of the top 100 completely. In these cases, again, the best solution is to try to get links removed, but in cases involving hundreds or thousands of spammy links coming in, that will probably not work. You can use the Disavow Tool.

How do you know which links to disavow? Well, Semrush has a great Toxicity filter you can look at, but do not just disavow all links it identifies as ‘toxic’. Use this filter as an indicator for links you should take a look at yourself. Only disavow links you have manually inspected yourself. Do not use 3rd party metrics like DA to identify low quality links.
DA has nothing to do with the quality of a link (nor does any other 3rd party metric). If anything, those metrics are trying to give you a gauge for the potential strength of a link. Strength and quality are not the same thing.

How do you recognize low quality or spammy links? Well, it’s a lot like what United States Supreme Court Justice Potter Stewart famously said about pornography: “I know it when I see it.” Is the content a jumbled mess? Is the link and content at all relevant to what you do? Does the page even load? In short, if a prospect who had never heard of you came across the page and saw your link, would it hurt your brand image? Would you be embarrassed to be mentioned on that page? I consider links like blog comments, forum posts, social bookmarks, document sharing sites, and all insignificant wikis to be spam worth disavowing too.

Lastly, if you do decide to disavow links, remember that your disavow file is kind of a living document. When you upload a file, it replaces the old one. The Disavow Tool does not store the old data. If you decide to disavow additional links, you should keep adding on to the same document and upload that file.
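For reference, the disavow file itself is just a plain text file you upload through Search Console: one URL or one domain per line, with # for comments. A made-up example, showing both individual URLs and whole domains:

    # Spammy links we could not get removed
    # Individual URL
    http://spam-example-blog.com/great-post-check-this-out/
    # Everything from an entire domain
    domain:cheap-links-example.net
    domain:bookmark-spam-example.org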
  4. There are two very simple and very logical reasons why Google and other search engines do not factor bounce rate into their ranking algorithms.

The first reason is just a matter of access. Google does not have access to bounce rate data for many of the webpages in existence. For some reason, many people seem to think of Google as this omnipotent power that sees and knows all. That is just not the case. There are over 200 ranking signals in Google’s algorithm, all of which have different weightings. When a search query is made, Google is pulling data from its index and comparing all of the websites it has indexed based on those 200 signals. If Google were using bounce rate data, how would the algorithm compare a webpage where it has bounce rate data against a webpage where it has none? Which one is performing “better” for that ranking signal?

Still not convinced? Fine. Let’s look at the second reason that Google is not using bounce rate in its ranking algorithm. Bounces are not always bad. They are not always a signal that there is something wrong with the page. For some reason, many marketers have this stigma stuck in their head that all bounces are bad. They are not.

Let’s say you are running an emergency plumbing service and repair business. Someone in your community has a toilet that has suddenly started to overflow and they cannot fix it. They search for a local plumber in Google and see your page ranking first. They click the search result, which brings them to the home page of your site. They like what they see and pick up the phone to call you (or your office) to see how fast you can help them out with their problem. They never visited another page on your site. They will register as a bounce, but they did exactly what you wanted them to do and they found exactly what they needed, right? Your webpage converted them immediately into a phone call and a possible job. That’s a good thing. So why should Google see it differently or ding your site for that?

The same thing could be said if I am running an affiliate site. Usually an affiliate site is set up to drive traffic to a landing page and get visitors to click on an affiliate link. If they do not browse around on your site but click on the link, they are going to register as a bounce. Again, there is nothing wrong with that. They did exactly what you were hoping they would do.

We obviously do not have access to their analytics to prove it, but look at a site like Wikipedia. I would venture a guess that their bounce rate is quite high. People generally end up at Wikipedia because they were looking for an answer to a specific query and one of Wikipedia’s pages came up. They visit the page and find the answer they were looking for. Some might click on an internal link on the page if they see something that interests them. The vast majority most likely do not and simply leave. Yet, Wikipedia ranks for everything.

Does That Mean Bounce Rate Data is Useless?

No. Not at all. Bounce rate data is useful for you, not for search engines. A high bounce rate could be indicative of a problem on a webpage. It really depends on what type of website you are running and what it is you are trying to get visitors to do. If you are running an ecommerce site where a particular page is bringing in a lot of traffic, but visitors are leaving without browsing other products, adding anything to their cart, etc., then there is likely something wrong with that page or the traffic coming to that page.
Even then, believe it or not, it may not be a bad thing. You always want to take a closer look. I’ve relayed this story before, but I will share it again here. Before you go reworking a whole page or website, it is important to understand where the bounces are coming from. Who is bouncing, how did they find your site, and what pages are they bouncing from?

I was looking at a client’s website one time and noticed that the bounce rate across the site was 43%. Most of the pages fit around that number, but there was one page where the bounce rate was 89%. That was unusual. Average time on the site was over 6 minutes, but on this particular page it was under 30 seconds. I took a closer look at the analytics and found that search traffic was bouncing from that page at a much, much higher rate than traffic from other sources. Generally, if there is something wrong with the page, the bounce rate will be consistent among all sources of traffic. This was not the case.

Through some digging, we found that the page was not only ranking highly for our target keyword, but it was also ranking highly for another keyword that was similar in wording but unrelated to the page. In other words, the words in the phrases were close, but the meanings were much different. I cannot reveal the client’s site, but the difference in keyword phrases would be something like doggy style versus styles of dogs. The words are close, but have two completely different meanings. The targeted phrase was searched about 500 times per month on average. The untargeted phrase was searched about 12,000 times per month. That’s why the percentage of bounces was so high.

In this situation, it was nothing to worry about. The bounces were coming from untargeted traffic. This is a perfect example of why you really need to take a close look at what the bounces are actually telling you.
  5. There are some big misconceptions out there about how nofollow works, so let’s clear it up.

The first image is a simplified version of how a page passes on authority to other pages through its links. Each page has a set amount of linkjuice that it can pass on through its links. When nofollow was first introduced, it blocked that linkjuice from passing through links carrying the nofollow tag, and it would instead be redistributed among the remaining links on the page, making them stronger, as shown here:

Many of us used this for what was known as PageRank sculpting. We could control the flow of linkjuice throughout our sites to boost the pages we really wanted to rank. Of course, Google didn’t like that, so in 2009 they changed how they handled nofollow. Here is how it is done now:

Linkjuice is still allocated to links tagged as nofollow, but it no longer gets redistributed among the remaining links on the page, and it does not get credited to the target page they are linking to either. This is why it is a bad idea to nofollow internal links. You are actually bleeding out linkjuice by doing so. For some reason, people still think Google treats nofollow as illustrated in the first image, but that has not been the case since early 2009.

Then there was the update in March of 2020, where Google again changed how they treat the nofollow tag. Up until then, it was treated as a directive. With the latest update, they instead treat it as a hint or request. They make up their own mind whether to treat a link as nofollow or to ignore the tag. They will never tell you if they are obeying the nofollow tag on any given link, so we have no idea if a link is really nofollow or not. They also added additional identifiers they want webmasters to use to identify sponsored links, affiliate links, etc.

Can you still sculpt PageRank? Some of the min/maxers out there who really want to squeeze out every little bit of value they can have found ways to sculpt PageRank even after Google changed how they handle nofollow.

For a long time, Google had trouble parsing JavaScript. A common technique was to put lower-value links inside JavaScript code so that Google could not see them. People would do this for links to things like contact us, privacy, and terms of service pages. Google has gotten better at reading JavaScript, so this method really does not work anymore.

The other way it was commonly done was to use iframes. Googlebot always skipped over iframes, so you could use them to hide links with less importance and sculpt PageRank that way. For years, the footer and parts of the header of Bruce Clay’s site used iframes to do this. Google does seem to read the content inside iframes these days, although I have seen some tests where they did so inconsistently. This method could still work, but it’s just not 100% reliable.
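As a quick reference for the identifiers mentioned above, this is what they look like in the HTML. The rel values can be combined, and since the 2020 change Google treats all of them as hints rather than directives (URLs here are made up):

    <!-- plain nofollow: a hint that you don't vouch for the target -->
    <a href="https://example.com/some-page/" rel="nofollow">some page</a>
    <!-- paid or sponsored placements -->
    <a href="https://example.com/partner-offer/" rel="sponsored">partner offer</a>
    <!-- user-generated content such as comments or forum posts -->
    <a href="https://example.com/profile/" rel="ugc">user profile</a>
    <!-- values can be combined -->
    <a href="https://example.com/affiliate-link/" rel="sponsored nofollow">affiliate link</a>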
  6. Mike’s Tuesday Tips: Should you noindex category pages?

I see this question come up a lot in regard to WordPress, but the situation would be similar no matter what CMS you might be using. It depends on how you are using your categories. Most sites I see are using categories as part of their navigation or a sub-navigation. In those cases, you absolutely should not noindex the category pages.

Different people from Google have said slightly different things about this, but there are two messages we have heard pretty consistently over the past few years. First, if you noindex category pages, Google will likely treat them as soft 404s eventually. Not a huge deal, but it can trigger errors in Search Console. Just be aware of that. Second, over time, if you noindex category pages, Google will treat the links on those pages as nofollow.

This is why I say it depends how you are using your category pages. If you have links pointing to your category pages (like you would if you use them in any type of navigation menu), you are pushing link equity into those pages, but nothing is coming back out of them. You are bleeding link equity. This can harm the internal link structure of your site.

Simple rule of thumb: Do you have category pages that visitors might land on by following a link somewhere on your site? If the answer to that question is “yes”, then do not noindex them. If there is no way for visitors to find your category pages other than through your sitemap or by typing the URL directly into their browser, then it does not really matter whether you noindex them or not.

Objections: I often hear people say that they do not want to index their category pages for 1 of 2 reasons:

Reason 1 - The page is low quality and full of nothing but duplicate content.
Solution: Then make your category pages into something useful. Build them out more. Include some static content on the pages, not just post excerpts.

Reason 2 - A category page is outranking the primary page they want to rank for a keyword.
Solution: Do a better job optimizing your target page. It should not be difficult to outrank your own category page. Or push them both up higher and get them both ranking highly.
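For anyone unsure what the noindex in question actually is, it is just a robots meta tag served on the category page (SEO plugins typically add it when you toggle a category to noindex). A sketch of the two states:

    <!-- what a noindexed category page sends; over time Google treats its links as nofollow -->
    <meta name="robots" content="noindex, follow">
    <!-- a category page you actually link to should simply stay indexable (this is the default) -->
    <meta name="robots" content="index, follow">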
  7. This is an easy one, but one I get asked about a lot. How many links should you build per day?

There is only one correct answer: as many good links as you can possibly get each day. Period. There are no exceptions. No buts. I don't care if the site is brand new or 20 years old. Google does not care about link velocity.

Notice that I said good links. If you are using spam like blog comments, profile links, social bookmarks, etc., then yes, link velocity matters, because the faster you build them, the faster you are likely to tip over the Penguin threshold. On the other hand, if you go slow, you will likely never rank anyhow.

On top of the fact that Google doesn't care how many links you build per day, they also are not going to find all of your links at once anyhow. Some they might find the same day they go live. Others might take them 3-4 weeks to discover. You can't control when links will be discovered, so trying to stick to some arbitrary number per day is silly anyhow.
  8. Sticking with the more grey/black hat theme of last week, let’s talk about some footprints that can be giving away your private network. Many of you have your own private network or have thought about creating one. No single one of these footprints will necessarily bring the Google Hammer crashing down on your head, but when you start combining them, they can make identifying network sites really easy for Google.

-WordPress. There is no doubt it is a popular platform. It is the most common platform people are using to build their networks on. You do not have to avoid it completely, but if you combine it with these other possible footprints, you might be drawing unwanted attention.

-Text logo/header. Most people just use the default text style header in WordPress. They are not taking the time to design a graphic header instead.

-All posts are on the homepage. Owners of private networks get greedy. They want to squeeze as much link equity out of each site as they can. To that end, they put every one of their posts on the homepage of their sites. As a result, all of their external links are also on the homepage of the sites. This is also common when someone is selling links on a network.

-Sample Page and Hello World post still exist. This is specific to WordPress, but I cannot count how many times I have stumbled on a network site where the default Sample Page and Hello World post are still published. A real site that someone cares about is not going to have those (usually). That is just carelessness.

-About page. Common network sites often do not have an “About” page.

-Privacy page. See above.

-Every post has at least one external link. Now, there are some legitimate websites that follow this pattern. Many news sites have links within just about every news story. However, when you combine this with the majority of the footprints in this section, it is just one more ding against you.

-No social activity. Most network sites have no Facebook, Twitter, Instagram, etc. accounts attached to them. Think about real sites you come across and how few do not have these things.

-Wide range of topics. This is another one where there are some legitimate websites that cover all sorts of topics, but many of the public network sites out there do it in an extremely jumbled way with no real organization or reasoning behind it.

-No internal linking. Outside of a navigation menu and things like “recent posts” widgets, network sites commonly have no internal linking between posts. That’s a huge mistake.

-Having the same external link profile. I have uncovered private network sites that are all linking to the exact same sites. How odd would it be to find 25+ sites that are all linking to the exact same 3-4 websites? Vary up your external link profile. Camouflage it.

-Blocking robots. This one is a little more controversial. I know a lot of people who build private networks like to block spiders from places like Semrush and Ahrefs. To me, this could be a footprint Google could use to identify network sites. There might be a very valid reason for blocking them in some cases, but you show me 20 sites that all link to the same money site and all are blocking bots from common backlink indexes, and I will bet you $1 you just found 20 sites of someone’s network.

-Contact information inside cPanel. Here is one most people are not aware of. When you sign up for a hosting account, by default the email address used for signing up gets plugged into the contact information panel inside of cPanel.
Change that email to something random. This contact information gets published publicly. If you do a search for an email address inside of quotes that you have used in multiple hosting accounts, all of your domains can be pulled up that way. I once found 300+ network sites owned by the same person this way.

Again, for the most part, these footprints on their own are not a big deal (other than the last one), but if you take 7 WordPress sites with text logos, no About or Privacy page, no internal links, all linking to the same sites, and all blocking Semrush, Ahrefs, Moz, and Majestic... well, you likely found yourself someone’s (not so) private network.
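To be clear about what the "blocking robots" footprint looks like, this is the sort of robots.txt block people add, using the user-agent names those backlink crawlers publish (a sketch; names shown for illustration). Repeated across a group of sites that all link to the same money site, it becomes a giveaway:

    User-agent: AhrefsBot
    Disallow: /

    User-agent: SemrushBot
    Disallow: /

    User-agent: MJ12bot
    Disallow: /

    User-agent: rogerbot
    Disallow: /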
  9. Sticking with PPC this week, specifically Google and Bing Ads. This is a technique I have been using for a long time. I came up with a catchy acronym for it when I teach it to people:

A.I.M.: Analyze, Identify, Move

Then several years ago I was reading Perry Marshall’s book on Google Ads. He uses the same method, but calls it Peel & Stick. Admittedly, his name is much more catchy. Call it whatever you want. The concept is the same. The way you implement this is simple.

(STEP 1) First, Analyze the keywords of an ad group. What you are looking for is any keyword that sticks out. We are primarily looking at CTRs here. Note: most people I have encountered who use this method only do this for the top performers. However, you should also use it for keywords that are not performing well.

(STEP 2) Identify keywords that are outperforming or underperforming the rest of the group.

(STEP 3) Move these keywords into their own single keyword ad groups (SKAGs). In the case of overperforming keywords, you are doing this hoping that if you write an ad targeted specifically for this keyword, you will be able to boost its performance even higher. In the case of underperforming keywords, you are moving them into their own ad groups hoping that by putting the keyword by itself with ads specifically suited for it, you can boost its CTR to something more respectable.

Depending on the situation, I will go a step further and create a unique landing page for each of these as well. The thinking being that I can improve the Quality Score, which means potentially higher positions in the ad listings, which can then lead to higher CTRs.

You do not want your ad groups cannibalizing one another. This has become even more of a risk with Google’s expanding definition of exact match. Make sure when you pull a keyword out of an ad group, you add its exact match as a negative keyword to that ad group.
  10. Mike’s Tuesday Tips: Forget everything you think you know about tiered linking.

Tiered linking might be a little more of a grey/black hat area for most of you, but not everything has to be white hat on your own projects (it definitely should be for customers). For 15+ years, most of the gurus out there have been teaching tiered linking completely wrong. Everyone from Matthew Woodward to the guys behind SENuke has been advocating the wrong structures for tiered linking. Even recently, SEO Powersuite published an article on their site doing it wrong. (They changed their article after I called them out on it. Lol.) Basically everyone on Fiverr or any other marketplace you go to is doing tiered linking in a way that makes it highly inefficient and likely to not work at all.

The common way tiered linking is shown is to create somewhere around 4-6 sites on tier 1, which then link to your target page. Tier 1 typically consists of Web 2.0 sites or other decent quality pages that you have a good bit of control over. Tier 2 will then usually consist of either a bunch of Web 2.0 sites or just plain spam (bookmarks, blog comments, forum comments, wikis, etc.). Then tier 3 is more spam.

There are two problems with this structure. First of all, when you put a bunch of sites on tier 2, you are greatly diluting everything that comes behind it. Let’s say you put 20 Web 2.0s on tier 2. Then you create 1,000 links on tier 3. Great, but really you are only pointing 50 links at each tier 2 property, and considering that you are using low quality links, 50 is not going to move the needle at all. (To be honest, with these sorts of links, even 50,000 links are not going to move the needle much, if at all. You need to think bigger.)

But the really significant problem is that your low quality links should never be a part of the actual tiered structure. Things like blog comments and wikis get deleted frequently. Social bookmarks often do not get indexed by search engines. Any break in the structure means that any links flowing to that point are now lost.

The way tiered linking should be done is to keep your actual tiers small and make them sites you have control over - either domains that you own or Web 2.0s (you technically don’t have control over a Web 2.0, but it is unlikely to be taken away from you unless you do something stupid with it). Tier 1 should be one site, which then links to your money page. Tier 2, 2 sites. Tier 3, 4 sites. Tier 4 (if you want to go that far), 8 sites. That’s it. No bookmarks. No blog comments. None of that junk goes into the tiers.

Then you take all the low quality links that you want to use and you point those at the sites in the tiers. What this means is that if a blog comment link gets deleted, it’s not destroying everything in the structure behind it. It doesn’t matter. You are unlikely to ever lose anything that appears in the actual tiered structure.
  11. Think about your outreach targets.

This one may seem very obvious to many of you, but I still see people making this mistake pretty regularly. Outreach for links is one of the most popular forms of link building out there. It’s also one of the most difficult, time consuming, and frustrating.

I did a consulting call with another SEO in early May, and the main thing they wanted to talk about was link building. They really wanted to talk about their outreach approach because they were getting horrible results. We are talking about close to zero percent response rates. Once we started talking about what they were doing, I was not shocked at the results, and they are far from the only people I have seen do this over the years.

I asked them how they were finding their targets for their outreach. They were making a pretty big mistake. They were using different combinations of search terms and search operators, but all of them included their primary keywords. In other words, if they were trying to rank a page on a recipe site for “dessert recipes”, they were reaching out to other sites that already ranked for “dessert recipes.”

Just think about that for a moment. I’m trying to rank my own page about dessert recipes and someone is reaching out to me asking if I will link to their dessert recipe page. Even if I was clueless about SEO, that sounds like a bad idea. Why would I want to lead my traffic to a similar page on another site? If the website owner does understand a little bit about SEO, they are going to know that doing so could potentially help you outrank them. Why would they want to do that?

Your outreach should not be to direct competitors. Many times it shouldn’t even be to sites that are directly in the same niche if you want to have a good success rate. Let’s use an example to illustrate what I am talking about.

Say you are doing outreach for a local mortgage lending company in Philadelphia. The last thing you want to do to find outreach targets is to search for things like “mortgage lenders in Philadelphia” or “FHA loans Philadelphia”. Instead, look at businesses that are related to and in many cases rely on mortgage lenders to operate. Title companies and real estate agents would be good examples. Construction companies that focus on home building would be another good one. Content on their sites that educates their visitors about topics such as FHA loans, improving credit scores to qualify for a mortgage, what to prepare in order to get a pre-approval, or why they should get a pre-approval before home shopping can help position them as knowledgeable within the field and someone a prospect would want to work with. At the same time, title companies, real estate agents, and construction companies are not going to be heavily focused on trying to rank for mortgage-related keywords.

Going back to the recipe example, and specifically the “dessert recipes” example, there are plenty of branches within this market you can look at. For example, there are a bunch of websites (and YouTube channels) devoted entirely to recipes for pressure cookers. These are site owners that probably care about ranking for search terms like “dessert recipes for pressure cookers” but not as much about just “dessert recipes”. You can also find sites that do reviews and tutorials of cooking gadgets like pressure cookers, air fryers, slow cookers, etc. They could be good targets to reach out to.
There are also all kinds of bloggers covering topics like eating healthy, being a stay-at-home mom (or dad), etc. All great outreach targets.

You can get even more creative than this. Think a little outside the box. Remember that the entire website you are reaching out to does not have to be relevant to your site in order for the link to be useful. There are lots of sites popping up that are devoted to providing information about becoming an online streamer. Most of the content on these sites revolves around what equipment to use, how to set up that equipment, setting up a schedule, engaging your audience, finding an audience, etc. Many streamers will stream for 8-12 hours at a time. You could reach out to some of these sites and pitch them the idea of publishing a piece of content about some great bite-size, healthy snacks you can make to eat while streaming.

Be creative. Think outside of the box. You will have a lot more success in your outreach.
  12. These are a few common mistakes I see people make in doing keyword research for SEO.

Mistake 1: Only targeting keywords with X number of searches per month.

I commonly see people say to look for at least 1000 searches per month. Whatever the number is, this thinking ignores two very important factors: buyer intent and what you are selling. I don’t think I need to explain buyer intent to anyone here. What I mean by what you are selling is simple. What if the lifetime value of one customer/conversion is $10,000? Do you really care about search volume then? I’m going after any keyword where the buyer intent is high. I don’t care if it gets 10 searches per month. I just need one conversion each month and that will generate a 6 figure revenue stream. On the other hand, if you are building a made-for-AdSense type site, then yes, search volume is going to matter a whole lot more. In fact, I would probably ignore anything less than 10,000 searches per month as a primary keyword.

Mistake 2: Looking at the number of results in the search index.

I covered this one before in another Tuesday Tip, but it is worth mentioning again. The number of results in the search index has absolutely nothing to do with the level of competition for a keyword. It does not matter if there are 100,000 results or 100,000,000 results. All that matters is the strength of the top 3 pages (or for really high search volume keywords, maybe the top 5). If you can beat #3, then #4 through #100,000,000 do not matter. I don’t care what search operators you use either - inurl:, intitle:, etc. They tell you nothing about the level of competition. The KGR is BS.

Mistake 3: Using competition level from Google Keyword Planner.

Over the years, this might be the mistake I see most often repeated. The competition column in the Google Keyword Planner has nothing to do with the level of competition in organic search. The Keyword Planner is a tool for Google Ads, not SEO. It is telling you the level of competition among Google advertisers. If you ever see a third-party tool with a “Competition” column that ranks keywords as Low, Medium, or High, it is most likely pulling this data from Google. The same thing applies. If anything, and I would still be careful about this, that data can be used to gauge buyer intent. The thinking being that if advertisers are willing to pay for ads, then they are probably making money off those ads. In other words, people doing that search are looking to buy something.

Mistake 4: Not checking the plural or non-plural version of a keyword.

Sometimes, when you change a search term to its plural version, the search intent changes in Google’s eyes and so do the results. Based on this, you might want to create different content on another page to target the plural version, or you may want to not target it at all. For example, when I search ‘insurance agent’ I do get the local search box, but in the organic results I get things like job listings, job descriptions, how to become one, and some local search results mixed in. When I search for ‘insurance agents’, I see nothing but local results on page one. If you just glanced at the search terms, they may seem closely related, but based on what Google is showing, I would not create the same content to target both of those searches.
  13. Mike's Tuesday Tips: I have been amazed over the past few years how many nonprofits I have encountered that were not aware of the Google Ads Grant Program. https://www.google.com/grants/index.html

What could a nonprofit do with $10,000 per month in advertising on Google Ads? Could they get the word out about their cause to more people who might be in need of their services? Could they recruit more volunteers? Could they bring in more donations?

The Google Ads Grant Program provides nonprofits with the opportunity to advertise on Google Ads for free. The program gives qualified organizations up to $10,000 per month to promote their initiatives on Google. This can be a great source for extra traffic to your website to share your cause with your community or the world. It’s an opportunity that every nonprofit should be taking advantage of.

In order to qualify for the Google Ads Grant Program, nonprofits must meet the following criteria:

-Hold current and valid 501(c)(3) status.
-Acknowledge and agree to Google Grant’s required certifications.
-Have a website that defines your nonprofit and its mission.

Hospitals, medical groups, academic institutions, childcare centers, and government organizations are not eligible. (Google does provide a similar program specifically for educational institutions.)

In 2018, Google made changes to the Google Ad Grant Program and there are new guidelines that nonprofits must follow to remain eligible:

-Maintain a minimum 5% CTR account-wide. Accounts that dip below 5% for 2 consecutive months will be suspended.
-Maintain a minimum keyword quality score of 2.
-Have a minimum of 2 ad groups per campaign.
-Have a minimum of 2 ads per ad group.
-Utilize at least 2 sitelink ad extensions.

The primary one that nonprofits struggle with is maintaining the 5% CTR. It is account-wide, so it is always advisable to have a brand campaign running for the nonprofit, as those will usually have a pretty high CTR.
  14. Mike’s Tuesday Tips: These are just a couple of common mistakes I see when auditing Google Ads accounts. This is going to be strictly for search-based text ads. I’m not going to get into shopping ads or display ads here. All of this would apply to Bing Ads campaigns as well.

#1 Campaigns are Where You Control What You Spend

One of the most common mistakes I see is in the way that an account is organized. Always remember that it is at the campaign level where you control the budget. If you are advertising multiple products or services, each one of them should have its own campaign.

The reason for that is simple. Let’s say you are selling men’s clothing: shirts, pants, socks, and shoes. You would have each of those in their own campaign. In fact, you may even divide it up further. One campaign for dress shirts. One campaign for t-shirts. One campaign for running shirts. And so on. Same thing for pants. A campaign for dress pants. A campaign for jeans. A campaign for sweatpants. A campaign for exercise pants.

The mistake I see people frequently make is they make one campaign for shirts, and then divide things up at the ad group level. They create an ad group for dress shirts, another for t-shirts, and another for running shirts. The problem with that organization is simple. What if your highest converting or most profitable shirts are dress shirts and you want to increase your ad spend on them? If you increase the money on your shirts campaign, it is going to be spent across all the ad groups. You have no ability to funnel the money towards a specific ad group.

Depending on your inventory, you may even get more granular than what I suggested above. Instead of just dress shirts, you might want separate campaigns for short-sleeve dress shirts, long-sleeve dress shirts, dress shirts for boys, and campaigns for button-down collars and non-button-down collars. However you do it, always keep in mind that you control the budget at the campaign level, and any products or services that you want complete control over the ad spend on need to have their own campaign.

“But Mike, I can just move products into their own campaign if I need to in the future.” Sure, you can, but you lose the historical data (Quality Scores, conversion data, automated bidding data, ad performance, etc.). You are basically starting from scratch with a new campaign when you do that.

#2 Keep Your Ad Groups Small

The second most common mistake I see is ad groups with tons of keywords. Over the years, I have lost count of how many accounts I have seen where they just had one ad group in a campaign and all their keywords were thrown into it. Generally, your ad groups should not have more than 3-5 keywords each. I don’t go over 10 max.

The reason for this is Quality Scores and Ad Rank. Your Quality Scores are impacted by several factors. The most common and important ones:

-Click-through rate of your ads.
-The relevance of each keyword to its ad group and to its ad text.
-The relevance and quality of your landing page.
-Historical Google Ads account performance.

It is easier to control those relevance factors with tightly constructed ad groups and landing pages. The broader your keywords get, the harder it is to make the ads and landing pages hyper-relevant. This is also good for conversions. It’s easier to convey a consistent message from search to ad to landing page this way.
Today, with the expanding definition of exact match search queries in Google Ads, it is even less necessary to have lots of different keywords in the same ad group.

Why should you be worried about Quality Scores? Well, that’s the next one.

#3 Not Worrying About Quality Scores

This is a huge mistake costing advertisers a ton of money. I have often audited ad accounts and found lots of search terms with Quality Scores of 3 and less. The exact method for calculating Ad Rank is not shared publicly by Google, but Quality Score is a factor. In fact, they use a real-time Quality Score that does not get shared with you and takes into account things like the proximity of the searcher, the time of day, the nature of the search term, etc. What we get to see and work with is a general Quality Score.

Why this matters is that it helps to determine how much you will pay per click. Higher Quality Scores get discounted positions in the advertising auctions. Lower Quality Scores have to pay more than other advertisers in order to show in the same position. A lower Quality Score also means that oftentimes your ad just won’t be shown at all.
  15. This is going to be a long one. Let's talk about some best practices for anchor text in your link building. Some of this is probably going to go against what you may have heard from some of the "gurus" out there, but stick with me.

First, I'm going to start with my own personal golden rule of anchor text. I have been following this for 12-15 years. If you get nothing else out of this post but this one thing, you will be fine. Your anchor text should either describe why you are linking to a page or it should describe what the page you are linking to is about. That's it. It's as simple as that. This goes for both internal and external links. In this way, your anchor text is both SEO-optimized and serves web visitors well.

Never intentionally use naked URLs as anchors. It is one of the absolute dumbest ideas I have ever heard. No, it is not "natural". Not one bit. I think we can all agree that some of the most "natural" links you will find are links within the content of an article. Nobody links with naked URLs in articles. That was not at all common until SEOs started doing it.

If you were writing an article about how to build a PC, you wouldn't say something like:

Looking at their selection and prices, I would not recommend shopping for PC components at hxxp://bestbuy.com.

You would use something like:

Looking at their selection and prices, I would not recommend shopping for "PC components at Best Buy." (Where the words in quotes are the anchor text.)

That is much more "natural" and makes way more sense. You do not need naked URL links to make your link profile look natural.

I have been doing SEO since 2003 or 2004. In all of that time, I can tell you exactly how many times I have intentionally built a link with a naked URL as the anchor text. Zero. Exactly zero. The only time I have done it is when I have been forced to in things like press releases or directory listings that had no other option.

I know many people will look at big brands and see that they have a lot of links with naked URLs as anchor text and believe that is proof that you need them, but what you have to remember about big brands is they do actual PR. A lot of that PR work includes things like press releases, which only have an option for naked URL links. If you really believe that you need links with naked URL anchors and I cannot convince you otherwise, there are plenty of directories out there that only allow those types of links. Save them for those. Do not waste good link opportunities with naked URLs, and especially never do it with internal links.

I did a consulting call yesterday with a business owner and their "SEO" who were looking for some extra help. Many of their internal links were using naked URLs for anchors. One of the dumbest things I have ever seen on a site. When I asked why, the SEO stated it was to make their link profile look natural. If I could have reached through the Zoom call and throat-punched the SEO, I would have.

Should you worry about anchor text ratios? This might be an unpopular opinion, but I'm going to say no. Yes, I know you have probably heard how important anchor text ratios are, again, in making your link profile look "natural". I personally have never found any solid evidence of this. This notion all came about right after Penguin was released. I know many in this group were not involved in SEO back then, so a little history lesson...
When Penguin first released and many people saw websites tanking all over the place, mostly spammers, there was quite a bit of panic in the community. Nobody knew for sure what it had targeted (although the 750,000+ spam warnings that went out just a few months before in Search Console, Webmaster Tools at that time, gave us a pretty good hint). Within a few days there was an article published on a site declaring that they had figured it out, and that one of the main things Penguin was targeting was sites using high ratios of the same anchor text. If I remember correctly, it was a group called Microsite Masters, but don't quote me on that. And I don't think it has the same owners today. This article spread like wildfire. People were so panicked and desperate for answers at that time, most never bothered to question it.

What this article failed to account for was the common practices of most spammers. Back then there were a lot more people using tools like Xrumer, SENuke, Bookmarking Demon, Sick Submitter, etc. to build massive amounts of low quality links. You could shoot out a few hundred thousand links in a matter of days with these tools if you knew what you were doing and had the right setup. If spammers wanted to rank for "lowest rate mortgages", that is what they would use for the anchor text on pretty much all of their links, or very close variations of it. It was not uncommon to see pages ranked with spam that had 75-90% of the same anchors in their link profile.

But was it really the anchor text that got them caught in the Penguin filter, or was it the low quality links they were using? There were tons of sites hit by Penguin with much more varied anchors in their link profiles. I still maintain it was the low quality links Penguin picked up on. The anchor ratios were just a byproduct of how spammers did things back then. I tested it many times using more varied anchor text, and Penguin would still catch the sites eventually.

Let's think of it another way. What if you created the world's best online mortgage calculator and it went viral? What anchors do you think people are going to use to link back to that page? I would bet that 90% or more of the anchors are going to be either "online mortgage calculator" or "best online mortgage calculator". Is Google really going to punish a page for that? It would not make any sense.

Now, all of that being said (like I said at the beginning, stick with me), I would recommend varying up your anchors. I have always done this (it goes back to my anchor text golden rule above), but not because of some worry about anchor text ratios getting my pages in trouble with Google. I do it because I'm never trying to rank a page for just a single keyword. There are usually quite a few variations I am targeting. The other reason I recommend varying up your anchors is that by doing so you are giving search engines more clues as to what the page is about.

But I will never tell anyone that they should shoot for X% of naked URLs, X% of brand name anchors, etc. That just makes zero sense. In fact, doesn't shooting for some artificial percentages actually go against everything about your link profile looking "natural" in the first place?

Again, if you get nothing out of this admittedly long rant, just remember my rule for anchor text. Your anchor text should either describe why you are linking to a page or it should describe what the page you are linking to is about. That's it. If you just do that, you will be fine.
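If it helps to see the golden rule in markup form, here is the difference (URL made up for illustration):

    <!-- naked URL anchor: tells visitors and search engines nothing about the target -->
    <a href="https://www.example.com/pc-components/">https://www.example.com/pc-components/</a>

    <!-- descriptive anchor: says what the page being linked to is about -->
    <a href="https://www.example.com/pc-components/">PC components at Example Store</a>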
  16. Mike's Tuesday Tips: This week, best practices for URL structure.

First, make URLs descriptive and include keywords. Many CMS systems like WordPress will, by default, spit out URLs that look like this:

hxxps://domainname.tld/?p=123

Or this:

hxxps://domainname.tld/2021/03/30/sample-post/

These are no help to search engines or to your visitors. Make them descriptive, like this:

hxxps://domainname.tld/on-page-seo/how-to-identify-core-web-vitals-on-your-webpages/

A visitor or search engine should be able to take one look at your URL and have a good idea of what the page is about.

hxxps://domainname.tld/how-to-win-back-my-girlfriend-that-left-me-for-that-better-looking-guy-at-the-gym-who-is-in-amazing-shape-funny-and-is-highly-successful

Okay, maybe not that descriptive... It’s over, man. She’s not coming back.

Second, don’t be afraid of using categories and subcategories. I know there is a belief out there that shorter URLs are somehow better. This belief came from nonsense correlation studies with zero actual evidence. Search engines have no problem processing longer URLs, and you will find long URLs at the top of SERPs regularly. Categories and subcategories can work like breadcrumbs for search engines, giving them even more clues as to what your page is about. As you can see in the images, search engines will display them in search results, giving potential visitors even more reason to click on your listing.

Lastly, use hyphens to separate words. Do not use underscores. Underscores actually join words together. Do not leave it up to search engines to determine what your page is supposed to be about.

hxxp://domainname.tld/alanismorissetteshits

Is that Alanis Morissette’s hits or Alanis Morissette... something else? One is a long list of award-winning music. The other... is not.
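If you are on WordPress, those first two defaults are controlled by the permalink settings (Settings >> Permalinks). A custom structure built from the standard tags gives you the category + descriptive slug pattern described above; a sketch:

    Plain permalinks (default):   hxxps://domainname.tld/?p=123
    Custom structure setting:     /%category%/%postname%/
    Resulting URL:                hxxps://domainname.tld/on-page-seo/how-to-identify-core-web-vitals-on-your-webpages/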
  17. Today isn't so much a tip as it is sharing a little-known tool that is one of my absolute favorite time savers. I have no idea how more people do not know about this thing. Unfortunately, it is set up to only work in the U.S., so for those of you outside the U.S., I apologize. Maybe someone will create one that works internationally (*cough* Semrush *cough*).

This tool is great for anyone doing local SEO, Google Ads, or Bing Ads campaigns. You can probably find other uses for it too.

http://5minutesite.com/local_keywords.php

Have you ever had a big list of keywords you generated and then needed to add all the local towns or cities to it? First, you have to find all of the towns and cities. Then you either have to do some copy-and-paste work for each town/city and your list of keywords, or use something like Excel to stitch them together. Or worse... you spend an hour typing all this crap out.

The local keyword tool at 5 Minute Site does all of that for you in a flash. Enter the zip code you want to start with, select a radius around that zip code in miles, enter all your keywords, choose whether you want just cities, cities + state abbreviations, cities + state names, or zip codes (outdated, but people used to search that way), choose whether you want your keywords followed by the location, the location followed by the keyword, or both (e.g., "italian restaurants gettysburg" and "gettysburg italian restaurants"), and then hit the submit button.

Some people will say they already do something like this in Excel, but the difference is that with Excel you have to find all the towns yourself. This tool pulls in all the local towns in that radius and adds them to each keyword for you. A huge time saver, both in finding the towns and in generating the list. You can use it to build lists to feed into your favorite keyword tool (Semrush) to get search volumes, or for ad campaigns.

There is also some other functionality. At the bottom, you can choose to have all the keywords wrapped in quotes or brackets if you are going to feed them into PPC campaigns as phrase or exact match. You can also use it to quickly format an existing list of terms as exact match by setting the radius to zero and entering [ and ] in the prefix and suffix fields. You could do the same thing with quotes for phrase match.
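If the tool is ever unavailable, or you need something similar outside the U.S., the combining step is easy to reproduce once you have your own list of towns. Here is a rough Python sketch of the idea; the city list is a placeholder, since the real tool finds towns by zip-code radius for you:

```python
# Rough sketch of the keyword + location combination step
keywords = ["italian restaurants", "pizza delivery"]
cities = ["Gettysburg", "Hanover", "York"]  # placeholder list; the tool pulls these from a zip-code radius
prefix, suffix = "[", "]"  # brackets for exact match; swap in quotes for phrase match

combos = []
for kw in keywords:
    for city in cities:
        combos.append(f"{prefix}{kw} {city.lower()}{suffix}")  # keyword followed by location
        combos.append(f"{prefix}{city.lower()} {kw}{suffix}")  # location followed by keyword

print("\n".join(combos))
```

The output can be pasted straight into a keyword tool or an ads campaign, the same way you would use the 5 Minute Site export.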
  18. I get asked a lot about what books or courses I recommend for someone who wants to learn more about SEO. The best thing I can recommend is to investigate the SERPs yourself. Find high-ranking pages and sites and tear them apart to see what they are doing. The answers to most SEO questions are already in the SERPs. I'm all for testing things, but if you don't have the resources to do so, Google is already giving you the answers in the search results.

This tip came from doing exactly that a long time ago and looking at how Wikipedia structures its pages. You can learn a lot about SEO by reverse engineering Wikipedia. Yes, it is a highly authoritative site, and yes, it gets a ton of backlinks. But what really takes it to the next level is what it does on its pages and how it structures the site.

As I mentioned in the thread last week, link placement on a page matters. There are a bunch of factors that determine the strength of a link, but a general rule of thumb is that link strength flows down the page the way you read a book: left to right and top to bottom. Links at the top of a page are going to be stronger than links at the bottom. You can use CSS to take advantage of this, and that is exactly what Wikipedia does.

Wikipedia is a unique site in that its main navigation is not all that useful for visitors, nor does it provide any real SEO value. Take a look at any Wikipedia page:

https://en.wikipedia.org/wiki/Airplane!

You see the main navigation along the left-hand side, as well as some links at the very top of the page for things like logging in or viewing the revision history. These links are pretty much useless for SEO. So what does Wikipedia do? Take a look at the text-only version of the page in the Google cache. The text-only version is how search engine spiders are really reading the page.

http://webcache.googleusercontent.com/search?q=cache:https://en.wikipedia.org/wiki/Airplane!&strip=1&vwsrc=0

You will notice that Wikipedia uses CSS to lay those links out where it wants them visually, but in the code of the page and in the text-only version, they appear at the end of the page, nowhere near the top. In this way, a stronger emphasis is placed on the links that matter: the internal links in the content.

You can use CSS to do the same thing on your sites. Think about all the links in a site's header that are there for the user experience but have no SEO benefit: Contact Us, logins, click-to-call or tap-to-call phone numbers, sometimes even ToS or Privacy page links. You can put these at the end of the page in the code, then use CSS to position them wherever you want for the user, getting the best of both worlds.
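Here is a stripped-down sketch of the pattern; the class names and layout are my own simplification, not Wikipedia's actual markup. The utility links sit at the bottom of the HTML source, so spiders encounter the in-content links first, but CSS pins the utility bar to the top of the rendered page:

```html
<style>
  /* Utility links are last in the source but pinned to the top visually */
  .utility-nav { position: absolute; top: 0; left: 0; width: 100%; }
  body { padding-top: 3em; } /* leave room for the repositioned bar */
</style>

<main>
  <p>Article content, with the <a href="/related-page">internal links that matter</a>
     near the top of the source code.</p>
</main>

<nav class="utility-nav">
  <a href="/contact-us">Contact Us</a>
  <a href="/login">Login</a>
  <a href="/privacy">Privacy</a>
</nav>
```

Flexbox or grid `order` properties can accomplish the same visual reordering; the point is simply that source order and visual order do not have to match.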
  19. A few months ago I was asked to share a weekly tip in the Semrush Facebook group. Another Facebook group liked it and asked me to do the same there. Well, I figured I might as well start posting them here too, so here is the first one.

An SEO myth that drives me absolutely crazy: "The more results Google shows in its index for a keyword phrase, the more difficult it is to rank for that keyword."

As Michael Scott would say, "NOO GOD! NO. GOD. PLEASE. NO. NO!!! NO!!! NOOOOOO!!!"

It does not matter whether there are 100,000 results in the Google index or 100,000,000. That number tells you absolutely nothing about how difficult it will be to rank for the keyword. Let me tell you why.

You are not competing with 100,000 web pages or 100 million web pages. No matter what the search term is, you are competing with the top 10 ranking pages, really the top 3, because that is where most of the traffic goes. If you can beat the page ranked #3, then #4 through 100 million do not matter.

Think of it like running a marathon. If you are fast enough to finish 3rd, it does not matter whether you are running a local marathon with 1,000 runners or the Marine Corps Marathon in D.C. with over 30,000 participants. You are still going to finish 3rd.

The result count also does not tell you how many pages are "trying to rank for that term," as many people claim it does. Look at the screenshot I included about the elephants. Do you really think 3.8 million pages are trying to rank for that?