Showing results for tags 'mikes tuesday tips'.
  1. If you are not using Google Tag Manager on your websites, you should be. Google Tag Manager is a free tool that lets you manage and deploy marketing and tracking tags on your website without modifying your site's code. It's a one-stop shop for deploying Google Analytics and Facebook Pixels, creating new events, tracking form submissions, and a host of other features. A few of the benefits of using Google Tag Manager:

- No need for a developer. Have you ever wanted to add tracking for a new form on a site, or add a scroll event to a new piece of content to see how far down visitors are reading? You let your developer know, only to be met with an "I will get that done next week" response. Of course, that usually means "I will have to follow up with you next week to find out why you did not do this yet." No more. Once Tag Manager is installed on a site, you can easily do all of this yourself in a few minutes.

- No need to code. If you are a DIY'er, the nice thing about Google Tag Manager is that you really do not need to know how to code, although for some things a bit of familiarity with JavaScript can certainly be helpful. Tag Manager comes with a bunch of preset tags for adding things like Google Analytics, Google Ads Conversion Tracking, Google Ads Remarketing, Google Optimize, HotJar, LinkedIn Insight, and a host of other integrations. The only thing I commonly use that is not already ready to go in Tag Manager is the Facebook Pixel, but there is an option for adding custom code where you just copy and paste your Pixel code into the tag.

- Easy tracking. You can easily set up tags to track button clicks (all buttons or specific buttons), link clicks, form submissions, scroll events, and PDF downloads, and you can even install schema. Note: Google recommends not using GTM for schema, but it does work, and they have honestly never given a great reason for the recommendation. Just be aware it will not work in the schema testing tool; you can copy and paste your schema code directly into the tool to test that it works.

- Test new products without waiting for a developer. Want to test a new product or service on your site but need your developer to install the code? Not anymore. You can just insert it through Google Tag Manager.

- All third-party code is in one place. Need to make a change to one of your tracking scripts? There is no more hunting down the code in your website. Everything is in GTM and easy to find.

- Preview and debug mode. This is maybe one of my favorite features of GTM. It has its own preview and debug mode where you can test things before making them live on your site. It will show you which tags are firing and which are not, so you can quickly and easily get things running correctly.

***What are tags?*** For those of you not familiar with Google Tag Manager, tags are the snippets of code you insert into your site with GTM, most commonly things like tracking pixels. Tags tell GTM what to do. Tags can have multiple triggers and variables. Triggers tell GTM when to do what you want it to do; common triggers include pageviews, clicks, form submissions, and custom events. Variables can put limitations on your triggers and tags. For example, you might create a tag to fire Universal Analytics tracking; the trigger would be a pageview (you can set this to all pages), and the variable would be your Analytics tracking ID. As another example, you might create a Universal Analytics event as a tag, with a form submission as the trigger and a variable identifying a specific form. This is where you can start to get a little more refined in your tracking.

Let's say you have a request-a-quote form on your homepage, but you also have one on a dedicated request-a-quote page. You can set up the same tag as mentioned above, but add a variable to the form submission trigger so it only fires on the homepage. Then set up a separate, identical tag with a variable so it only fires on the request-a-quote page. Now you can set up separate goals for the events these tags create and track where different traffic segments are converting.

There is a lot more you can do with Google Tag Manager; this is just scratching the surface. If you are just getting started with it, I highly, highly recommend the MeasureSchool YouTube channel. You will not find better tutorials on GTM anywhere.
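To make the tag/trigger/variable relationship concrete, here is a tiny Python sketch of the logic. This is purely illustrative; GTM itself is configured in its web UI, and the tag names and conditions below are made up:

```python
# Conceptual model of GTM's tag/trigger/variable relationship -- not real
# GTM code, just an illustration of the firing logic described above.

# Each tag fires when its trigger event matches and all its conditions hold.
tags = [
    {"name": "GA - All Pages", "trigger": "pageview", "conditions": {}},
    {"name": "Quote Form - Homepage", "trigger": "form_submit",
     "conditions": {"page_path": "/"}},
    {"name": "Quote Form - Quote Page", "trigger": "form_submit",
     "conditions": {"page_path": "/request-a-quote"}},
]

def fired_tags(event, context):
    """Return the names of tags whose trigger and conditions match this event."""
    return [
        t["name"] for t in tags
        if t["trigger"] == event
        and all(context.get(k) == v for k, v in t["conditions"].items())
    ]

# A form submission on the homepage fires only the homepage form tag.
print(fired_tags("form_submit", {"page_path": "/"}))
```

The same event (a form submission) fires different tags depending on the page, which is exactly how you separate homepage quote requests from quote-page requests.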
  2. Mike's Tuesday Tips. I do a lot of site audits. There are a ton of things I look for in my audits, but there are 3 things I always check right away and get developers working on. I prioritize these 3 things because they are usually easy to fix or change, and they can result in quick wins.

The first thing I look for is wasted links or links out of position. These are links that do not serve any SEO benefit and are not particularly useful for the user experience either. I am particularly interested in links that are part of the site template; in other words, they generally appear on every page across the entire site. Headers and footers are the most common places you find these.

Without going into a whole essay on all the factors that determine the strength of a link, a simple rule of thumb is that the more links there are on a page, the weaker each link is. Also, links at the top of a page tend to carry more weight than links at the bottom. A page does not pass an equal amount of link equity through each link, but the amount of link equity a page can pass on is divided up among its existing links. If you can cut down the links, you can strengthen the remaining ones.

The first links I want to eliminate are what I consider wasted links. The most common example I come across is links that appear in both the navigation and the footer. An argument can be made that these are better for the user experience, but are they really? Among web visitors today, I think everyone knows they will find key links in the header; I don't think anyone runs to the footer to look for something. A lot of sites use sticky headers, and in those cases especially, you cannot really argue that duplicating links in the header and footer helps the user experience.
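As a rough illustration of that rule of thumb, here is a toy calculation. The equity numbers are invented, and this is not how Google actually computes link value; it just shows why trimming duplicate links strengthens the rest:

```python
# Simplified rule-of-thumb model: a page's passable link equity is split
# across its links, so removing duplicate footer links strengthens the rest.
# The equity value of 1.0 is purely illustrative.

def equity_per_link(page_equity: float, link_count: int) -> float:
    """Equity passed through each link under the even-split rule of thumb."""
    return page_equity / link_count

before = equity_per_link(1.0, 100)  # header links duplicated in the footer
after = equity_per_link(1.0, 80)    # 20 duplicate footer links removed

print(round(after / before, 2))  # each remaining link is 1.25x stronger
```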
If you have a business owner or developer who argues for keeping these links, or you yourself are not sure of the impact of removing them from the footer, set up some tracking. Track what percentage of visitors scroll to the bottom of your pages. Track who clicks on the links in the footer. Now, I know some people will argue that sites like Semrush have links that appear in both the header and footer. Well, guess what? When you build a brand as well-known as Semrush, you can do whatever you want.

Next, I look for links that are out of position. I mentioned above that links at the top of a page are generally stronger than links at the bottom, and I prioritize link placement based on that. The most common example I see of links that are what I call "out of position" is links in the header to privacy or ToS pages. Those pages are not a priority, so stuff them in the footer. In some niches, I would argue that links to social media accounts are also wasted in the header and better served in the footer. Depending on the industry, I might consider doing the same for About pages. In some industries, such as attorneys, financial advisors, and counselors, it is vitally important to build trust with web visitors in order to get a conversion. In others, it may not be as vital; nobody is visiting Amazon's or Best Buy's About page to decide whether to buy something. If you are unsure, again, set up some tracking to see how many people click through to the About page from the navigation. Data does not lie.

The second thing I look for is link depth on primary/important pages. I want to know how many clicks away from the home page these pages are. Depending on the size of the site, they should never be more than 1 or 2 clicks away from the home page. If they are, you need to figure out a way to add links where appropriate to rectify it.
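Measuring click depth is just a breadth-first search over your internal links. Here is a minimal Python sketch using a made-up link graph (a crawler like Screaming Frog reports this for you on real sites):

```python
from collections import deque

# Hypothetical internal-link graph: each page links to the pages listed.
links = {
    "/": ["/services", "/blog"],
    "/services": ["/services/seo"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/services/seo"],
}

def click_depth(start="/"):
    """Breadth-first search: minimum clicks from the home page to each URL."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

print(click_depth()["/services/seo"])  # 2 clicks from the home page
```

Any important page whose depth comes back greater than 2 is a candidate for new internal links.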
I might make some exceptions on a very, very large site, but generally, I try to keep important pages no more than 2 clicks from the home page. The reasoning is that in almost all cases, the home page is the strongest page on a site, and the more clicks away a page is, the less link equity it is feeding off the home page. This is a little more advanced and takes some additional work, but for pages that are 2 or more clicks away, I also take a look at the link path. If the intermediary page has 100 outbound links on it, but there is another page that could also make sense with only 20 outbound links, I'll consider linking through that page instead (or in addition). Again, this strengthens the link equity feeding into the target page.

The third thing I look at for simple adjustments and quick wins is title tags. I know people like to write catchy title tags to improve click-through rates. To me, the best way to improve your click-through rate is to rank higher; you will almost never see a page ranking #2 or #3 getting a better click-through rate than the page ranking #1. I focus on two things for title tags: I want primary keywords near the beginning of the tag, not at the end, and I want the title tag to address the primary search intent of the page. What was the page created to answer? Following those two guidelines will typically give you title tags that search engines like and that also get solid click-through rates.
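Here is a quick sketch of how you might flag title tags whose primary keyword sits too deep in the tag. The 20-character cutoff is an arbitrary choice for illustration, not a known Google threshold:

```python
def keyword_position_ok(title: str, keyword: str, max_start: int = 20) -> bool:
    """True if the primary keyword appears near the front of the title tag.

    max_start is an arbitrary illustrative cutoff, not a Google rule.
    """
    pos = title.lower().find(keyword.lower())
    return 0 <= pos <= max_start  # -1 (not found) also fails

print(keyword_position_ok("Plumber in Austin | Acme", "plumber"))                 # True
print(keyword_position_ok("Acme | Your Trusted Local Austin Plumber", "plumber"))  # False
```

Run something like this over a crawl export and you get an instant list of title tags worth rewriting.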
  3. Artificial Intelligence is on the cusp of a major breakthrough. Artificial intelligence has been around for awhile now, but it has always seemed stuck in one place: simulations. Every time we think that artificial intelligence will finally break through and do something useful, it ends up being just another simulation game or robot vacuum cleaner. But recent advances in machine learning have led to a new kind of artificial intelligence called deep learning. Deep Learning is breaking all sorts of boundaries when it comes to AI and may eventually lead us to true general-purpose machine intelligence, the holy grail of computer science since the 1950s!

So what does this mean for content writers? Well, if you've ever written any sort of content before, then you know how difficult it can be to find the right words. Maybe you're writing a blog post, and all you have is an idea. What do I write now? How do I find the right words? This is a very difficult problem that's plagued writers for decades. Deep learning and neural networks have technology that can help solve this conundrum.

Neural networks are computer systems that can learn from examples just like we do. Scientists have found ways to create "deeper" neural networks with more processing power by building them out of multiple layers. The first layer is where the data goes in, and then each successive layer does something different with the data, until finally you end up with an output, sort of like a "function": you pass it the data and get an output. The best part is that this whole process can be automated! You just feed a neural network thousands upon thousands of data sets and it learns what to do on its own. That means as soon as your data is in there, you can start getting results back. If you want to see some great examples of neural networks in action, check out what Google's DeepMind has been up to.
As for AI researchers, their biggest obstacle now is figuring out how to give a neural network the ability to learn on its own and do something different from other very similar data sets. The "general intelligence" that we are all hoping for has always eluded researchers, and it's not clear how close we are to achieving something like that yet. Still, the progress in deep learning is promising and getting all sorts of people excited.

But how exactly will this affect content writers? Basically, any time you have a series of steps that are predictable, a neural network can be used to figure out what should come next. For instance, if you're writing some sort of science article that has a formula for calculating the acceleration of an object based on its mass and velocity, you can pass your formula to the neural network and it can figure out how to restate the equation in English that's easy for a reader to understand. And all the best content writers will have to be ready for this new technology by developing their own deep learning systems for helping them write. But even if you end up not writing with machines, they've already become a part of your world and we need to learn how to interact with them. The entire field of machine learning and AI is growing incredibly quickly, and this may be the one to get excited about. Content writers will likely play an important part in all of this, whether it's by training machines how to write or being the ones who try to understand what the machines are already writing. As long as there's content worth reading around you, there will be a place for an intelligent person like you!

****** I did not write one word of what you see above other than the title, and it took 54 seconds to generate. That is 626 words. I will not argue that it is perfect by any means, or even that it is outstanding content. However, if you don't see how taking something like that and either fleshing it out yourself or giving it to a writer to do so can save you a ton of time and/or money, you should maybe wake up and start paying attention to what is going on with AI and content generation. I should also mention that I gave the program no instructions other than the title. I could get better results with a little more effort and a few more guide rails for it to follow.
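As an aside, the layered "function" idea the generated text describes can be sketched in a few lines of Python. This toy forward pass uses arbitrary, made-up weights; a real network learns its weights from data:

```python
# Minimal illustration of the layered idea: data goes in, each layer
# transforms it, and an output comes out. All weights here are invented.

def dense(inputs, weights, bias):
    """One unit: weighted sum of inputs plus bias, then a ReLU activation."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return max(0.0, total)

def tiny_network(x1, x2):
    hidden1 = dense([x1, x2], [0.5, -0.2], 0.1)        # hidden layer, unit 1
    hidden2 = dense([x1, x2], [-0.3, 0.8], 0.0)        # hidden layer, unit 2
    return dense([hidden1, hidden2], [1.0, 1.0], 0.0)  # output layer

print(tiny_network(1.0, 1.0))  # roughly 0.9
```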
  4. This one is going to be quick and simple. This is another one of those myths that drive me nuts. You will sometimes see it recommended that you should wait several months before starting to build links to a new site, because building links right away would look "suspicious" to Google, or whatever nonsense people want to say about it. In 15+ years of doing this, I have never waited a single day to start link building activities on a new site.

Google has no idea what kind of offline, and in some cases online, promotion a website owner has done for their site. Maybe I have an email list of 50,000 subscribers I marketed it to. Maybe I have been running TV and radio ads 24/7. Perhaps I sent out 10,000 mailers in my area to promote the site. Maybe it's all over a private Facebook group or forum. The point is that just because a site is new does not mean it doesn't have the legitimate potential to earn links.

Also, why wouldn't a new site owner be promoting their website? If you opened a new restaurant in town, I'm sure you would put a sign out front on day 1 (even before you opened for business). You would be running ads online and offline to get the word out. Links are no different, and there is absolutely no reason to let a site sit before you start promoting it. I think that SEOs, and many marketers in general, have this idea burned into their heads that Google is out to get everyone. Google spends way more resources looking for reasons TO rank a website than looking for reasons NOT to rank a website.
  5. One of the most useful, yet underutilized, tools available to SEOs is Google Search Console. In talking to SEOs and business owners, I find a lot of them set it up, submit their sitemaps, and only check it when they get a message from Google about a coverage issue or some other error. There is a ton of data you can gather inside Search Console if you spend some time with it. Here is a simple way I like to use Search Console to uncover additional opportunities.

SEARCH TERM / PAGE AUDITS

I often look at Performance >> Search Results >> Queries and add a filter at the top for a specific URL. This lets me see the keywords that page has received impressions and clicks for during the selected timeframe. I'll add in average rankings and export this to a spreadsheet. Sometimes you will be amazed at how many different search terms a given page is ranking for. I do this periodically on client sites for main hub pages and whenever I do site audits.

Generally, I'm looking for 2 things: search terms I do not want to target on this page because there is a better page on the site for them, and new search term opportunities. Once the data is in a spreadsheet, you can easily sort it however you want. I like to look for search queries with high impressions, low clicks, and a top 20 average ranking. If the search intent matches, these are terms we can optimize for to bring in more traffic, and since they are already in the top 20, they can often be quick wins. You may also find search terms that fit these criteria but are really better served on a new page or another existing page. Want to narrow things down further and only include queries containing a certain word or phrase? You can. Just add a query filter in addition to the URL filter at the top.
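Once the export is loaded into a spreadsheet or a script, the filter described above is straightforward. A small Python sketch with made-up rows and arbitrary thresholds (tune them to your site):

```python
# Hypothetical rows from a Search Console "Queries" export for one URL.
rows = [
    {"query": "gtm tutorial", "impressions": 5400, "clicks": 31, "position": 14.2},
    {"query": "what is a tag", "impressions": 900, "clicks": 120, "position": 3.1},
    {"query": "gtm vs analytics", "impressions": 2100, "clicks": 8, "position": 18.7},
]

def quick_wins(rows, min_impressions=1000, max_ctr=0.02, max_position=20):
    """High impressions, low CTR, already in the top 20: likely quick wins."""
    return [
        r["query"] for r in rows
        if r["impressions"] >= min_impressions
        and r["clicks"] / r["impressions"] <= max_ctr
        and r["position"] <= max_position
    ]

print(quick_wins(rows))
```

Here "what is a tag" is filtered out (it already converts impressions to clicks), while the two high-impression, low-click queries surface as optimization targets.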
  6. One of the key steps of any good keyword research that often gets overlooked is to actually spend some time in the SERPs. You can use all the tools you want to research search volumes, competition, etc., but none of them will give you a great picture of the search intent for a keyword or what is really going on in the SERP, and they won't tell you what kind of results Google is showing. Without looking at the actual SERPs and the results on page one, it's tough to get a good picture of the actual problem users are trying to solve when searching a particular query. Whenever you pick a new keyword to target, or decide to go back and refresh an old piece of content, take a look at what kind of results Google is showing. You want to pay attention to a few things.

***Look for Google featured snippets. If Google is displaying a featured snippet at the top of the search results, take a close look at the query and the snippet. Does it fully answer the query? If it does, chances are the results below, even the first organic position, are not getting much traffic. Keep this in mind: if you do not replace that featured snippet, the keyword is probably not going to bring much traffic. A simple example would be a search like "How tall is Mount Everest"; Google immediately shows the answer of 29,032 feet. That is a simplistic example, but there are plenty of other queries to think about like this. Someone might search for something like "What size X should I use for Y". If the answer is simple and doesn't require any additional explanation, a featured snippet is going to solve it, and very few results below it are going to get clicks.

***Pay attention to what type of content is ranking. Is it product listings? Tutorials? Lists? Service pages? This will give you a good indication of what Google has decided is the most common search intent for the query and what it is favoring in the results.

***What kind of sites are ranking? Is it mostly news sites? Local businesses? Directories? Blog pages? This can also give you clues about what Google has determined the search intent to be. For example, if you look at a search result and see a lot of directories, that can be an indicator that the intent is to find a list of places that offer a solution. This is common for a lot of local searches.

***How far do you have to scroll before you see the first organic search result? This is often the answer when people are confused about why their high-ranking page is not bringing them more traffic. If you see a featured snippet, followed by ads, followed by a 'people also ask' box, and then the organic results, keep that in mind. Even if you rank #1 in the organic results, you are probably not going to get the usual 45-70% of clicks you could expect from a #1 ranking. All of those other things are going to siphon away clicks.

***Do this on both a desktop and mobile device. Today we are obsessed with the user experience on our websites, but few of us take the time to check out the user experience of the SERPs. Try to look at a SERP without bias. What are the common themes in the search results? How could your page better serve searchers? Again, forget about your own bias and what you think might be best. How can your page better fit into what Google thinks is best?

***Lastly, this process can be useful when trying to identify ranking drops. Sometimes the user intent for a search query changes over time, or, more precisely, Google's interpretation of it changes based on new data. If you see traffic drops, it can be because Google is suddenly showing a featured snippet where it wasn't before, or because a 'people also ask' box popped up in the results. If you see ranking drops, inspect the SERPs compared to what they looked like previously. Maybe over time the results have shifted from tutorials to more products and services, or the search intent has shifted in another way. Ranking and traffic drops are not always because you did something wrong or because competitors overtook you; sometimes they are because new 'features' are showing up ahead of you or the search intent is shifting. Some time in the SERPs can make your job easier.
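A back-of-the-envelope way to reason about the siphoning effect: discount your expected click-through rate by how much the SERP features above you absorb. The CTR and discount figures below are placeholders for illustration, not published numbers:

```python
# Rough traffic estimate for a ranking. The 0.45 CTR and the 0.5 discount
# are illustrative guesses, not measured values.

def estimated_clicks(volume, rank_ctr, serp_feature_discount=0.0):
    """Monthly clicks = search volume x CTR for the rank, minus SERP-feature loss."""
    return volume * rank_ctr * (1 - serp_feature_discount)

clean_serp = estimated_clicks(1000, 0.45)       # "clean" #1 ranking
busy_serp = estimated_clicks(1000, 0.45, 0.5)   # snippet + ads + PAA above you

print(clean_serp, busy_serp)
```

The same #1 ranking can yield very different traffic depending on what Google stacks above it, which is why a rank tracker alone can mislead you.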
  7. I am often asked what tools I use for SEO. There are a lot, and some of them are situation based, but this is a quick list of the tools I use frequently on basically every project.

Semrush - Obviously. This is one of only two tools I log into every single day. I use it for assisting with site audits, competitor organic and paid research, keyword research, and content generation/marketing.

Next, let's get the Google tools out of the way:

Google Analytics
Google Search Console
Google Tag Manager
Google Optimize - If you are not familiar with this one, I use it primarily for A/B conversion testing.
Google Data Studio - I just want to mention that this one is underutilized and underappreciated. With Data Studio you can pull data from Analytics, Search Console, and Google Ads and combine and present it how you want. You can also much more easily segment the data. If you are working with clients, you can make all that data more appealing to the eye and emphasize what you want them to focus on (like leads, sales, etc.).
Google Chrome Developer Tools - I have shown in other tips how I use this for identifying things like LCP and CLS issues. That is just scratching the surface of what you can do with it.

Auditing / Monitoring Tools:

Screaming Frog - I know a lot of people like Sitebulb instead. They do pretty much the same thing, but Screaming Frog does it faster and is less resource intensive.
ContentKing App - This has recently replaced DeepCrawl for me. It has great monitoring features, but also lets me easily dig through and segment pages. Too much to cover here, but a great tool.

Other tools:

Jarvis.AI - There are plenty of AI writing tools out there, and it seems like new ones pop up every day now. I prefer this one. I use it to boost our content generation. I do not create content in Jarvis and post it straight to websites; everything gets edited first, but it really speeds things up versus writing from scratch. I do not use it just for writing articles. I use it to help generate ad copy and headlines, and to help generate title tags and H tags in articles. I don't often use exactly what it generates for these, but I will have it write 20-25 variations of a title tag and use those to create something myself. I'm constantly finding new ways to use it.

Frase - Love this for generating content briefs for writers and picking apart the data, headings, questions, etc. used in top-ranking sites for queries. Great tool for writing content. Its AI writer is not great right now, but could improve over time.

Answer the Public - Great tool for generating additional ideas for content and keyword research.

SEOPress - Because I know someone is going to ask which WordPress SEO plugin I prefer. I'm not a fan of Yoast, and anyone using it is doing so at their own peril, or the peril of their clients. Yoast has a long history of releasing updates with bugs, sometimes site- and SEO-crippling ones. It's not just the bugs that bother me, though; it is the way the company treats its customers and reacts to those issues. SEOPress basically does everything Yoast does, but is less bloated and has never F'd up one of my sites. (On a side note, except for security patches, when it comes to WordPress you should never update WordPress versions or plugins when updates are first released, even in staging areas. Bugs can creep up that you do not catch in staging. Let everyone else be the guinea pigs for a few weeks.)

Monday - Although not directly an SEO tool, this is the other tool I log into every single day. I often get asked how I keep on top of everything and manage clients without anything falling through the cracks. This is how. There are plenty of similar tools out there, and even if you are working solo, I would recommend using one. ClickUp and Asana would be my next choices.

Octopus.do - Something I have been using a bit more recently on smaller sites. It helps me visualize the site structure changes I want to make with clients.

There are plenty of other great tools out there, but these are the ones I use most frequently.
  8. This is a short one (cue the 'That's what she said' jokes). Condense common resources.

I was auditing a site once about addiction and addiction rehab. I found 10 posts that mentioned the same 4 addiction rehab/help sites and linked to all 4. Something like: "If you or someone you know suffers from addiction, or you think might be suffering from addiction, check out xyzaddictioncenter.com. There is also an addiction hotline at addictionhotline.com…" And so on. They linked to the same 4 places with the same kind of info.

The problem is that every time you do that, you are passing link equity from those pages to an external site. Instead, they could have created a resource-type page and directed people there for more information and help. Now, instead of having 4 external links on 10 different pages, there are 4 external links on 1 page, and each of those 10 pages has 1 internal link pointing to the resource page.

If you find that your site links to a lot of the same places multiple times across a variety of pages, consider creating some kind of resource or FAQ page that you can direct those links to instead, and cut down the total links on some of your pages.
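The before/after arithmetic is simple. A quick sketch with the numbers from the example above:

```python
# Before/after count for the example above: 10 posts each linking to the
# same 4 external resources, versus routing through one resource page.
posts = 10
external_links_each = 4

external_before = posts * external_links_each  # 40 external link placements
external_after = external_links_each           # only the resource page links out
internal_added = posts                         # each post links to the hub once

print(external_before, external_after, internal_added)
```

Forty external link placements become four, and each post trades four outbound links for a single internal one.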
  9. Spend less of your time, energy, and resources worrying about what Google may or may not do to you. I get messages all the time from people worrying about some what-if scenario, usually about invoking some Google penalty. If you are an SEO or a business owner who does your own SEO, I want you to write this down. I'm serious. Write it down.

*** Google spends way more resources looking for reasons TO rank your website than looking for reasons NOT to rank your website. ***

Many SEOs spend too much time worrying that Google is out to get them or is just dying to slap them with some sort of penalty. If you have good content that answers a particular search query, Google wants to show your page. Never forget that. Your goal should be to provide them with reasons to show your page.
  10. Ever wonder if a red “buy now” button will convert better on your page than a yellow one? Are you curious if a different call to action would drive more people into your funnel than the one you are using? I think most people understand the value of A/B testing to improve conversions, but I’m always amazed at how few people actually do it. I think some people think it is too hard to implement. Others think it will be expensive to get their developer to set it up. Few people realize that Google actually gives you a completely free tool to set up A/B testing that is simple to use. It’s called Google Optimize, and it has been right in your face all this time. When you are in Analytics (or Tag Manager and Data Studio) and hit the drop down menu to switch accounts, you will see icons for Analytics, Tag Manager, Optimize, and Data Studio. There is also an icon for Google Surveys, but I never use that. Just like Google Analytics, Tag Manager, and Data Studio, Google Optimize is 100% free. The interface is very similar to Tag Manager. There is a script that needs to be installed in the header of any page you want it to run on, similar to Analytics and Tag Manager. GTM has a tag already set up for Google Optimize that you can use. When you create your first “Experience” (as Optimize calls them) you will be given a few options to choose from. The most commonly used is the A/B test, but you can also do a Multivariate test, Redirect test, Personalization, or Banner template. The Banner template lets you add a notification to the top of your site. The Personalization allows you to target certain visitors. Want to show something to Facebook visitors that organic search visitors do not see? This will allow you to do that. To set up your first A/B test, you will need to select the page you want to run your test on, connect your Google Analytics account to Optimize, and install a Chrome extension. 
The extension allows you to use an editor on your page to identify what you want to change for your testing. Add your first variant, hit the Edit button and you are off and running. This will take you to the page editor. It works very similar to a lot of the common page builders you will find out there on Wordpress or other platforms. You can select an element and change its size, color, background, font, etc. Pretty much anything as far as the styling goes, you can change. You can move segments around the page. You can also change content. Want to try a different heading? Instead of using “Get a Quote” on a button, do you want to test “Get Started”? You can change the words on your page however you want. Once you are done, hit “Save” and then “Done”. That takes you back into Google Optimize. You will see you have an option to change the weightings of when your experiment runs. Most of the time, you would use a 50/50 test, but let’s say you have a page that converts really well and makes a significant portion of your income. You may not want to jeopardize that, so you could have 75% of your visitors see the original page and only 25% see the test version. You have options for page targeting where you can run the test on more than one page. Next you will need to connect the test to a goal in Google Analytics. Going even deeper, there are audience options you can choose. This includes simple things like targeting people by device, new vs. returning visitors, using a site visitors geography, browsers, etc. You can also get a little more advanced and only run the experiment on visitors that come to the page by a UTM parameter. Want to target people coming from 1 particular Google Ads group inside a campaign, but not run the same experiment on other ad groups inside the same campaign that use the same landing page? Use UTM parameters. Or maybe you want to run a test on people who visit your homepage from Google My Business versus organic searches. 
Again, use UTM parameters to do it. Google Optimize gives you a really easy way to run A/B tests on your pages and improve your conversions. Once you see how simple it is, running tests all the time can become kind of addictive.
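If you want to target experiments by traffic source as described above, the tagged URLs have to be built consistently. Here is a minimal sketch of building UTM-tagged URLs; the domain, campaign, and ad group names are made up for illustration:

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign, content=None):
    """Append standard UTM parameters to a landing-page URL."""
    params = {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    if content:
        params["utm_content"] = content  # e.g. identify the specific ad group
    return base_url + "?" + urlencode(params)

# Hypothetical example: one ad group gets its own tagged URL so an
# Optimize audience rule can single out just those visitors.
url = tag_url("https://example.com/landing", "google", "cpc",
              "spring-sale", content="adgroup-a")
# → https://example.com/landing?utm_source=google&utm_medium=cpc&utm_campaign=spring-sale&utm_content=adgroup-a
```

The same page with a different `utm_content` (or `utm_source=gmb` for your Google My Business link) gives you the clean source split the experiment targeting needs.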
  11. Today isn't so much a tip as something to think about in terms of where you are devoting your time and resources. This is an unpopular opinion that will probably go over like a lead balloon, but here goes: I think SEOs are full of crap when it comes to pagespeed and conversion rates.

I noticed something recently. I'm in a lot of groups and forums around the web dedicated strictly to conversion rate optimization. That's all they talk about. You know what never gets mentioned when someone asks for help in improving their conversion rates? Pagespeed. I do not mean rarely. I mean pretty much never. All these industry experts on conversion rates almost never bring up pagespeed. And when they do, they usually just say something like, "Keep your pages loading in under 4 seconds..." They never quote that, "For every .5 seconds you decrease your pagespeed you will see an increase in conversions by XXX%," nonsense you see mentioned all over SEO groups and forums.

I thought maybe it was just my own confirmation bias, so I did a search for tips on improving conversion rates and took a look at the top 10 results. Only 3 of them mentioned pagespeed. One was Kinsta, which I hardly see as an expert on the topic. Kinsta is also probably hugely biased: they want you to believe pagespeed is vital to everything and that they offer fast hosting. One was HubSpot, which listed it 14th among 19 tips. The other was VWO, which is a service for A/B testing, so conversion rates are kind of in their wheelhouse. They listed it 10th out of 11 tips. Hardly a ringing endorsement for the importance of pagespeed in conversion rates.

To me it seems it is only SEOs who believe it is so vital, and I think that is largely because they are looking to cling to anything that justifies all the hours many of them are billing clients to speed up pages. They certainly aren't seeing any benefit in the SERPs. I'm not saying it is bad to decrease load times on your site.
It certainly is not going to have a negative impact on your conversions. I just think its impact on conversion rates is grossly exaggerated. Also, before someone says it, I'm not talking about extreme examples. Yes, if your pages are currently loading in 20 seconds and you decrease that to 3 seconds or less, you will see a dramatic difference in all user metrics. I'm not debating that at all. I'm talking about the cases of those desperately trying to trim an extra 0.2 seconds off their pages or push that PSI score from a 93 to a 99 or 100.
  12. Mike’s Tuesday Tips: This one may seem basic to some of you, but I think these are two tools that are under-utilized in the SEO community and would allow people to answer a lot of their basic questions themselves. They are the Google cache and the text-only version of webpages. I use both pretty frequently when doing audits and trying to diagnose problems with webpages.

How do you access them? It’s pretty simple. If a page has been indexed and cached by Google, you simply type cache: in front of the URL in your browser’s address bar (assuming Google is your default search engine), or you will see an option for it in the search results. There used to be a link that simply said “Cached”. Now you have to hit the 3 dots beside the search result title, and you will see a button labeled “Cached”. Click that and it takes you to the Google cache version of the page. Once you are looking at the cached version of a page, you can get to the text-only version by clicking the link at the top of the page.

The cached version is good for making sure that everything on your page is visible to Google, and the text-only version is good for seeing how things really lay out to search engine spiders before CSS rearranges the page.

What kind of things can you identify and find out about a page with these two options? Well, how many times have you seen someone ask if Google can see text hidden behind a “read more” button or in accordion tabs? Go look at the cache and/or text-only version of the page, or a page using similar code. Right there is your answer. Want to know if Google is indexing comments on your webpages? Look at the text-only version. If you are using Disqus, you’ll see the comments do not appear. Remember a tip I gave a while back about how Wikipedia uses CSS to show their menu prominently while it is really buried at the bottom of the page code? Looking at the text-only version of pages is how I first discovered they were doing that.
Want to see if there are hidden links on a page or other hidden elements? They can be hidden in the browser version, but they cannot be hidden in the text-only version of the page. It’s how I discovered that Elementor’s menu widget is trash. Rather than using CSS and/or JavaScript to properly display your menu across desktop and mobile devices, Elementor’s menu widget creates two menus: one for desktop and one for mobile. On desktop devices it hides the mobile one, and on mobile devices it hides the desktop version.

I’m including screenshots of a site built in Elementor (and featured on their site) called The Perfect Loaf to illustrate what I mean. You can see how the menu looks on desktop and mobile devices. Then look at the text-only version. The menu is duplicated. In this case, that is 15 extra links on every page, weakening their internal link structure. Now imagine a site with 40-50 links in its navigation.

You can use the cache and text-only versions of pages to answer a lot of questions about how search engines view webpages.
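If you check a lot of pages, typing the operator out gets old. Here is a small sketch that builds the cache URLs directly. Note these URL patterns are observed behavior, not officially documented by Google (the `strip=1` parameter is what the text-only link toggles), so treat them as an assumption that could change:

```python
from urllib.parse import quote

def cache_url(page_url, text_only=False):
    """Build the Google cache URL for a page (observed pattern, not an official API)."""
    base = ("https://webcache.googleusercontent.com/search?q=cache:"
            + quote(page_url, safe=":/"))
    if text_only:
        base += "&strip=1&vwsrc=0"  # strip=1 drops CSS/images: the text-only view
    return base

# Paste the result into your browser to jump straight to the cached copy.
print(cache_url("https://example.com/post", text_only=True))
```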
  13. Mike’s Tuesday Tips: Last week was about identifying CLS, and I mentioned how to identify LCP (Largest Contentful Paint) on your pages. This week, we are going to identify common issues that cause a slow LCP. LCP is probably the simplest of the Core Web Vitals to deal with.

Having addressed this on a few hundred pages over the past year, I can tell you that the most common cause of a slow LCP loading time is a slow server response. If you are on a shared hosting environment, especially one that has been oversold (and often overhyped) - I’m looking at you, SiteGround - you can tweak things all you want, but there is only so much speed you can squeeze out of the server. Want to figure out if your server has been oversold? Take a look at how many domains are hosted on your same server using this tool: https://viewdns.info/reverseip/ There are tons of other tools out there like this. If you really want to investigate, you can run all the sites through something like Semrush and take a look at their traffic estimates. You may have one or two sites on the same server getting tons of traffic and hogging a huge amount of resources.

If you really care about pagespeed, one of the best things you can do is get away from shared web hosting. Before someone comments about how they got good scores with shared hosting on Core Web Vitals, Pagespeed Insights, GTMetrix, or any other speed test you want to mention… sure, I believe you. However, think of just how much better your pages would load if you moved to a decent VPS or dedicated hosting solution.

There is an example below from a new client I just started working with. The site is hosted on WPEngine, which is supposed to be one of the better hosts out there. We are getting server response times ranging from 1.5-2.2 seconds. Fixing that alone, without doing any other tweaks, will bring their LCP score in line with Google’s standards.
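If you want a quick spot-check of server response time without opening a speed-test tool, a sketch like this works. Keep in mind it lumps DNS, TLS, and time-to-first-byte together, so treat it as a ballpark figure, not a lab measurement:

```python
import time
import urllib.request

def response_time(url, tries=3):
    """Rough time until the first byte of the response arrives, best of N tries."""
    timings = []
    for _ in range(tries):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read(1)  # stop as soon as the first byte is in hand
        timings.append(time.perf_counter() - start)
    return min(timings)  # best-of-N filters out random network noise

# e.g. response_time("https://example.com/") -> seconds as a float;
# anything consistently over ~0.5s is worth investigating.
```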
Before we go crucifying WPEngine, there is also a possibility that the problem is on the development side of the site design. There might be some processes being called that have to complete server-side before anything loads. With a lot of dynamic content, that can happen sometimes. A database cleanup may also shave some of that response time. If you do not want to switch hosts, another solution that may work is a content delivery network (CDN). Your mileage may vary with these. For example, I have put some sites on Cloudflare and seen drastic improvements. I have used it on other sites and it has actually slowed them down.

The second big issue causing slow LCP load times is render-blocking JavaScript and CSS. A browser will pause parsing HTML when it encounters external stylesheets and synchronous JavaScript tags. To speed up the loading of your LCP, you want to defer any non-critical JavaScript and CSS files. You should also minify and compress your CSS and JavaScript files. For any styles that are critical to your LCP and/or above-the-fold content, you can inline them, which means you place the style elements directly in the <head> of the page.

The next thing I frequently see slowing down LCP times is images and/or videos. Make sure that you optimize and compress your images. Browsers load full images before adjusting them to the proper size to be viewed. If you are using an image that is 2000 x 1330 pixels, but it is only viewed at 600 x 400 in your page design, the browser is going to load the full 2000 x 1330 image. Before you bother with compressing anything, make sure you are using appropriate image sizes. Resize the image and then upload it back to your server. You can also lower the image quality. Load it up in Photoshop or something like GIMP and change the resolution by adjusting the pixels/inch.
Many times you will find that a lower resolution still looks great on your web page, and it will be a much smaller file. One little trick I sometimes use, if changing the resolution causes the quality to take a noticeable dip, is to make it part of the design: I will toss a dark overlay over the image and/or make it slightly blurry. I’ll do this if the image is being used as the background for a section. It helps the text over it pop anyhow.

If you are using WordPress, there are a lot of plugins and options out there for compressing images. There are also options for serving next-gen image formats, mainly WebP. WebP images are not supported in all browsers, so make sure you have JPG or some other format as a backup to display. If you are using a video or slideshow, stop it. They are not a great user experience on mobile devices.

Lastly, use a tool like GTMetrix to investigate the loading order of elements on your page. I hate GTMetrix scores and the fact that they default to desktop loading; GTMetrix is pretty useless for everything other than its waterfall display. There are other tools that have waterfall displays, but I find GTMetrix the easiest to work with. Take a look at what is loading before your LCP element. Are those things necessary? Is there anything that can be deferred or just eliminated? I’ve shaved significant time off of LCP scores just by getting rid of Google Fonts. Google Fonts are great, but they have to load from Google’s servers. Then if you use different font weights, that’s an extra library to load.

Another common thing that slows down pages is icon libraries like Font Awesome. A lot of page builders like Elementor will give you the option to use icons from Font Awesome, Themify, or Ionicons. The problem is that in order to use just one icon, the entire library is loaded. Use a single image instead. Some builders, like Oxygen and Bricks, will let you use your own SVG files as icons.
I think Elementor just added that option recently too. The advantage of using your own is that the browser only has to load what you are using and not an entire library of icons. I see this happen a lot with local business websites. They often like to use one of those phone icons beside their phone number in the header. Sometimes an email icon beside an email address too or the Google Places pin beside an address. Because it loads in the header, this usually will slow down the LCP time. Use your own icons instead and speed it up.
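To see why resizing before upload matters, here is the back-of-the-envelope arithmetic for the 2000 x 1330 image example from above:

```python
# The 2000x1330 image from the example above, displayed at 600x400:
full = 2000 * 1330   # 2,660,000 pixels downloaded by the browser
shown = 600 * 400    #   240,000 pixels actually needed by the design

print(round(full / shown, 1))  # ~11x more pixel data than the layout uses
```

Roughly 11 times the pixel data the layout needs gets shipped over the wire, before compression even enters the picture. Resizing to the display dimensions is the single cheapest win here.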
  14. How to Identify Cumulative Layout Shifts

With Core Web Vitals upon us, people are scrambling to optimize their sites. Mostly a waste of time, but it is what it is. Of the 3 Core Web Vitals, cumulative layout shift (CLS) is the one I have seen people having the most trouble identifying. Obviously, seeing your score is easy in Pagespeed Insights, Lighthouse, or web.dev, but how do you identify what is actually causing any shifts on your pages? It’s pretty simple actually. You are going to want to use Chrome’s Developer Tools.

-Open the page you want to check in Chrome.
-Click the dropdown menu on Chrome.
-Go to More tools >> Developer tools

This should open you up into a screen that shows Lighthouse and a bunch of other options along a menu at the top.

-Click on the Performance tab at the top.
-Then click on what looks like a refresh icon.
-Let it load, and you will end up with something like the attached screenshot.

If you have cumulative layout shifts happening, they will appear as red bars under the Experience row. You can zoom in on them a little bit by hovering your mouse over that area and using the mouse scroll wheel. (At least on PCs. I have no idea how to do it on inferior Mac machines.) You can also click and drag the whole thing around if things start moving off the screen as you zoom in. If you hover over the red bars, the website pane on the left will highlight where that shift is happening.

By the way, while you are here, you can also identify your Largest Contentful Paint (LCP) element. In the Timings row, you will see a black box labeled LCP. Hover over it and it will highlight your LCP element.
  15. Mike’s Tuesday Tips: When should you disavow links?

Back in 2012, Google shook up the link building market with two massive actions. First, there was an enormous push to take down popular public networks. Anyone remember Build My Rank or Authority Link Network? Then in April, they unleashed the Penguin algorithm filter and sent many SEOs running around with their hair on fire.

The Penguin algorithm was harsh. Probably too harsh, to be honest. It weaponized link building. It was risky, but you could potentially get competitors penalized by throwing spammy links at them. I say it was risky because Google’s algorithm was far from perfect. You could just as easily strengthen their position as harm it. While the Penguin algorithm did a great job in many cases of punishing sites using low quality links, there were also a lot of innocent sites caught in the mix, or sites that had hired an SEO without understanding that the SEO used spammy link building. As a result, Google released its Disavow Tool in October of that year.

Fun fact: Did you know that Bing actually released a Disavow Tool before Google? Yep. Bing’s came out in September of 2012.

Since its release, people have debated its use. Early on, many of us cautioned against using it. Google generally does not tell you which links they have flagged as bad, except in some cases of manual penalties where they may give you a few examples. Overuse can actually hurt your rankings. (Some of us also suggested caution because we saw it as Google crowdsourcing to fix a problem they couldn’t figure out on their own. Basically they were saying, “Hey, why don’t you tell us which links are bad that you have been building? In exchange, here is a get out of jail free card.” I think our concerns were valid. A couple of years ago Google announced that they can pretty well identify bad links on their own now and just ignore them. Where do you think the data came from to train their AI and machine learning algorithms to do that?)
Matt Cutts made a great analogy for how to use the tool. I’m paraphrasing, but he said you should use it like a scalpel, not a machete. There are only two cases where you should use the Disavow Tool.

The first case is when you have received a manual penalty from Google related to link building. If this happens, you should try to actually have the offending links removed by those websites and fall back on the Disavow Tool for the ones you cannot get removed.

The second case is when you see a massive drop in rankings AND you have seen some low quality links starting to pile up, or maybe there was a recent influx of low quality links. If you have a page or pages hit by the Penguin filter because of bad links, you won’t see slight ranking drops. If you see drops of just a few spots, it’s not your links. You won’t drop from #1 to #3. You will see something more like drops of 50 spots or more. Sometimes you will drop out of the top 100 completely. In these cases, again, the best solution is to try to get links removed, but in cases involving hundreds or thousands of spammy links coming in, that will probably not work. You can use the Disavow Tool.

How do you know which links to disavow? Well, Semrush has a great Toxicity filter you can look at, but do not just disavow all links it identifies as ‘toxic’. Use this filter as an indicator for links you should take a look at yourself. Only disavow links you have manually inspected yourself. Do not use 3rd party metrics like DA to identify low quality links. DA has nothing to do with the quality of a link (nor does any other 3rd party metric). If anything, those metrics are trying to give you a gauge of the potential strength of a link. Strength and quality are not the same thing.

How do you recognize low quality or spammy links? Well, it’s a lot like United States Supreme Court Justice Potter Stewart famously said about pornography: “I know it when I see it.” Is the content a jumbled mess?
Is the link and content at all relevant to what you do? Does the page even load? In short, if a prospect who had never heard of you came across the page and saw your link, would it hurt your brand image? Would you be embarrassed to be mentioned on that page? I consider links like blog comments, forum posts, social bookmarks, document sharing sites, and all insignificant wikis to be spam worth disavowing too. Lastly, if you do decide to disavow links, remember that your disavow file is kind of a living document. When you upload a file, it replaces the old one. The Disavow Tool does not store the old data. If you decide to disavow additional links, you should keep adding on to the same document and upload that file.
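Because each upload replaces the previous file entirely, it is worth keeping one master list and only ever appending to it. A minimal sketch of that workflow (the domains are made up; the `domain:` and bare-URL line formats are the ones the Disavow Tool accepts):

```python
def merge_disavow(existing_lines, new_entries):
    """Append new disavow entries, skipping duplicates and preserving order."""
    seen = {line.strip() for line in existing_lines if not line.startswith("#")}
    merged = list(existing_lines)
    for entry in new_entries:
        entry = entry.strip()
        if entry not in seen:
            merged.append(entry)
            seen.add(entry)
    return merged

# Hypothetical master file: comments, a whole-domain entry, a single URL.
current = [
    "# Disavowed after manual review, March",
    "domain:spammy-directory.example",
    "http://jumbled-content.example/page.html",
]
updated = merge_disavow(current, ["domain:new-spam.example",
                                  "domain:spammy-directory.example"])  # duplicate ignored
```

Upload the merged file each time and the tool always holds your complete history.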
  16. There are two very simple and very logical reasons why Google and other search engines do not factor bounce rate into their ranking algorithm. The first reason is just a matter of access. Google does not have access to bounce rate data on many of the webpages in existence. For some reason, many people seem to think of Google as this omnipotent power that sees and knows all. That is just not the case. There are over 200 ranking signals in Google’s algorithm, all of which have different weightings. When a search query is made, Google is pulling data from its index and comparing all of the websites it has indexed based on those 200 signals. If Google were using bounce rate data, how would the algorithm compare a webpage where it has bounce rate data versus a webpage in which it has no bounce rate data? Which one is performing “better” for that ranking signal? Still not convinced? Fine. Let’s look at the second reason that Google is not using bounce rate in its ranking algorithm. Bounces are not always bad. They are not always a signal that there is something wrong with the page. For some reason, many marketers have this stigma stuck in their head that all bounces are bad. They are not. Let’s say you are running an emergency plumbing service and repair business. Someone in your community has a toilet that has suddenly started to overflow and they cannot fix it. They search for a local plumber in Google, see your page ranking first. They click the search result which brings them to the home page of your site. They like what they see and pick up the phone to call you (or your office) and see how fast you can help them out with their problem. They never visited another page on your site. They will register as a bounce, but they did exactly what you wanted them to do and they found exactly what they needed, right? Your webpage converted them immediately into a phone call and a possible job. That’s a good thing. 
And why should Google see it differently or ding your site for that? The same thing could be said if I am running an affiliate site. Usually an affiliate site is set up to drive traffic to a landing page and get visitors to click on an affiliate link. If they do not browse around on your site but click on the link, they are going to register as a bounce. Again, there is nothing wrong with that. They did exactly what you were hoping they would do.

We obviously do not have access to their analytics to prove it, but look at a site like Wikipedia. I would venture a guess that their bounce rate is quite high. People generally end up at Wikipedia because they were looking for an answer to a specific query and one of Wikipedia’s pages came up. They visit the page and find the answer they were looking for. Some might click on an internal link on the page if they see something that interests them. The vast majority most likely do not and simply leave. Yet, Wikipedia ranks for everything.

Does That Mean Bounce Rate Data is Useless? No. Not at all. Bounce rate data is useful for you, not for search engines. A high bounce rate could be indicative of a problem on a webpage. It really depends on what type of website you are running and what it is you are trying to get visitors to do. If you are running an ecommerce site where a particular page is bringing in a lot of traffic, but then the visitors are leaving without browsing other products, adding anything to their cart, etc., then there is likely something wrong with that page or the traffic coming to that page. Even then, believe it or not, it may not be a bad thing. You always want to take a closer look.

I’ve relayed this story before, but I will share it again here. Before you go reworking a whole page or website, it is important to understand where the bounces are coming from. Who is bouncing, how did they find your site, and what pages are they bouncing from?
I was looking at a client’s website one time and noticed that the bounce rate across the site was 43%. Most of the pages fit around that number, but there was one page where the bounce rate was 89%. That was unusual. Average time on the site was over 6 minutes, but on this particular page it was under 30 seconds. I took a closer look at the analytics and found that search traffic was bouncing from that page at a much, much higher rate than traffic from other sources. Generally, if there is something wrong with the page, the bounce rate will be consistent among all sources of traffic. This was not the case.

Through some digging, we found that the page was not only ranking highly for our target keyword, but it was also ranking highly for another keyword that looked similar yet was completely unrelated to the page. In other words, the words in the phrases were close, but the meanings were much different. I cannot reveal the client’s site, but the difference in keyword phrases would be something like doggy style versus styles of dogs. The words are close, but have two completely different meanings. The targeted phrase was searched about 500 times per month on average. The untargeted phrase was searched about 12,000 times per month. That’s why the percentage of bounces was so high.

In this situation, it was nothing to worry about. The bounces were coming from untargeted traffic. This is a perfect example of why you really need to take a close look at what the bounces are actually telling you.
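The key move in that story was segmenting bounces by traffic source instead of staring at one blended number. Here is a toy sketch of the idea with made-up session data; in practice you would pull this from your analytics export:

```python
from collections import defaultdict

# Hypothetical sessions for one landing page: (traffic_source, bounced)
sessions = [
    ("organic", True), ("organic", True), ("organic", True), ("organic", False),
    ("direct", False), ("direct", True),
    ("referral", False), ("referral", False),
]

def bounce_rate_by_source(sessions):
    """Return bounce rate per traffic source."""
    counts = defaultdict(lambda: [0, 0])  # source -> [bounces, total]
    for source, bounced in sessions:
        counts[source][1] += 1
        if bounced:
            counts[source][0] += 1
    return {source: bounces / total for source, (bounces, total) in counts.items()}

print(bounce_rate_by_source(sessions))
```

When one source (here, organic) bounces far above the others, that points at mismatched search intent rather than a broken page, exactly like the client story above.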
  17. There are some big misconceptions out there about how nofollow works, so let’s clear them up.

The first image is a simplified version of how a page passes on authority to other pages through its links. Each page has a set amount of linkjuice that it can pass on through its links. When nofollow was first introduced, it blocked that linkjuice from passing through links carrying the nofollow tag, and it would instead be redistributed among the remaining links on the page, making them stronger, as shown here:

Many of us used this for what was known as PageRank sculpting. We could control the flow of linkjuice throughout our sites to boost the pages we really wanted to rank. Of course, Google didn’t like that, so in 2009 they changed how they handled nofollow. Here is how it works now:

Linkjuice still flows through links tagged as nofollow. It no longer gets redistributed among the remaining links on the page, but it also does not get credited to the target page they are linking to. This is why it is a bad idea to nofollow internal links. You are actually bleeding out linkjuice by doing so. For some reason, people still think Google treats nofollow as illustrated in the first image, but that has not been the case since early 2009.

Then there was the update in March of 2020, where Google again changed how they treat the nofollow tag. Up until then, it was treated as a directive. With the latest update, they instead treat it as a hint or request. They make up their own mind whether to treat a link as nofollow or to ignore the tag. They will never tell you whether they are obeying the nofollow tag on any given link, so we have no idea if a link is really nofollow or not. They also added additional identifiers they want webmasters to use to identify sponsored links, affiliate links, etc.

Can you still sculpt PageRank?
Some of the min/maxers out there who really want to squeeze out every little bit of value they can have found ways to sculpt PageRank even after Google changed how they handle nofollow. For a long time, Google had trouble parsing JavaScript. A common technique was to put lower-value links inside JavaScript code so that Google could not see them. People would do this for links to things like contact us, privacy, and terms of service pages. Google has gotten better at reading JavaScript, so this method really does not work anymore.

The other way it was commonly done was to use iframes. Googlebot always skipped over iframes, so you could use them to hide links with less importance and sculpt PageRank that way. For years, the footer and parts of the header of Bruce Clay’s site used iframes to do this. Google does seem to read the content inside iframes these days, although I have seen some tests where they did so inconsistently. This method could still work, but it’s just not 100% reliable.
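The difference between the two nofollow treatments described above comes down to simple arithmetic. A toy model (one page, 1.0 unit of linkjuice, 4 links, 1 of them nofollow):

```python
# Toy model of the two nofollow treatments, not Google's actual math.
juice, links, nofollow = 1.0, 4, 1

# Pre-2009: nofollow links were excluded, so the remaining links split
# the full amount -- this is why PageRank sculpting worked.
old_per_followed_link = juice / (links - nofollow)   # 0.333...

# Post-2009: every link gets an equal share, but the nofollow share
# simply evaporates instead of reaching its target page.
new_per_followed_link = juice / links                # 0.25
evaporated = nofollow * (juice / links)              # 0.25 lost entirely
```

Under the post-2009 model, nofollowing an internal link does not make your other links stronger; it just throws that share away, which is the "bleeding linkjuice" point above.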
  18. Mike’s Tuesday Tips: Should you noindex category pages?

I see this question come up a lot in regard to WordPress, but the situation would be similar no matter what CMS you might be using. It depends on how you are using your categories. Most sites I see are using categories as part of their navigation or a sub-navigation. In those cases, you absolutely should not noindex the category pages.

Different people from Google have said slightly different things on this, but there are two messages we have heard pretty consistently over the past few years. First, if you noindex category pages, Google will likely treat them as soft 404s eventually. Not a huge deal, but it can trigger errors in Search Console. Just be aware of that. Second, over time, if you noindex category pages, Google will treat the links on those pages as nofollow.

This is why I say it depends on how you are using your category pages. If you have links pointing to your category pages (like you would if you use them in any type of navigation menu), you are pushing link equity into those pages, but nothing is coming back out of them. You are bleeding link equity. This can harm the internal link structure of your site.

Simple rule of thumb: Do you have category pages that visitors might land on by following a link somewhere on your site? If the answer is “yes”, then do not noindex them. If there is no way for visitors to find your category pages other than through your sitemap or by typing the URL directly into their browser, then it does not really matter whether you noindex them.

Objections: I often hear people say that they do not want to index their category pages for 1 of 2 reasons:

Reason 1 - The page is low quality and full of nothing but duplicate content.
Solution: Then make your category pages into something useful. Build them out more. Include some static content on the pages, not just post excerpts.
Reason 2 - A category page is outranking the primary page they want to rank for a keyword. Solution: Do a better job optimizing your target page. It should not be difficult to outrank your own category page. Or push them both up higher and get them both ranking highly.
  19. This is an easy one, but one I get asked about a lot. How many links should you build per day? There is only one correct answer: as many good links as you can possibly get each day. Period. There are no exceptions. No buts. I don't care if the site is brand new or 20 years old. Google does not care about link velocity.

Notice that I said good links. If you are using spam like blog comments, profile links, social bookmarks, etc., then yes, link velocity matters, because the faster you build them, the faster you are likely to tip over the Penguin threshold. On the other hand, if you go slow, you will likely never rank anyhow.

On top of the fact that Google doesn't care how many links you build per day, they also are not going to find all of your links at once. Some they might find the same day they go live. Others might take 3-4 weeks to discover. You can't control when links will be discovered, so trying to stick to some arbitrary number per day is silly.
  20. Sticking with the more grey/black hat theme of last week, let’s talk about some footprints that can give away your private network. Many of you have your own private network or have thought about creating one. No single one of these footprints will necessarily bring the Google Hammer crashing down on your head, but when you start combining them, they can make identifying network sites really easy for Google.

-WordPress. There is no doubt it is a popular platform. It is the most common platform people are using to build their networks on. You do not have to avoid it completely, but if you combine it with these other possible footprints, you might be drawing unwanted attention.

-Text logo/header. Most people just use the default text-style header in WordPress. They do not take the time to design a graphic header instead.

-All posts are on the homepage. Owners of private networks get greedy. They want to squeeze as much link equity out of each site as they can. To that end, they put every one of their posts on the homepage of their sites. As a result, all of their external links are also on the homepage. This is also common when someone is selling links on a network.

-Sample Page and Hello World post still exist. This is specific to WordPress, but I cannot count how many times I have stumbled on a network site where the default Sample Page and Hello World post are still published. A real site that someone cares about is not going to have those (usually). That is just carelessness.

-About page. Common network sites often do not have an “About” page.

-Privacy page. See above.

-Every post has at least one external link. There are some legitimate websites that follow this pattern. Many news sites have links within just about every news story. However, when you combine this with the majority of the footprints in this section, it is just one more ding against you.

-No social activity.
Most network sites have no Facebook, Twitter, Instagram, etc. accounts attached to them. Think about the real sites you come across and how rare it is for one to lack these things.

-Wide range of topics. This is another one where some legitimate websites cover all sorts of topics, but many of the public network sites out there do it in an extremely jumbled way with no real organization or reasoning behind it.

-No internal linking. Outside of a navigation menu and things like “recent posts” widgets, network sites commonly have no internal linking between posts. That’s a huge mistake.

-Having the same external link profile. I have uncovered private network sites that are all linking to the exact same sites. How odd would it be to find 25+ sites that all link to the exact same 3-4 websites? Vary up your external link profile. Camouflage it.

-Blocking robots. This one is a little more controversial. I know a lot of people who build private networks like to block spiders from places like Semrush and Ahrefs. To me, this could be a footprint Google could use to identify network sites. There might be a very valid reason for blocking them in some cases, but you show me 20 sites that all link to the same money site and all block bots from common backlink indexes, and I will bet you $1 you just found 20 sites of someone’s network.

-Contact information inside cPanel. Here is one most people are not aware of. When you sign up for a hosting account, by default the email address used for signing up gets plugged into the contact information panel inside of cPanel. Change that email to something random. This contact information gets published publicly. If you do a search for an email address, inside quotes, that you have used in multiple hosting accounts, all of your domains can be pulled up that way. I once found 300+ network sites owned by the same person this way.
Again, for the most part, these footprints on their own are not a big deal (other than the last one), but if you take 7 WordPress sites with text logos, no About or Privacy page, no internal links, all linking to the same sites, and all blocking Semrush, Ahrefs, Moz, and Majestic… Well, you likely found yourself someone’s (not so) private network.
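To make the bot-blocking footprint concrete, here is roughly what it looks like in a site’s robots.txt. The user-agent names below are the ones these backlink crawlers publicly document; seeing this same block across 20 sites that all link to one money site is exactly the kind of pattern that stands out.

```
# robots.txt -- blocking common backlink-index crawlers
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: rogerbot
Disallow: /
```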
  21. Sticking with PPC this week, specifically Google and Bing Ads. This is a technique I have been using for a long time. I came up with a catchy acronym for it when I teach it to people. A.I.M.: Analyze, Identify, Move. Then several years ago I was reading Perry Marshall’s book on Google Ads. He uses the same method but calls it Peel & Stick. Admittedly, his name is much catchier. Call it whatever you want; the concept is the same. The way you implement it is simple.

(STEP 1) First, Analyze the keywords of an ad group. You are looking for any keyword that sticks out, primarily by CTR. Note that most people I have encountered who use this method only do it for the top performers. However, you should also use it for keywords that are not performing well.

(STEP 2) Identify keywords that are outperforming or underperforming the rest of the group.

(STEP 3) Move these keywords into their own single keyword ad groups (SKAGs). For overperforming keywords, the hope is that an ad written specifically for that keyword will boost its performance even higher. For underperforming keywords, the hope is that putting the keyword by itself with ads specifically suited to it will lift its CTR to something more respectable.

Depending on the situation, I will go a step further and create a unique landing page for each of these as well, the thinking being that I can improve the Quality Score, which means potentially higher positions in the ad listings, which can then lead to higher CTRs.

One caution: you do not want your ad groups cannibalizing one another. This is even more common with Google’s expanding definition of exact match. Make sure when you pull a keyword out of an ad group, you add its exact match as a negative keyword to that ad group.
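The Analyze and Identify steps can be sketched in a few lines of code. This is purely my own illustration, not a Google Ads tool: the keyword names and CTR figures below are made-up example data, and the simple "deviation from the ad group's mean CTR" rule is one reasonable way to flag candidates, not the only one.

```python
# Sketch of A.I.M. steps 1-2: flag keywords whose CTR deviates far
# from the ad group average, in either direction, as SKAG candidates.
# All keyword data here is invented for illustration.

def identify_skag_candidates(keywords, threshold=0.5):
    """Return (keyword, 'over'/'under') pairs whose CTR deviates from
    the group mean by more than `threshold` * mean."""
    mean_ctr = sum(k["ctr"] for k in keywords) / len(keywords)
    candidates = []
    for k in keywords:
        if abs(k["ctr"] - mean_ctr) > threshold * mean_ctr:
            label = "over" if k["ctr"] > mean_ctr else "under"
            candidates.append((k["keyword"], label))
    return candidates

ad_group = [
    {"keyword": "dessert recipes",      "ctr": 0.041},
    {"keyword": "easy dessert recipes", "ctr": 0.118},  # sticks out high
    {"keyword": "best dessert recipes", "ctr": 0.038},
    {"keyword": "quick desserts",       "ctr": 0.009},  # sticks out low
]

for kw, label in identify_skag_candidates(ad_group):
    print(f"Move '{kw}' ({label}performing) into its own SKAG")
```

In practice you would pull these stats from a keyword performance report rather than hard-code them, and you would also weigh impressions so a keyword with 3 clicks on 5 impressions does not get flagged on noise alone.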
  22. Mike’s Tuesday Tips

Forget everything you think you know about tiered linking. Tiered linking might be a little more of a grey/black hat area for most of you, but not everything has to be white hat on your own projects (it definitely should be for customers).

For 15+ years, most of the gurus out there have been teaching tiered linking completely wrong. Everyone from Matthew Woodward to the guys behind SENuke has been advocating the wrong structures for tiered linking. Even recently, SEO Powersuite published an article on their site doing it wrong. (They changed their article after I called them out on it. Lol.) Basically everyone on Fiverr or any other marketplace you go to is doing tiered linking in a way that makes it highly inefficient and likely not to work at all.

The common way tiered linking is shown is to create somewhere around 4-6 sites on tier 1, which then link to your target page. Tier 1 typically consists of Web 2.0 sites or other decent-quality pages that you have a good bit of control over. Tier 2 will then usually consist of either a bunch of Web 2.0 sites or just plain spam (bookmarks, blog comments, forum comments, wikis, etc.). Then tier 3 is more spam.

There are two problems with this structure. First, when you put a bunch of sites on tier 2, you greatly dilute everything that comes behind it. Let’s say you put 20 Web 2.0s on tier 2 and then create 1,000 links on tier 3. Great, but really you are only pointing 50 links at each tier 2 property, and considering that you are using low quality links, 50 is not going to move the needle at all. (To be honest, with these sorts of links, even 50,000 links are not going to move the needle much, if at all. You need to think bigger.)

But the really significant problem is that your low quality links should never be part of the actual tiered structure. Things like blog comments and wikis get deleted frequently. Social bookmarks often do not get indexed by search engines. Any break in the structure means that any links flowing to that point are now lost.

The way tiered linking should be done is to keep your actual tiers small and make them sites you have control over: either domains that you own or Web 2.0s (you technically don’t control a Web 2.0, but it is unlikely to be taken away from you unless you do something stupid with it). Tier 1 should be one site, which links to your money page. Tier 2: 2 sites. Tier 3: 4 sites. Tier 4 (if you want to go that far): 8 sites. That’s it. No bookmarks. No blog comments. None of that junk goes into the tiers.

Then you take all the low quality links you want to use and point those at the sites in the tiers. This means that if a blog comment link gets deleted, it is not destroying everything in the structure behind it. It doesn’t matter. You are unlikely to ever lose anything that appears in the actual tiered structure.
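The dilution math above is worth seeing in one place. This little sketch (my own illustration, using the numbers from the tip) contrasts the common wide tier 2 with the small doubling structure recommended here.

```python
# Numbers taken from the tip's example: 20 Web 2.0s on tier 2 with
# 1,000 spam links behind them vs. the small 1-2-4-8 structure.

def links_per_property(total_links, properties):
    """Low quality links divided evenly across the tier's properties."""
    return total_links // properties

# Common (wrong) layout: each tier 2 site only receives 50 links.
print(links_per_property(1000, 20))  # 50 -- heavily diluted

def tier_sizes(depth):
    """Recommended structure: each tier doubles, starting from 1 site."""
    return [2 ** level for level in range(depth)]

print(tier_sizes(4))  # [1, 2, 4, 8]
```

With the small structure, the 1,000 low quality links sit outside the tiers and point at these 15 controlled sites, so a deleted blog comment never severs a branch of the structure itself.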
  23. Think about your outreach targets.

This one may seem very obvious to many of you, but I still see people making this mistake pretty regularly. Outreach for links is one of the most popular forms of link building out there. It is also one of the most difficult, time consuming, and frustrating.

I did a consulting call with another SEO in early May, and the main thing they wanted to talk about was link building, specifically their outreach approach, because they were getting horrible results. We are talking close to zero percent response rates. Once we started talking about what they were doing, I was not shocked at the results, and they are far from the only people I have seen do this over the years.

I asked them how they were finding targets for their outreach, and they were making a pretty big mistake. They were using different combinations of search terms and search operators, but all of them included their primary keywords. In other words, if they were trying to rank a page on a recipe site for “dessert recipes”, they were reaching out to other sites that already ranked for “dessert recipes.”

Just think about that for a moment. I am trying to rank my own page about dessert recipes, and someone reaches out asking if I will link to their dessert recipe page. Even if I were clueless about SEO, that sounds like a bad idea. Why would I want to lead my traffic to a similar page on another site? And if the website owner does understand a little SEO, they will know that linking could potentially help you outrank them. Why would they want to do that?

Your outreach should not be to direct competitors. Many times it should not even be to sites directly in the same niche if you want a good success rate. Let’s use an example to illustrate. Say you are doing outreach for a local mortgage lending company in Philadelphia. The last thing you want to do to find outreach targets is search for things like “mortgage lenders in Philadelphia” or “FHA loans Philadelphia”. Instead, look at businesses that are related to, and in many cases rely on, mortgage lenders to operate. Title companies and real estate agents are good examples; construction companies that focus on home building are another. Content on their sites that educates visitors about topics such as FHA loans, improving credit scores to qualify for a mortgage, what to prepare in order to get a pre-approval, or why to get a pre-approval before home shopping can help position them as knowledgeable within the field and someone a prospect would want to work with. At the same time, none of those businesses are heavily focused on trying to rank for mortgage-related keywords.

Going back to the recipe example, and specifically “dessert recipes”, there are plenty of branches within this market you can look at. There are a bunch of websites (and YouTube channels) devoted entirely to recipes for pressure cookers. These site owners probably care about ranking for search terms like “dessert recipes for pressure cookers” but not as much about just “dessert recipes”. You can also find sites that do reviews and tutorials of cooking gadgets like pressure cookers, air fryers, slow cookers, etc. They could be good targets to reach out to. There are also all kinds of bloggers covering topics like eating healthy, being a stay-at-home mom (or dad), etc. All great outreach targets.

You can get even more creative than this. Remember that the entire website you are reaching out to does not have to be relevant to your site for the link to be useful. There are lots of sites popping up devoted to providing information about becoming an online streamer. Most of their content revolves around what equipment to use, how to set it up, setting a schedule, finding an audience, engaging that audience, etc. Many streamers stream for 8-12 hours at a time. You could reach out to some of these sites and pitch a piece of content about great bite-size, healthy snacks to make and eat while streaming.

Be creative. Think outside of the box. You will have a lot more success in your outreach.
  24. These are a few common mistakes I see people make when doing keyword research for SEO.

Mistake 1: Only targeting keywords with X number of searches per month. I commonly see people say to look for at least 1,000 searches per month. Whatever the number, this thinking ignores two very important factors: buyer intent and what you are selling. I don’t think I need to explain buyer intent to anyone here. What I mean by what you are selling is simple: what if the lifetime value of one customer/conversion is $10,000? Do you really care about search volume then? I am going after any keyword where the buyer intent is high. I don’t care if it gets 10 searches per month; I just need one conversion each month and that will generate a six figure revenue stream. On the other hand, if you are building a made-for-AdSense type site, then yes, search volume is going to matter a whole lot more. In fact, I would probably ignore anything under 10,000 searches per month as a primary keyword.

Mistake 2: Looking at the number of results in the search index. I covered this one in another Tuesday Tip, but it is worth repeating. The number of results in the search index has absolutely nothing to do with the level of competition for a keyword. It does not matter if there are 100,000 results or 100,000,000. All that matters is the strength of the top 3 pages (or maybe the top 5 for really high volume keywords). If you can beat #3, then #4 through #100,000,000 do not matter. I don’t care what search operators you use either. Inurl:, intitle:, etc. tell you nothing about the level of competition. The KGR is BS.

Mistake 3: Using the competition level from Google Keyword Planner. Over the years, this might be the mistake I see repeated most often. The competition column in the Google Keyword Planner has nothing to do with the level of competition in organic search. The Keyword Planner is a tool for Google Ads, not SEO; it is telling you the level of competition among Google advertisers. If you ever see a third-party tool with a “Competition” column ranking keywords as Low, Medium, or High, it is most likely pulling this data from Google, so the same thing applies. If anything, and I would still be careful about this, that data can be used to gauge buyer intent: if advertisers are willing to pay for ads, they are probably making money off them, which means people doing that search are looking to buy something.

Mistake 4: Not checking the plural or non-plural version of a keyword. Sometimes, when you change a search term to its plural version, the search intent changes in Google’s eyes and so do the results. Based on this you might want to create different content on another page to target the plural version, or you may not want to target it at all. For example, when I search ‘insurance agent’ I do get the local search box, but in the organic results I get things like job listings, job descriptions, how to become one, and some local results mixed in. When I search ‘insurance agents’, I see nothing but local results on page one. If you just glanced at the search terms, they may seem closely related, but based on what Google is showing, I would not create the same content to target both of those searches.
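The low-volume math in Mistake 1 is worth spelling out. A quick sanity check, using the $10,000 lifetime value and one-conversion-per-month figures from the example above:

```python
# Back-of-envelope math for Mistake 1: a tiny-volume keyword with high
# buyer intent can still carry a business. Figures are from the example.

def annual_revenue(conversions_per_month, lifetime_value):
    """Yearly revenue from a steady trickle of high-value conversions."""
    return conversions_per_month * lifetime_value * 12

# One conversion a month at $10,000 lifetime value is six figures a year.
print(annual_revenue(1, 10_000))  # 120000
```

That is why a 10-search-per-month keyword can matter more than a 10,000-search one: volume thresholds only make sense relative to what each conversion is worth.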
  25. Mike's Tuesday Tips: I have been amazed over the past few years how many nonprofits I have encountered that were not aware of the Google Ads Grant Program. https://www.google.com/grants/index.html

What could a nonprofit do with $10,000 per month in advertising on Google Ads? Could they get the word out about their cause to more people who might be in need of their services? Could they recruit more volunteers? Could they bring in more donations?

The Google Ads Grant Program gives qualified nonprofits up to $10,000 per month to promote their initiatives on Google Ads for free. This can be a great source of extra traffic to your website to share your cause with your community or the world. It is an opportunity that every nonprofit should be taking advantage of.

In order to qualify for the Google Ads Grant Program, nonprofits must meet the following criteria:
-Hold current and valid 501(c)(3) status.
-Acknowledge and agree to Google Grants' required certifications.
-Have a website that defines your nonprofit and its mission.

Note that hospitals, medical groups, academic institutions, childcare centers, and government organizations are not eligible. (Google does provide a similar program specifically for educational institutions.)

In 2018, Google made changes to the Google Ad Grant Program, and there are new guidelines that nonprofits must follow to remain eligible:
-Maintain a minimum 5% CTR account-wide. Accounts that dip below 5% for 2 consecutive months will be suspended.
-Maintain a minimum keyword quality score of 2.
-Have a minimum of 2 ad groups per campaign.
-Have a minimum of 2 ads per ad group.
-Utilize at least 2 sitelink ad extensions.

The primary one nonprofits struggle with is maintaining the 5% CTR. It is account-wide, so it is always advisable to run a brand campaign for the nonprofit, as those will usually have a pretty high CTR.
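The two-consecutive-months CTR rule is easy to monitor. A minimal sketch (my own illustration, not an official Google tool) of checking a run of monthly account-wide CTRs against it:

```python
# Sketch of the Ad Grants 5% CTR rule: suspension risk arises when the
# account-wide CTR is below the minimum for 2 consecutive months.
# The CTR figures in the examples are invented for illustration.

def grant_at_risk(monthly_ctrs, minimum=0.05):
    """True if CTR fell below `minimum` in two consecutive months."""
    return any(a < minimum and b < minimum
               for a, b in zip(monthly_ctrs, monthly_ctrs[1:]))

print(grant_at_risk([0.062, 0.048, 0.055]))  # False: one bad month only
print(grant_at_risk([0.062, 0.048, 0.044]))  # True: suspension risk
```

A single sub-5% month is a warning to act (for example, pause broad low-CTR keywords or lean on the brand campaign), while two in a row is the trigger.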