Everything posted by Mike Friedman

  1. Mike’s Tuesday Tips: Last week was about identifying CLS, and I mentioned how to identify the LCP (Largest Contentful Paint) element on your pages. This week, we are going to identify common issues that cause a slow LCP.

LCP is probably the simplest of the Core Web Vitals to deal with. Having addressed this on a few hundred pages over the past year, I can tell you that the most common cause of a slow LCP loading time is a slow server response. If you are on a shared hosting environment, especially one that has been oversold (and often overhyped) - I’m looking at you, Siteground - you can tweak things all you want, but there is only going to be so much speed you can squeeze out of the server.

Want to figure out if your server has been oversold? Take a look at how many domains are hosted on your same server using this tool: https://viewdns.info/reverseip/

There are tons of other tools out there like this. If you really want to investigate, you can run all the sites through something like Semrush and take a look at their traffic estimates. You may have one or two sites on the same server getting tons of traffic and hogging a huge amount of resources.

If you really care about page speed, one of the best things you can do is get away from shared web hosting. Before someone comments about how they got good scores with shared hosting on Core Web Vitals, Pagespeed Insights, GTMetrix, or any other speed test you want to mention… sure, I believe you. However, think of just how much better your pages would load if you moved to a decent VPS or dedicated hosting solution.

There is an example below for a new client I just started working on. This site is hosted on WPEngine, which is supposed to be one of the better hosts out there. We are getting server response times ranging from 1.5-2.2 seconds. Fixing that alone, without any other tweaks, will bring their LCP score in line with Google’s standards.

Before we go crucifying WPEngine, there is also a possibility that the problem is on the development side of the site design. There might be some processes being called that have to complete server side before anything loads. With a lot of dynamic content, that can happen sometimes. A database cleanup may also shave some of that response time.

If you do not want to switch hosts, another solution that may work is to use a content delivery network (CDN). Your mileage and experience may vary with these. For example, I have put some sites on Cloudflare and seen drastic improvements. I have used it for other sites and it has actually slowed them down.

The second big issue causing slow LCP load times is render-blocking JavaScript and CSS. A browser will pause parsing HTML when it encounters external stylesheets and synchronous JavaScript tags. To speed up the loading of your LCP, you want to defer any non-critical JavaScript and CSS files. You should also minify and compress your CSS and JavaScript files. For any styles that are critical to your LCP and/or above-the-fold content, you can inline them, which means you place the style elements directly in the <head> of the page. (There is a rough example at the end of this post.)

The next thing I see frequently slowing down LCP times is images and/or videos. Make sure that you optimize and compress your images. Browsers load full images before adjusting them to the proper size to be viewed. If you are using an image that is 2000 x 1330 pixels, but it is only viewed at 600 x 400 in your page design, the browser is going to load that full 2000 x 1330 sized image.
Before you bother with compressing anything, make sure you are using appropriate image sizes. Resize the image and then upload it back to your server. You can also lower the image quality. Load it up in Photoshop or something like GIMP and change the resolution by adjusting the pixels/inch. Many times you will find that a lower resolution still looks great on your web page, and it will be a much smaller file.

One little trick I sometimes use, if I notice that lowering the resolution of an image causes the quality to take a noticeable dip, is to make it part of the design. I will toss a dark overlay over it and/or make it slightly blurry. I’ll do this if the image is being used as the background for a section. It helps the text over it pop anyhow.

If you are using WordPress, there are a lot of plugins and options out there for compressing images. There are also options for serving next-gen image formats, mainly WebP images. WebP images are not supported in all browsers, so make sure you have JPG or some other format as a backup to display.

If you are using a video or slideshow, stop it. They are not a great user experience on mobile devices.

Lastly, use a tool like GTMetrix to investigate the loading order of elements on your page. I hate GTMetrix scores and the fact that they default to desktop loading. GTMetrix is pretty useless for everything other than its waterfall display. There are other tools that have waterfall displays, but I find GTMetrix the easiest to work with. Take a look at what is loading before your LCP element. Are those things necessary? Is there anything that can be deferred or just eliminated?

I’ve shaved significant time off of LCP scores just by getting rid of Google Fonts. Google Fonts are great, but they have to load from Google’s servers. Then, if you use different font weights, that’s an extra file to load.

Another common culprit that slows down pages is icon libraries like Font Awesome. A lot of page builders like Elementor will give you the option to use icons from Font Awesome, Themify, or Ionicons. The problem is that in order to use just one icon, the entire library is loaded. Use a single image instead. Some builders, like Oxygen and Bricks, will let you use your own SVG files as icons. I think Elementor just added that option recently too. The advantage of using your own is that the browser only has to load what you are using and not an entire library of icons.

I see this happen a lot with local business websites. They often like to use one of those phone icons beside their phone number in the header. Sometimes an email icon beside an email address too, or the Google Places pin beside an address. Because it loads in the header, this usually slows down the LCP time. Use your own icons instead and speed it up.
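To make the render-blocking and image advice above a little more concrete, here is a rough HTML sketch. The file names and dimensions are made up for illustration; the point is the pattern: inline the critical CSS, load the rest without blocking the parser, defer non-critical JavaScript, and serve an image at its display size as WebP with a JPG fallback.

  <head>
    <!-- Critical above-the-fold styles inlined so they do not block rendering -->
    <style>
      .hero { background: #111; color: #fff; }
    </style>

    <!-- Non-critical stylesheet loaded without blocking HTML parsing -->
    <link rel="preload" href="/css/main.min.css" as="style" onload="this.rel='stylesheet'">
    <noscript><link rel="stylesheet" href="/css/main.min.css"></noscript>

    <!-- Non-critical JavaScript deferred until parsing is finished -->
    <script src="/js/site.min.js" defer></script>
  </head>

  <!-- Image resized to the size it is actually displayed at, WebP with a JPG fallback -->
  <picture>
    <source srcset="/images/hero-600x400.webp" type="image/webp">
    <img src="/images/hero-600x400.jpg" width="600" height="400" alt="Hero background">
  </picture>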
  2. Saw last night that Norm passed away. Apparently he had been battling cancer for nearly 10 years, but kept it private, even from a lot of his family. I do remember how he used to lay into OJ on Weekend Update and was constantly getting crap from it from a few higher ups at NBC. It's likely what led to him eventually being fired.
  3. How to Identify Cumulative Layout Shifts

With Core Web Vitals upon us, people are scrambling to optimize their sites. Mostly a waste of time, but it is what it is. Out of the 3 Core Web Vitals, cumulative layout shift (CLS) is the one I have seen people having the most trouble identifying. Obviously, seeing your score is easy in Pagespeed Insights, Lighthouse, or web.dev, but how do you identify what is actually causing any shifts on your pages?

It’s pretty simple actually. To do so, you are going to want to use Chrome’s Developer Tools.

-Open the page you want to check in Chrome.
-Click the dropdown menu on Chrome.
-Go to More tools >> Developer tools

This should open you up into a screen that shows Lighthouse and a bunch of other options along a menu at the top.

-Click on the Performance tab at the top.
-Then click on what looks like a refresh icon.
-Let it load, and you will end up with something like the attached screenshot.

If you have cumulative layout shifts happening, they will appear as red bars under the Experience row. You can zoom in on them a little bit by hovering your mouse over that area and using the mouse scroll wheel. (At least on PCs. I have no idea how to do it on inferior Mac machines.) You can also click and drag the whole thing around if things start moving off the screen as you zoom in. If you hover over the red bars, the website pane on the left will highlight where that shift is happening.

By the way, while you are here, you can also identify your Largest Contentful Paint (LCP) element. In the Timings row, you will see a black box labeled LCP. Hover over it and it will highlight your LCP element.
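If you prefer the console, the same information can be pulled out with the browser’s PerformanceObserver API. This is just a quick sketch to paste into the DevTools console on the page you are checking, not a replacement for the Performance panel:

  // Log each layout shift (ignoring shifts caused by user input) with its score and the nodes that moved
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      if (!entry.hadRecentInput) {
        console.log('Layout shift:', entry.value, (entry.sources || []).map(s => s.node));
      }
    }
  }).observe({ type: 'layout-shift', buffered: true });

  // Log the current Largest Contentful Paint candidate element and when it rendered
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    const last = entries[entries.length - 1];
    console.log('LCP element:', last.element, 'at', Math.round(last.startTime), 'ms');
  }).observe({ type: 'largest-contentful-paint', buffered: true });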
  4. Mike’s Tuesday Tips: When should you disavow links?

Back in 2012, Google shook up the link building market with two massive actions. First, there was an enormous push to take down popular public networks. Anyone remember Build My Rank or Authority Link Network? Then in April, they unleashed the Penguin algorithm filter and sent many SEOs running around with their hair on fire.

The Penguin algorithm was harsh. Probably too harsh, to be honest. It weaponized link building. It was risky, but you could potentially get competitors penalized by throwing spammy links at them. I say it was risky because Google’s algorithm was far from perfect. You could just as easily strengthen their position as harm it.

While the Penguin algorithm did a great job in many cases of punishing sites using low quality links, there were also a lot of innocent sites caught in the mix, or sites that had hired an SEO without understanding that the SEO used spammy link building. As a result, Google released its Disavow Tool in October of that year.

Fun fact: Did you know that Bing actually released a Disavow Tool before Google? Yep. Bing’s came out in September of 2012.

Since its release, people have debated its use. Early on, many of us cautioned against using it. Google generally does not tell you which links they have flagged as bad, except in some cases of manual penalties where they may give you a few examples. Overuse can actually hurt your rankings.

(Some of us also suggested caution because we saw it as Google crowdsourcing to fix a problem they couldn’t figure out on their own. Basically they were saying, “Hey, why don’t you tell us which links are bad that you have been building? In exchange, here is a get out of jail free card.” I think our concerns were valid. A couple of years ago Google announced that they can pretty well identify bad links on their own now and just ignore them. Where do you think the data came from to train their AI and machine learning algorithms to do that?)

Matt Cutts made a great analogy for how to use the tool. I’m paraphrasing, but he said you should use it like a scalpel, not a machete.

There are only two cases where you should use the Disavow Tool.

The first case is when you have received a manual penalty from Google related to link building. If this happens, you should try to actually have the offending links removed by those websites and fall back on the Disavow Tool for the ones you cannot get removed.

The second case where you should use the Disavow Tool is when you see a massive drop in rankings AND you have seen some low quality links starting to pile up, or maybe there was a recent influx of low quality links. If you have a page or pages hit by the Penguin filter because of bad links, you won’t see slight ranking drops. If you see drops of just a few spots, it’s not your links. You won’t drop from #1 to #3. You will see something more like drops of 50 spots or more. Sometimes you will drop out of the top 100 completely. In these cases, again, the best solution is to try to get links removed, but in cases involving hundreds or thousands of spammy links coming in, that will probably not work. You can use the Disavow Tool.

How do you know which links to disavow? Well, Semrush has a great Toxicity filter you can look at, but do not just disavow all links it identifies as ‘toxic’. Use this filter as an indicator for links you should take a look at yourself. Only disavow links you have manually inspected yourself. Do not use 3rd party metrics like DA to identify low quality links.
DA has nothing to do with the quality of a link (nor does any other 3rd party metric). If anything, those metrics are trying to give you a gauge for the potential strength of a link. Strength and quality are not the same thing. How do you recognize low quality or spammy links? Well, it’s a lot like United States Supreme Court Justice Potter Stewart famously said about pornography, “I know it when I see it.” Is the content a jumbled mess? Is the link and content at all relevant to what you do? Does the page even load? In short, if a prospect who had never heard of you came across the page and saw your link, would it hurt your brand image? Would you be embarrassed to be mentioned on that page? I consider links like blog comments, forum posts, social bookmarks, document sharing sites, and all insignificant wikis to be spam worth disavowing too. Lastly, if you do decide to disavow links, remember that your disavow file is kind of a living document. When you upload a file, it replaces the old one. The Disavow Tool does not store the old data. If you decide to disavow additional links, you should keep adding on to the same document and upload that file.
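For reference, the disavow file itself is just a plain .txt file (UTF-8) that you upload in Search Console: one URL or domain per line, with optional comment lines starting with #. The domains below are made up purely for illustration:

  # Links we could not get removed through outreach
  # Individual pages:
  http://spammy-bookmarks.example.com/our-link-page.html
  # Entire domains:
  domain:spun-article-network.example.com
  domain:junk-wiki-links.example.com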
  5. Those girls aren't going to get through med school without our help.
  6. Dan hasn't posted here in like a year, but mention OnlyFans and out he pops. 😂
  7. Mike’s Tuesday Tips: Should you noindex category pages? I see this question come up a lot in regards to Wordpress, but the situation would be similar no matter what CMS you might be using. It depends on how you are using your categories. Most sites I see are using categories as part of their navigation or a sub-navigation. In those cases, you absolutely should not noindex the category pages. Different people from Google have said slightly different things in regards to this, but there are two messages we have heard pretty consistently over the past few years. First, if you noindex category pages, Google will likely treat them as soft 404s eventually. Not a huge deal, but it can trigger errors in Search Console. Just be aware of that. Second, over time, if you noindex category pages, Google will treat the links on those pages as nofollow. This is why I say it depends how you are using your category pages. If you have links pointing to your category pages (like you would if you use them in any type of navigation menu), you are pushing link equity into those pages, but nothing is coming back out of those pages. You are bleeding link equity. This can harm the internal link structure of your site. Simple rule of thumb: Do you have category pages that visitors might land on by following a link somewhere on your site? If the answer to that question is “yes”, then do not noindex them. If there is no way for visitors to find your category pages other than through your sitemap or by typing the URL directly into their browser, then it does not really matter if you noindex them or not. Objections: I often hear people say that they do not want to index their category pages for 1 of 2 reasons: Reason 1 - The page is low quality and full of nothing but duplicate content. Solution: Then make your category pages into something useful. Build them out more. Include some static content on the pages and not just nothing but post excerpts. Reason 2 - A category page is outranking the primary page they want to rank for a keyword. Solution: Do a better job optimizing your target page. It should not be difficult to outrank your own category page. Or push them both up higher and get them both ranking highly.
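(For anyone newer to this: the noindex being discussed here is just a robots meta tag, or the equivalent X-Robots-Tag HTTP header, output on the category pages, typically something like <meta name="robots" content="noindex, follow"> from an SEO plugin. As noted above, Google has said the "follow" part effectively stops being honored on pages that stay noindexed long term, which is exactly why the link equity bleeds out.)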
  8. There are two very simple and very logical reasons why Google and other search engines do not factor bounce rate into their ranking algorithm. The first reason is just a matter of access. Google does not have access to bounce rate data on many of the webpages in existence. For some reason, many people seem to think of Google as this omnipotent power that sees and knows all. That is just not the case. There are over 200 ranking signals in Google’s algorithm, all of which have different weightings. When a search query is made, Google is pulling data from its index and comparing all of the websites it has indexed based on those 200 signals. If Google were using bounce rate data, how would the algorithm compare a webpage where it has bounce rate data versus a webpage in which it has no bounce rate data? Which one is performing “better” for that ranking signal? Still not convinced? Fine. Let’s look at the second reason that Google is not using bounce rate in its ranking algorithm. Bounces are not always bad. They are not always a signal that there is something wrong with the page. For some reason, many marketers have this stigma stuck in their head that all bounces are bad. They are not. Let’s say you are running an emergency plumbing service and repair business. Someone in your community has a toilet that has suddenly started to overflow and they cannot fix it. They search for a local plumber in Google, see your page ranking first. They click the search result which brings them to the home page of your site. They like what they see and pick up the phone to call you (or your office) and see how fast you can help them out with their problem. They never visited another page on your site. They will register as a bounce, but they did exactly what you wanted them to do and they found exactly what they needed, right? Your webpage converted them immediately into a phone call and a possible job. That’s a good thing. And why should Google see it differently or ding your site for that? The same thing could be said if I am running an affiliate site. Usually an affiliate site is setup to drive traffic to a landing page and get them to click on an affiliate link. If they do not browse around on your site but click on the link, they are going to register as a bounce. Again, there is nothing wrong with that. They did exactly what you were hoping they would do. We obviously do not have access to their analytics to prove it, but look at a site like Wikipedia. I would venture a guess that their bounce rate is quite high. People generally end up at Wikipedia because they were looking for an answer to a specific query and one of Wikipedia’s pages came up. They visit the page and find the answer they were looking for. Some might click on an internal link on the page if they see something that interests them. The vast majority most likely do not and simply leave. Yet, Wikipedia ranks for everything. Does That Mean Bounce Rate Data is Useless? No. Not at all. Bounce rate data is useful for you. Not for search engines. A high bounce rate could be indicative of a problem on a webpage. It really depends what type of website you are running and what it is you are trying to get visitors to do. If you are running an ecommerce site where a particular page is bringing in a lot of traffic, but then the visitors are leaving without browsing other products, adding anything to their cart, etc., then there is likely something wrong with that page or the traffic coming to that page. 
Even then, believe it or not, it may not be a bad thing. You always want to take a closer look. I’ve relayed this story before, but I will share it again here. Before you go reworking a whole page or website, it is important to understand where the bounces are coming from. Who is bouncing, how did they find your site, and what pages are they bouncing from? I was looking at a client’s website one time and noticed that the bounce rate across the site was 43%. Most of the pages fit around that number, but there was one page where the bounce rate was 89%. That was unusual. Average time on the site was over 6 minutes, but on this particular page it was under 30 seconds. I took a closer look at the analytics, and found that search traffic was bouncing from that page at a much, much higher rate than traffic from other sources. Generally, if there is something wrong with the page, the bounce rate will be consistent among all sources of traffic. This was not the case. Through some digging, we found that the page was not only ranking highly for our target keyword, but it was also ranking highly for another keyword that was similar but highly unrelated to the page. In other words, the words in the phrase were close, but the definitions were much different. I cannot reveal the client’s site, but the difference in keyword phrases would be something like doggy style versus styles of dogs. The words are close, but have two completely different meanings. The targeted phrase was searched about 500 times per month on average. The untargeted phrase was searched about 12,000 times per month. That’s why the percentage of bounces was so high. In this situation, it was nothing to worry about. The bounces were coming from untargeted traffic. This is a perfect example of why you really need to take a close look at what the bounces are actually telling you.
  9. I was listening to a podcast not too long ago. I cannot remember which one, but there was a female guest who mentioned she used to sell pictures of her feet. There was one guy in particular paying her like $2000 a month for them. It helped get her through college. I couldn't help thinking about all the other things I could do with $24,000/yr versus looking at pictures of some stranger's feet. I guess when you have enough money to waste $24,000 on feet pictures, you really don't miss the money.
  10. I laughed. Not because of your comment, but because you were paying for porn.
  11. https://www.bloomberg.com/news/articles/2021-08-19/onlyfans-to-block-sexually-explicit-videos-starting-in-october This seems like business suicide to me. I guess they have things other than adult content there, but I have never heard of anyone visiting it or paying money to the platform for anything other than the adult content.
  12. I have a Brother laser printer that is probably 8 or 9 years old now. Still works perfectly. Only issue with it is sometimes when I scan things, it pulls them in slightly crooked. You would probably not even notice it unless I pointed it out to you, but I know about it so I see it all the time. There is probably an easy solution to fix it. Just need to clean some spools or something, I'm sure.

Cannot emphasize enough how much better it is to have a laser printer than an inkjet printer. We don't even print off that much stuff, but I was buying new inkjet cartridges about every 2 months at like $20-30 a piece. With the laser printer, I have replaced the toner cartridges twice. I think the black one I may have replaced 3 times now. And they are not that expensive. When I got it, I think it was about $450. The printer has more than paid for itself.

I honestly have not had any issues with Windows 10. It's been pretty stable for me as well. One time I had to reinstall it. That was it. I'm sure Windows 11 will be similar. Honestly, 90% of the reason I moved to Mac last month was I wanted something distinctly different for work. I did not want to even be tempted to use it for anything other than work. It was that and the fact that thanks to the M1 chips I could get a relatively comparable machine for about the same price as a PC.
  13. Actually, I rarely buy laptops. They are just way harder to work off of all day. I've always preferred a desktop with my dual monitors. However, based on a few new clients it would seem there is some travel in my future, if the South stops trying to send us back into lockdowns here. My luck with laptops has not been good in the past. I've had Dell's, Asus, and HP models. I've had everything from mid-range to higher end ($1500 or so) laptops. They all have left me disappointed.
  14. Sticking with the more grey/black hat theme of last week, let’s talk about some footprints that can be giving away your private network. Many of you have your own private network or have thought about creating one. No single one of these footprints will necessarily bring the Google Hammer crashing down on your head, but when you start combining them, they can make identifying network sites really easy for Google. -WordPress. There is no doubt it is a popular platform. It is the most common platform people are using to build their networks on. You do not have to avoid it completely, but if you combine it with these other possible footprints, you might be drawing unwanted attention. -Text logo/header. Most people just use the default text style header in WordPress. They are not taking the time to design a graphic header instead. -All posts are on the homepage. Owners of private networks get greedy. They want to squeeze as much link equity out of each site as they can. To that end, they put every one of their posts on the homepage of their sites. As a result, all of their external links are also on the homepage of the sites. This is also common when someone is selling links on a network. -Sample Page and Hello World post still exist. This is specific to WordPress, but I cannot count how many times I have stumbled on a network site where the default Sample Page and Hello World post are still published. A real site that someone cares about is not going to have those (usually). That is just carelessness. -About page. Common network sites often do not have an “About” page. -Privacy page. See above. -Every post has at least one external link. Now there are some legitimate websites that follow this pattern. Many news sites have links within just about every news story. However, when you combine this with the majority of the footprints in this section, it is just one more ding against you. -No social activity. Most network sites have no Facebook, Twitter, Instagram, etc. site attached to them. Think about real sites you come across and how few do not have these things. -Wide range of topics. This is another one where there are some legitimate websites that cover all sorts of topics, but many of the public network sites out there do it in an extremely jumbled way with no real organization or reasoning behind it. -No internal linking. Outside of a navigation menu and things like “recent posts” widgets, network sites commonly have no internal linking between posts. That’s a huge mistake. -Having the same external link profile. I have uncovered private network sites that are all linking to the exact same sites. How odd would it be to find 25+ sites that are all linking to the exact same 3-4 websites? Vary up your external link profile. Camouflage it. -Blocking robots. This one is a little more controversial. I know a lot of people who build private networks like to block spiders from places like Semrush and Ahrefs. To me, this could be a footprint Google could use to identify network sites. There might be a very valid reason for blocking them in some cases, but you show me 20 sites that all link to the same money site and all are blocking bots from common backlink indexes, and I will bet you $1 you just found 20 sites of someone’s network. -Contact information inside cPanel. Here is one most people are not aware of. When you sign up for a hosting account, by default the email address used for signing up gets plugged into the contact information panel inside of cPanel. 
Change that email to something random. This contact information gets published publicly. If you do a search for an email address inside of quotes that you have used in multiple hosting accounts, all of your domains can be pulled up that way. I once found 300+ network sites owned by the same person this way. Again, for the most part, these footprints on their own are not a big deal (other than the last one), but if you take 7 Wordpress sites, with text logos, no About or Privacy page, no internal links, all linking to the same sites, and they are blocking Semrush, Ahrefs, Moz, and Majestic… Well, you likely found yourself someone’s (not so) private network.
  15. They are nearly identical except that the Mac Mini will have a bit better cooling and has more plugs. It really comes down to whether or not you want to be portable. If you want to be portable, then the MacBook Pro or MacBook Air is the obvious choice. The MacBook Air is what I would go with at $200 cheaper unless you plan on doing some video editing or graphic editing. The MacBook Air has only passive cooling, no fan. That is sufficient for most use, but I wanted the fan for when I do some video editing.
  16. I'm aware of the limitations on repairs and the inability to upgrade them. However, I've never gotten more than about 3 years use out of any Windows laptop, so for about $1000, if I top that, it's a win. Not to mention that the last year of that 3rd year was usually dreadful with fans running at full RPMs and the laptops struggling with heat and performance. I upgrade computers fairly often, so the Windows 11 cutoff won't be an issue for me, but I can see how it would be for others. A lot could change in the next few years though.
  17. Mike’s Tuesday Tips: Forget everything you think you know about tiered linking.

Tiered linking might be a little more of a grey/black hat area for most of you, but not everything has to be white hat on your own projects (it definitely should be for customers).

For 15+ years, most of the gurus out there have been teaching tiered linking completely wrong. Everyone from Matthew Woodward to the guys behind SENuke has been advocating the wrong structures for tiered linking. Even recently, SEO Powersuite published an article on their site doing it wrong. (They changed their article after I called them out on it. Lol.) Basically everyone on Fiverr or any other marketplace you go to is doing tiered linking in a way that makes it highly inefficient and likely to not work at all.

The common way tiered linking is shown is to create somewhere around 4-6 sites on tier 1 which then link to your target page. Tier 1 typically consists of Web 2.0 sites or other decent quality pages that you have a good bit of control over. Tier 2 will then usually either consist of a bunch of Web 2.0 sites or just plain spam (bookmarks, blog comments, forum comments, wikis, etc.). Then tier 3 is more spam.

There are two problems with this structure. First of all, when you put a bunch of sites on tier 2, you are greatly diluting everything that comes behind it. Let’s say you put 20 Web 2.0s on tier 2. Then you create 1,000 links on tier 3. Great, but really you are only pointing 50 links at each tier 2 property, and considering that you are using low quality links, 50 is not going to move the needle at all. (To be honest, with these sorts of links, even 50,000 links are not going to move the needle much, if at all. You need to think bigger.)

But the really significant problem is that your low quality links should never be a part of the actual tiered structure. Things like blog comments and wikis get deleted frequently. Social bookmarks often do not get indexed by search engines. Any type of break in the structure means that any links flowing to that point are now lost.

The way tiered linking should be done is to keep your actual tiers small and make them sites you have control over. Either domains that you own or Web 2.0s (you technically don’t have control over a Web 2.0, but it is unlikely to be taken away from you unless you do something stupid with it). Tier 1 should be one site, which then links to your money page. Tier 2, 2 sites. Tier 3, 4 sites. Tier 4 (if you want to go that far), 8 sites. That’s it. No bookmarks. No blog comments. None of that junk goes into the tiers.

Then you take all your low quality links that you want to use and you point those at the sites in the tiers. What this means is that if a blog comment link gets deleted, it’s not destroying everything in the structure behind it. It doesn’t matter. You are unlikely to ever lose anything that appears in the actual tiered structure.
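A rough sketch of the structure described above (the letters are just placeholder sites you control):

  Money page
    ^
  Tier 1: site A
    ^
  Tier 2: sites B, C
    ^
  Tier 3: sites D, E, F, G
    ^
  Tier 4 (optional): 8 more sites

The low quality stuff (bookmarks, blog comments, wiki links, etc.) gets pointed at the tier sites from outside the chain, so if any of those throwaway links disappear, nothing inside the structure breaks.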
  18. I gave it another go. I'm in. My criticism of Apple has always been that you could get a more powerful PC for 60-70% of the cost. The new M1 chips changed that. I picked up a MacBook Pro. Liked it. Then I grabbed a Mac Mini a few days later. The only downside is I got versions with 256GB SSD drives, but I mostly use Google Drive and I have an external 1 TB drive that I can share between the machines. I cannot believe how powerful these machines are. The OS is an adjustment, but I do like how easily everything moves between multiple systems.
  19. Mike's Tuesday Tips: I have been amazed over the past few years how many nonprofits I have encountered that were not aware of the Google Ads Grant Program. https://www.google.com/grants/index.html

What could a nonprofit do with $10,000 per month in advertising on Google Ads? Could they get the word out about their cause to more people who might be in need of their services? Could they recruit more volunteers? Could they bring in more donations?

The Google Ads Grant Program provides nonprofits with the opportunity to advertise on Google Ads for free. The program gives qualified organizations up to $10,000 per month to promote their initiatives on Google. This can be a great source of extra traffic to your website to share your cause with your community or the world. It’s an opportunity that every nonprofit should be taking advantage of.

In order to qualify for the Google Ads Grant Program, nonprofits must meet the following criteria:
-Hold current and valid 501(c)(3) status.
-Acknowledge and agree to Google Grants' required certifications.
-Have a website that defines your nonprofit and its mission.
Note that hospitals, medical groups, academic institutions, childcare centers, and government organizations are not eligible. (Google does provide a similar program specifically for educational institutions.)

In 2018, Google made changes to the Google Ad Grant Program, and there are new guidelines that nonprofits must follow to remain eligible:
-Maintain a minimum 5% CTR account-wide. Accounts that dip below 5% for 2 consecutive months will be suspended.
-Maintain a minimum keyword quality score of 2.
-Have a minimum of 2 ad groups per campaign.
-Have a minimum of 2 ads per ad group.
-Utilize at least 2 sitelink ad extensions.

The primary one that nonprofits struggle with is maintaining the 5% CTR. It is account-wide, so it is always advisable to have a brand campaign running for the nonprofit, as those will usually have a pretty high CTR.
  20. Think about your outreach targets. This one may seem very obvious to many of you, but I still see people making this mistake pretty regularly. Outreach for links is one of the most popular forms of link building out there. It’s also one of the most difficult, time consuming, and frustrating. I did a consulting call with another SEO in early May with the main thing they wanted to talk about being link building. They really wanted to talk about their outreach approach because they were getting horrible results. We are talking about close to zero percent response rates. I was not shocked at the results when we started talking about what they were doing, and they are far from the only people I have seen do this over the years. I asked them how they were finding their targets for their outreach. They were making a pretty big mistake. They were using different combinations of search terms and search operators, but all of them included their primary keywords. In other words, if they were trying to rank a page on a recipe site for “dessert recipes”, they were reaching out to other sites that already ranked for “dessert recipes.” Just think about that for a moment. I’m trying to rank my own page about dessert recipes and someone is reaching out to me asking if I will link to their dessert recipe page. Even if I was clueless about SEO, that sounds like a bad idea. Why would I want to lead my traffic to a similar page on another site? If the website owner does understand a little bit about SEO, they are going to know that doing so could potentially help you to outrank them. Why would they want to do that? Your outreach should not be to direct competitors. Many times it shouldn’t even be to sites that are directly in the same niche if you want to have a good success rate. Let’s use an example to illustrate what I am talking about. Say you are doing outreach for a local mortgage lending company in Philadelphia. The last thing you want to do to find outreach targets is to search for things like “mortgage lenders in Philadelphia” or “FHA loans Philadelphia”. Instead, look at businesses that are related to and in many cases rely on mortgage lenders to operate. Title companies and real estate agents would be good examples. Construction companies that focus on home building would be another good one. Content on their site that educates their web visitors about topics such as FHA loans, improving credit scores to qualify for a mortgage, what to prepare in order to get a pre-approval, or why they should get a pre-approval before home shopping, can help to position them as knowledgeable within the field and someone a prospect would want to work with. At the same time, neither title companies, real estate agents, or construction companies are going to be heavily focused on trying to rank for mortgage-related keywords. Going back to the recipe example, and specifically the “dessert recipes” example, there are plenty of branches within this market you can look at. For example, there are a bunch of websites (and YouTube channels) devoted entirely to recipes for pressure cookers. These are site owners that probably care about ranking for search terms like “dessert recipes for pressure cookers” but not as much about just “dessert recipes”. You can also find sites that do reviews and tutorials of cooking gadgets like pressure cookers, air fryers, slow cookers, etc. They could be good targets to reach out to. 
There are also all kinds of bloggers covering topics like eating healthy, being a stay-at-home mom (or dad), etc. All great outreach targets. You can get even more creative than this. Think a little outside the box. Remember that the entire website you are reaching out to does not have to be relevant to your site in order for the link to be useful. There are lots of sites popping up that are devoted to providing information about becoming an online streamer. Most of the content on these sites revolves around what equipment to use, how to set up that equipment, setting up a schedule, engaging your audience, finding an audience, etc. Many streamers will stream for 8-12 hours at a time. You could reach out to some of these sites and pitch them the idea of publishing a piece of content about some great bite-size, healthy snacks you can make to eat while streaming. Be creative. Think outside of the box. You will have a lot more success in your outreach.
  21. Sticking with PPC this week, specifically Google and Bing Ads. This is a technique I have been using for a long time. I came up with a catchy acronym for it when I teach it to people. A.I.M.: Analyze, Identify, Move Then several years ago I was reading Perry Marshall’s book on Google Ads. He uses the same method, but calls it Peel & Stick. Admittedly, his name is much more catchy. Call it whatever you want. The concept is the same. The way you implement this is simple. (STEP 1) You first Analyze the keywords of an ad group. What you are looking for is any keyword that sticks out. We are primarily looking at CTRs here. Note, most people who I have encountered that do use this method only do this for the top performers. However, you should also use this for keywords that are not performing well. (STEP 2) You want to Identify keywords that are outperforming the rest of the group or underperforming the rest of the group. (STEP 3) The third step is to Move these keywords into their own single keyword ad groups (SKAG). In the case of over performing keywords, you are doing this hoping that if you write an ad targeted specifically for this keyword you will be able to boost its performance even higher. In the case of underperforming keywords, you are moving them into their own ad groups hoping that by putting the keyword by itself with ads specifically suited for that keyword, that you can boost its CTR to something more respectable. Depending on the situation, I will go a step further and create a unique landing page for each of these as well. The thinking being that I can improve the Quality Score, which means potentially higher positions in the ad listings, which can then lead to higher CTRs. You do not want your ad groups cannibalizing one another. This is even more common with Google’s expanding definition of exact match. Make sure when you pull a keyword out of an ad group, you add its exact match as a negative keyword to that ad group.
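A made-up example of what step 3 looks like in practice (the keywords and ad group names are hypothetical): say [chocolate cake recipe] is the standout keyword in its group.

  Before:
    Ad group "Dessert Recipes"
      Keywords: [dessert recipes], [easy desserts], [chocolate cake recipe]

  After:
    Ad group "Dessert Recipes"
      Keywords: [dessert recipes], [easy desserts]
      Negative keyword (exact match): [chocolate cake recipe]
    Ad group "Chocolate Cake Recipe" (SKAG)
      Keyword: [chocolate cake recipe]
      Ads, and optionally a landing page, written specifically around that keyword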
  22. Mike’s Tuesday Tips: These are just a couple of common mistakes I see when auditing Google Ads accounts. This is going to be strictly for search based text ads. I’m not going to get into shopping ads or display ads here. All of this would apply to BingAds campaigns as well. #1 Campaigns are Where You Control What You Spend One of the most common mistakes I see is in the way that an account is organized. Always remember that it is at the campaign level where you control the budget. If you are advertising multiple products or services, each one of them should have their own campaign. The reason for that is simple. Let’s say you are selling men’s clothing: shirts, pants, socks, and shoes. You would have each of those in their own campaign. In fact, you may even divide it up further. One campaign for dress shirts. One campaign for tshirts. One campaign for running shirts. And so on. Same thing for pants. A campaign for dress pants. A campaign for jeans. A campaign for sweatpants. A campaign for exercise pants. The mistake I see people frequently make is they make one campaign for shirts, and then divide things up at the ad group level. They create an ad group for dress shirts, another for tshirts, and another for running shirts. The problem with that organization is simple. What if your highest converting or most profitable shirts that you sell are dress shirts and you want to increase your ad spend on them? If you increase the money on your shirts campaign, it is going to be spent on all the ad groups. You have no ability to funnel the money towards a specific ad group. Depending on your inventory, you may even get more granular than what I suggested above. Instead of just dress shirts, you might want separate campaigns for short-sleeve dress shirts, long-sleeve dress shirts, dress shirts for boys, and campaigns for button-down collars and non-button-down collars. However you do it, always keep in mind that you control the budget at the campaign level and any products or services that you want to have complete control over the ad spend on need to have their own campaign. But Mike, I can just move products into their own campaign if I need to in the future. Sure, you can, but you lose the historical data (Quality Scores, conversion data, automated bidding data, ad performance, etc.). You are basically starting from scratch with a new campaign when you do that. #2 Keep Your Ad Groups Small The second most common mistake I see is ad groups with tons of keywords. Over the years, I have lost count of how many accounts I have seen where they just had one ad group in a campaign and all their keywords were thrown into it. Generally, your ad groups should not be more than 3-5 keywords each. I don’t go over 10 max. The reason for this is Quality Scores and Ad Rank. Your Quality Scores are impacted by several factors. The most common and important ones: Click-through rate of your ads. The relevance of each keyword to its ad group and to its ad text. The relevance and quality of your landing page. Historical Google Ads account performance. It is easier to control those relevance factors with tightly constructed ad groups and landing pages. The more broad your keywords get, the harder it is to make the ads and landing pages hyper-relevant. This is also good for conversions. It’s easier to convey a consistent message from search to ad to landing page this way. 
Today, with the expanding definition of exact match search queries in Google Ads, it becomes even less necessary to have lots of different keywords in the same ad group. Why should you be worried about Quality Scores? Well, that’s the next one. #3 Not Worrying About Quality Scores This is a huge mistake that costs advertisers a ton of money. I have often audited ad accounts and found lots of search terms with Quality Scores of 3 or less. The exact method for calculating Ad Rank is not shared publicly by Google, but Quality Score is a factor. In fact, they use a real-time Quality Score that does not get shared with you and takes into account things like the proximity of the searcher, the time of day, the nature of the search term, etc. What we get to see and work with is a general Quality Score. Why this matters is that it helps to determine how much you will pay per click. Higher QS’s get discounted positions in the advertising auctions. Lower QS’s have to pay more than other advertisers in order to show in the same position. Lower QS’s also mean that oftentimes your ad just won’t be shown at all.
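A simplified sketch of the account structure from #1 and #2 above (the product lines are just examples): each product gets its own campaign so you can control its budget independently, and each ad group stays small and tightly themed.

  Campaign: Dress Shirts (own daily budget)
    Ad group: Long-Sleeve Dress Shirts - 3-5 closely related keywords
    Ad group: Short-Sleeve Dress Shirts - 3-5 closely related keywords
  Campaign: T-Shirts (own daily budget)
    Ad group: Graphic T-Shirts
    Ad group: Plain T-Shirts
  Campaign: Jeans (own daily budget)
    Ad group: ...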
  23. These are a few common mistakes I see people make in doing keyword research for SEO. Mistake 1: Only targeting keywords with X number of searches per month. I commonly see people say to look for at least 1000 searches per month. Whatever the number is, this thinking ignores two very important factors: buyer intent and what are you selling. I don’t think I need to explain buyer intent to anyone here. What I mean by what you are selling is simple. What if the lifetime value of one customer/conversion is $10,000? Do you really care about search volume then? I’m going after any keyword where the buyer intent is high. I don’t care if it gets 10 searches per month. I just need one conversion each month and that will generate a 6 figure revenue stream. Now on the other hand, if you are building a made-for-AdSense type site, then yes, search volume is going to matter a whole lot more. In fact, I would probably ignore anything less than 10,000 searches per month as a primary keyword. Mistake 2: Looking at the number of results in the search index. I covered this one before in another Tuesday Tip, but it is worth mentioning again. The number of results in the search index has absolutely nothing to do with the level of competition for a keyword. It does not matter if there are 100,000 results or 100,000,000 results. All that matters is the strength of the top 3 pages (or in really high search volume keywords maybe top 5). If you can beat #3, then #4 through 100,000,000 do not matter. I don’t care what search operators you use either. Inurl:, Intitle:, etc. It tells you nothing about the level of competition. The KGR is BS. Mistake 3: Using competition level from Google Keyword Planner. Over the years, this might be the mistake I see most often repeated. The competition column in the Google Keyword Planner has nothing to do with the level of competition in organic search. The Keyword Planner is a tool for Google Ads, not SEO. It is telling you the level of competition among Google advertisers. If you ever see a third-party tool with a “Competition” column and it ranks them as Low, Medium, or High, they are most likely pulling this data from Google. Same thing applies. If anything, and I would still be careful about this, that data can be used to gauge buyer intent. The thinking being that if advertisers are willing to pay for ads, then that probably means they are making money off their ads. In other words, people doing that search are looking to buy something. Mistake 4: Not checking the plural or non-plural version of a keyword. Sometimes, when you change a search term to its plural version, the search intent changes in Google’s eyes and so do the results. Based on this you might want to create different content on another page to target the plural version or you may want to not target it at all. For example, when I search ‘insurance agent’ I do get the local search box, but in the organic searches I get things like job listings, job descriptions, how to become one, and some local search results mixed in. When I search for ‘insurance agents’, I see nothing but local results on page one. If you just glanced at the search terms, they may seem closely related, but based on what Google is showing I would not create the same content to target both of those searches.