Mike Friedman

Administrator
  • Posts

    3,647
  • Joined

  • Last visited

  • Days Won

    380

Mike Friedman last won the day on September 15

Mike Friedman had the most liked content!

1 Follower

About Mike Friedman

Contact Methods

  • Website URL
    http://clicked.marketing
  • Skype
    the-seo-pub

Profile Information

  • Interests
    Your success!

Recent Profile Visitors

7,524 profile views

Mike Friedman's Achievements

  1. Mike's Tuesday Tips: Last week was about identifying CLS, and I mentioned how to identify LCP (Largest Contentful Paint) on your pages. This week, we are going to identify common issues that cause a slow LCP.

LCP is probably the simplest of the Core Web Vitals to deal with. Having addressed this on a few hundred pages over the past year now, I can tell you that the most common cause of a slow LCP is a slow server response. If you are on a shared hosting environment, especially one that has been oversold (and often overhyped) - I'm looking at you, Siteground - you can tweak things all you want, but there is only so much speed you can squeeze out of the server.

Want to figure out if your server has been oversold? Take a look at how many domains are hosted on the same server using this tool: https://viewdns.info/reverseip/

There are tons of other tools like this out there. If you really want to investigate, you can run all the sites through something like Semrush and look at their traffic estimates. You may have one or two sites on the same server getting tons of traffic and hogging a huge amount of resources.

If you really care about pagespeed, one of the best things you can do is get away from shared web hosting. Before someone comments about how they got good scores on shared hosting in Core Web Vitals, Pagespeed Insights, GTMetrix, or any other speed test you want to mention... sure, I believe you. But think about how much better your pages would load if you moved to a decent VPS or dedicated hosting solution.

There is an example below from a new client I just started working with. This site is hosted on WPEngine, which is supposed to be one of the better hosts out there. We are getting server response times ranging from 1.5-2.2 seconds. Fixing that alone, without any other tweaks, will bring their LCP in line with Google's standards. (There is a small sketch after this post for checking your own server response time and LCP element right from the browser.)

Before we go crucifying WPEngine, there is also a possibility that the problem is on the development side of the site design. There might be processes that have to complete server side before anything loads. With a lot of dynamic content, that can happen. A database cleanup may also improve the response time.

If you do not want to switch hosts, another solution that may work is a content delivery network (CDN). Your mileage and experience may vary with these. I have put some sites on Cloudflare and seen drastic improvements; on other sites it has actually slowed them down.

The second big issue causing slow LCP times is render-blocking JavaScript and CSS. A browser will pause parsing HTML when it encounters external stylesheets and synchronous JavaScript tags. To speed up your LCP, defer any non-critical JavaScript and CSS files. You should also minify and compress your CSS and JavaScript files. For any styles that are critical to your LCP and/or above-the-fold content, you can inline them, which means you place the style elements directly in the <head> of the page.

The next thing I frequently see slowing down LCP times is images and/or videos. Make sure you optimize and compress your images. Browsers load the full image before scaling it to the size it is displayed at. If you are using an image that is 2000 x 1330 pixels but it is only displayed at 600 x 400 in your page design, the browser still has to load the full 2000 x 1330 image.

Before you bother with compressing anything, make sure you are using appropriate image sizes. Resize the image and then upload it back to your server. You can also lower the image quality. Load it up in Photoshop or something like GIMP and change the resolution by adjusting the pixels/inch. Many times a lower resolution still looks great on your web page, and it will be a much smaller file.

One little trick I sometimes use, if changing the resolution makes the quality take a noticeable dip, is to make that part of the design: I will toss a dark overlay over the image and/or make it slightly blurry. I do this when the image is being used as the background for a section. It helps the text over it pop anyhow.

If you are using WordPress, there are a lot of plugins and options out there for compressing images. There are also options for serving next-gen image formats, mainly WebP. WebP images are not supported in all browsers, so make sure you have JPG or some other format as a fallback.

If you are using a video or slideshow, stop it. They are not a great user experience on mobile devices.

Lastly, use a tool like GTMetrix to investigate the loading order of elements on your page. I hate GTMetrix scores and the fact that they default to desktop loading; GTMetrix is pretty useless for everything other than its waterfall display. There are other tools with waterfall displays, but I find GTMetrix the easiest to work with. Take a look at what is loading before your LCP element. Are those things necessary? Is there anything that can be deferred or just eliminated?

I've shaved significant time off LCP just by getting rid of Google Fonts. Google Fonts are great, but they have to load from Google's servers, and if you use different font weights, that is extra to load.

Another common one that slows down pages is icon libraries like Font Awesome. A lot of page builders like Elementor will give you the option to use icons from Font Awesome, Themify, or Ionicons. The problem is that in order to use just one icon, the entire library is loaded. Use a single image instead. Some builders, like Oxygen and Bricks, let you use your own SVG files as icons; I think Elementor just added that option recently too. The advantage of using your own is that the browser only has to load what you are using, not an entire library of icons.

I see this happen a lot with local business websites. They often like to use a phone icon beside their phone number in the header, sometimes an email icon beside an email address, or the Google Places pin beside an address. Because it loads in the header, this usually slows down the LCP time. Use your own icons instead and speed it up.
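A quick way to sanity-check the two numbers that matter most here, server response time and the LCP element, is the browser itself. Below is a minimal TypeScript sketch using the standard Navigation Timing and PerformanceObserver APIs; drop the type annotations to paste it into the DevTools console as plain JavaScript. The 600 ms warning cutoff is just a rough rule of thumb I'm assuming here, not an official threshold.

    // Rough LCP / server-response check, run in the browser.

    // 1. Server response time (TTFB) from the Navigation Timing API.
    const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
    if (nav) {
      const ttfb = nav.responseStart - nav.requestStart; // ms spent waiting on the server
      console.log(`Server response (TTFB): ${ttfb.toFixed(0)} ms`);
      if (ttfb > 600) {
        console.warn("Slow server response - look at hosting, caching, or backend work first.");
      }
    }

    // 2. Largest Contentful Paint element, reported as candidates are painted.
    const lcpObserver = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        const lcp = entry as PerformanceEntry & { element?: Element };
        console.log(`LCP candidate at ${entry.startTime.toFixed(0)} ms:`, lcp.element ?? "(element not exposed)");
      }
    });
    lcpObserver.observe({ type: "largest-contentful-paint", buffered: true });

If the TTFB line alone is eating most of the 2.5 second LCP budget Google publishes, no amount of front-end tweaking will rescue the score; that is the shared-hosting problem described above.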
  2. Saw last night that Norm passed away. Apparently he had been battling cancer for nearly 10 years, but kept it private, even from a lot of his family. I do remember how he used to lay into OJ on Weekend Update and was constantly getting crap for it from a few higher-ups at NBC. It's likely what led to him eventually being fired.
  3. How to Identify Cumulative Layout Shifts

With Core Web Vitals upon us, people are scrambling to optimize their sites. Mostly a waste of time, but it is what it is. Out of the 3 Core Web Vitals, cumulative layout shift (CLS) is the one I have seen people having the most trouble identifying. Obviously, seeing your score is easy in Pagespeed Insights, Lighthouse, or web.dev, but how do you identify what is actually causing the shifts on your pages?

It's pretty simple actually. You are going to want to use Chrome's Developer Tools.

-Open the page you want to check in Chrome.
-Click the dropdown menu on Chrome.
-Go to More tools >> Developer tools

This should open a screen that shows Lighthouse and a bunch of other options along a menu at the top.

-Click on the Performance tab at the top.
-Then click on what looks like a refresh icon.
-Let it load, and you will end up with something like the attached screenshot.

If you have cumulative layout shifts happening, they will appear as red bars under the Experience row. You can zoom in on them a little bit by hovering your mouse over that area and using the mouse scroll wheel. (At least on PCs. I have no idea how to do it on inferior Mac machines.) You can also click and drag the whole thing around if things start moving off the screen as you zoom in. If you hover over the red bars, the website pane on the left will highlight where that shift is happening.

By the way, while you are here, you can also identify your Largest Contentful Paint (LCP) element. In the Timings row, you will see a black box labeled LCP. Hover over it and it will highlight your LCP element. (If you prefer to pull layout shifts from the console instead, there is a small sketch after this post.)
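For anyone who would rather see the shifts in the console than in the Performance panel, here is a rough TypeScript sketch using the same Layout Instability API that DevTools reads from. The interfaces are approximated because this Chrome API is not in the default TS library, and the running total is a simple sum rather than the windowed CLS that Lighthouse reports, so treat it as an indicator only. Run it before interacting with the page, since shifts right after user input do not count toward CLS.

    // Log layout shifts and the elements responsible (Chrome's Layout Instability API).
    interface LayoutShiftAttribution {
      node: Node | null;
      previousRect: DOMRectReadOnly;
      currentRect: DOMRectReadOnly;
    }
    interface LayoutShiftEntry extends PerformanceEntry {
      value: number;            // how much this shift adds to CLS
      hadRecentInput: boolean;  // input-driven shifts are excluded from CLS
      sources: LayoutShiftAttribution[];
    }

    let clsTotal = 0;
    const shiftObserver = new PerformanceObserver((list) => {
      for (const entry of list.getEntries() as LayoutShiftEntry[]) {
        if (entry.hadRecentInput) continue; // ignore shifts caused by user input
        clsTotal += entry.value;
        console.log(`Layout shift of ${entry.value.toFixed(4)} (running total: ${clsTotal.toFixed(4)})`);
        for (const source of entry.sources) {
          console.log("  shifted element:", source.node);
        }
      }
    });
    shiftObserver.observe({ type: "layout-shift", buffered: true });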
  4. Mike's Tuesday Tips: When should you disavow links?

Back in 2012, Google shook up the link building market with two massive actions. First, there was an enormous push to take down popular public networks. Anyone remember Build My Rank or Authority Link Network? Then in April, they unleashed the Penguin algorithm filter and sent many SEOs running around with their hair on fire.

The Penguin algorithm was harsh. Probably too harsh, to be honest. It weaponized link building. It was risky, but you could potentially get competitors penalized by throwing spammy links at them. I say it was risky because Google's algorithm was far from perfect. You could just as easily strengthen their position as harm it.

While the Penguin algorithm did a great job in many cases of punishing sites using low quality links, there were also a lot of innocent sites caught in the mix, or sites whose owners had hired an SEO without understanding that the SEO was using spammy link building. As a result, Google released its Disavow Tool in October of that year.

Fun fact: Did you know that Bing actually released a Disavow Tool before Google? Yep. Bing's came out in September of 2012.

Since its release, people have debated its use. Early on, many of us cautioned against using it. Google generally does not tell you which links they have flagged as bad, except in some cases of manual penalties where they may give you a few examples. Overuse can actually hurt your rankings.

(Some of us also suggested caution because we saw it as Google crowdsourcing to fix a problem they couldn't figure out on their own. Basically they were saying, "Hey, why don't you tell us which links are bad that you have been building? In exchange, here is a get out of jail free card." I think our concerns were valid. A couple of years ago Google announced that they can now identify bad links pretty well on their own and just ignore them. Where do you think the data came from to train their AI and machine learning algorithms to do that?)

Matt Cutts made a great analogy for how to use the tool. I'm paraphrasing, but he said you should use it like a scalpel, not a machete. There are only two cases where you should use the Disavow Tool.

The first case is when you have received a manual penalty from Google related to link building. If this happens, you should try to actually have the offending links removed by those websites and fall back on the Disavow Tool for the ones you cannot get removed.

The second case is when you see a massive drop in rankings AND you have seen low quality links piling up, or there was a recent influx of low quality links. If a page gets hit by the Penguin filter because of bad links, you won't see slight ranking drops. If you see drops of just a few spots, it's not your links. You won't drop from #1 to #3; you will see drops of 50 spots or more, and sometimes you will drop out of the top 100 completely. In these cases, again, the best solution is to try to get the links removed, but when hundreds or thousands of spammy links are involved, that will probably not work. That is when you can use the Disavow Tool.

How do you know which links to disavow? Well, Semrush has a great Toxicity filter you can look at, but do not just disavow all links it identifies as 'toxic'. Use this filter as an indicator for links you should take a look at yourself. Only disavow links you have manually inspected yourself. Do not use 3rd party metrics like DA to identify low quality links. DA has nothing to do with the quality of a link (nor does any other 3rd party metric). If anything, those metrics are trying to give you a gauge of the potential strength of a link. Strength and quality are not the same thing.

How do you recognize low quality or spammy links? Well, it's a lot like what United States Supreme Court Justice Potter Stewart famously said about pornography: "I know it when I see it." Is the content a jumbled mess? Is the link and content at all relevant to what you do? Does the page even load? In short, if a prospect who had never heard of you came across the page and saw your link, would it hurt your brand image? Would you be embarrassed to be mentioned on that page? I consider links like blog comments, forum posts, social bookmarks, document sharing sites, and all insignificant wikis to be spam worth disavowing too.

Lastly, if you do decide to disavow links, remember that your disavow file is kind of a living document. When you upload a file, it replaces the old one; the Disavow Tool does not store the old data. If you decide to disavow additional links, keep adding to the same document and upload that file again. (There is a short example of the file format after this post.)
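For reference, the disavow file itself is just a plain text list, one entry per line. Google documents two kinds of entries, individual URLs and whole domains prefixed with domain:, and lines starting with # are treated as comments. The domains below are made-up placeholders, not a recommendation of what to disavow.

    # Disavow file - keep appending to this same file and re-upload it each time.
    # Lines starting with # are comments. Domains below are placeholders.

    # Individual spammy pages:
    http://spammy-bookmarks.example/some-page.html
    http://junk-directory.example/listing?id=12345

    # Entire domains (every URL on the domain):
    domain:low-quality-network.example
    domain:scraped-content.example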
  5. Those girls aren't going to get through med school without our help.
  6. Dan hasn't posted here in like a year, but mention OnlyFans and out he pops. 😂
  7. Mike's Tuesday Tips: Should you noindex category pages?

I see this question come up a lot in regards to WordPress, but the situation would be similar no matter what CMS you might be using. It depends on how you are using your categories. Most sites I see are using categories as part of their navigation or a sub-navigation. In those cases, you absolutely should not noindex the category pages.

Different people from Google have said slightly different things about this, but there are two messages we have heard pretty consistently over the past few years.

First, if you noindex category pages, Google will likely treat them as soft 404s eventually. Not a huge deal, but it can trigger errors in Search Console. Just be aware of that.

Second, over time, if you noindex category pages, Google will treat the links on those pages as nofollow. This is why I say it depends on how you are using your category pages. If you have links pointing to your category pages (like you would if you use them in any type of navigation menu), you are pushing link equity into those pages, but nothing is coming back out of them. You are bleeding link equity. This can harm the internal link structure of your site.

Simple rule of thumb: Do you have category pages that visitors might land on by following a link somewhere on your site? If the answer is "yes", then do not noindex them. If there is no way for visitors to find your category pages other than through your sitemap or by typing the URL directly into their browser, then it does not really matter whether you noindex them or not. (A quick way to check what a given category page is currently telling crawlers is sketched after this post.)

Objections: I often hear people say that they do not want to index their category pages for one of two reasons:

Reason 1 - The page is low quality and full of nothing but duplicate content.
Solution: Then make your category pages into something useful. Build them out more. Include some static content on the pages, not just post excerpts.

Reason 2 - A category page is outranking the primary page they want to rank for a keyword.
Solution: Do a better job optimizing your target page. It should not be difficult to outrank your own category page. Or push them both up higher and get them both ranking highly.
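If you are not sure what a category page is currently telling crawlers, the two places to look are a robots meta tag in the HTML and an X-Robots-Tag response header. The TypeScript sketch below checks both for a single URL; the URL is a placeholder, it assumes Node 18+ for the built-in fetch, and the regex is only a spot check rather than a full HTML parse.

    // Check whether a page is noindexed via the robots meta tag or X-Robots-Tag header.
    async function checkNoindex(url: string): Promise<void> {
      const res = await fetch(url, { redirect: "follow" });
      const headerDirective = res.headers.get("x-robots-tag") ?? "(none)";

      const html = await res.text();
      // Grab the content value from a <meta name="robots" ...> tag, if present.
      const metaMatch = html.match(/<meta[^>]+name=["']robots["'][^>]*content=["']([^"']+)["']/i);
      const metaDirective = metaMatch ? metaMatch[1] : "(none)";

      console.log(`URL:          ${url}`);
      console.log(`X-Robots-Tag: ${headerDirective}`);
      console.log(`Meta robots:  ${metaDirective}`);

      const combined = `${headerDirective} ${metaDirective}`.toLowerCase();
      console.log(combined.includes("noindex")
        ? "-> This page is asking search engines NOT to index it."
        : "-> No noindex directive found.");
    }

    checkNoindex("https://example.com/category/widgets/").catch(console.error);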
  8. There are two very simple and very logical reasons why Google and other search engines do not factor bounce rate into their ranking algorithms.

The first reason is just a matter of access. Google does not have access to bounce rate data for many of the webpages in existence. For some reason, many people seem to think of Google as this omnipotent power that sees and knows all. That is just not the case. There are over 200 ranking signals in Google's algorithm, all of which have different weightings. When a search query is made, Google is pulling data from its index and comparing all of the websites it has indexed based on those 200 signals. If Google were using bounce rate data, how would the algorithm compare a webpage where it has bounce rate data against a webpage where it has none? Which one is performing "better" for that ranking signal?

Still not convinced? Fine. Let's look at the second reason that Google is not using bounce rate in its ranking algorithm: bounces are not always bad. They are not always a signal that there is something wrong with the page. For some reason, many marketers have this stigma stuck in their head that all bounces are bad. They are not.

Let's say you are running an emergency plumbing service and repair business. Someone in your community has a toilet that has suddenly started to overflow and they cannot fix it. They search for a local plumber in Google and see your page ranking first. They click the search result, which brings them to the home page of your site. They like what they see and pick up the phone to call you (or your office) to see how fast you can help them with their problem. They never visited another page on your site. They will register as a bounce, but they did exactly what you wanted them to do and they found exactly what they needed, right? Your webpage converted them immediately into a phone call and a possible job. That's a good thing. Why should Google see it differently or ding your site for that?

The same thing could be said if I am running an affiliate site. Usually an affiliate site is set up to drive traffic to a landing page and get visitors to click on an affiliate link. If they do not browse around on your site but click on that link, they are going to register as a bounce. Again, there is nothing wrong with that. They did exactly what you were hoping they would do.

We obviously do not have access to their analytics to prove it, but look at a site like Wikipedia. I would venture a guess that their bounce rate is quite high. People generally end up at Wikipedia because they were looking for an answer to a specific query and one of Wikipedia's pages came up. They visit the page and find the answer they were looking for. Some might click on an internal link if they see something that interests them; the vast majority most likely do not and simply leave. Yet Wikipedia ranks for everything.

Does That Mean Bounce Rate Data is Useless?

No. Not at all. Bounce rate data is useful for you, not for search engines. A high bounce rate could be indicative of a problem on a webpage. It really depends on what type of website you are running and what it is you are trying to get visitors to do. If you are running an ecommerce site where a particular page is bringing in a lot of traffic, but visitors are leaving without browsing other products, adding anything to their cart, etc., then there is likely something wrong with that page or with the traffic coming to it.

Even then, believe it or not, it may not be a bad thing. You always want to take a closer look. I've relayed this story before, but I will share it again here. Before you go reworking a whole page or website, it is important to understand where the bounces are coming from. Who is bouncing, how did they find your site, and what pages are they bouncing from?

I was looking at a client's website one time and noticed that the bounce rate across the site was 43%. Most of the pages fit around that number, but there was one page where the bounce rate was 89%. That was unusual. Average time on the site was over 6 minutes, but on this particular page it was under 30 seconds.

I took a closer look at the analytics and found that search traffic was bouncing from that page at a much, much higher rate than traffic from other sources. Generally, if there is something wrong with the page, the bounce rate will be consistent across all sources of traffic. This was not the case.

Through some digging, we found that the page was not only ranking highly for our target keyword, but it was also ranking highly for another keyword that sounded similar but was completely unrelated to the page. In other words, the words in the two phrases were close, but the meanings were very different. I cannot reveal the client's site, but the difference in keyword phrases would be something like "doggy style" versus "styles of dogs". The words are close, but they have two completely different meanings.

The targeted phrase was searched about 500 times per month on average. The untargeted phrase was searched about 12,000 times per month. That's why the percentage of bounces was so high. In this situation, it was nothing to worry about: the bounces were coming from untargeted traffic. This is a perfect example of why you really need to take a close look at what the bounces are actually telling you. (The quick arithmetic after this post shows how lopsided search volume alone can produce a number like that.)
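To see how the math works out, here is a rough back-of-the-envelope calculation. The two search volumes come from the story above; the click-through rate and the per-segment bounce rates are illustrative assumptions, not the client's real numbers.

    // How untargeted search traffic can inflate a single page's bounce rate.
    const targetedSearches = 500;      // monthly searches for the relevant phrase (from the story)
    const untargetedSearches = 12000;  // monthly searches for the similar-sounding, unrelated phrase
    const ctr = 0.05;                  // assume both queries send ~5% of searchers to the page

    const targetedVisits = targetedSearches * ctr;     // 25 visits
    const untargetedVisits = untargetedSearches * ctr; // 600 visits

    const targetedBounceRate = 0.43;   // assume relevant visitors bounce at the site average
    const untargetedBounceRate = 0.95; // assume visitors who wanted something else almost always bounce

    const blended =
      (targetedVisits * targetedBounceRate + untargetedVisits * untargetedBounceRate) /
      (targetedVisits + untargetedVisits);

    console.log(`Blended bounce rate: ${(blended * 100).toFixed(1)}%`); // ~92.9%, same ballpark as the 89% observed

Under those assumptions the untargeted query supplies roughly 96% of the page's search visits, so its bounce behavior dominates the page's number even though nothing is wrong with the page itself.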
  9. I was listening to a podcast not too long ago. I cannot remember which one, but there was a female guest who mentioned she used to sell pictures of her feet. There was one guy in particular paying her like $2000 a month for them. It helped get her through college. I couldn't help thinking about all the other things I could do with $24,000/yr versus looking at pictures of some stranger's feet. I guess when you have enough money to waste $24,000 on feet pictures, you really don't miss the money.
  10. I laughed. Not because of your comment, but because you were paying for porn.
  11. https://www.bloomberg.com/news/articles/2021-08-19/onlyfans-to-block-sexually-explicit-videos-starting-in-october This seems like business suicide to me. I guess they have things other than adult content there, but I have never heard of anyone visiting it or paying money to the platform for anything other than the adult content.
  12. I have a Brother laser printer that is probably 8 or 9 years old now. Still works perfectly. The only issue with it is that sometimes when I scan things, it pulls them in slightly crooked. You would probably not even notice it unless I pointed it out to you, but I know about it, so I see it all the time. There is probably an easy solution to fix it. Just need to clean some spools or something, I'm sure.

Cannot emphasize enough how much better it is to have a laser printer than an inkjet printer. We don't even print that much stuff, but I was buying new inkjet cartridges about every 2 months at like $20-30 apiece. With the laser printer, I have replaced the toner cartridges twice. I think the black one I may have replaced 3 times now. And they are not that expensive. When I got it, I think it was about $450. The printer has more than paid for itself.

I honestly have not had any issues with Windows 10. It's been pretty stable for me as well. One time I had to reinstall it. That was it. I'm sure Windows 11 will be similar. Honestly, 90% of the reason I moved to Mac last month was that I wanted something distinctly different for work. I did not want to even be tempted to use it for anything other than work. It was that, and the fact that thanks to the M1 chips I could get a relatively comparable machine for about the same price as a PC.
  13. Actually, I rarely buy laptops. They are just way harder to work on all day. I've always preferred a desktop with my dual monitors. However, based on a few new clients, it would seem there is some travel in my future, if the South stops trying to send us back into lockdowns here. My luck with laptops has not been good in the past. I've had Dell, Asus, and HP models, everything from mid-range to higher end ($1,500 or so) laptops. They have all left me disappointed.
  14. Sticking with the more grey/black hat theme of last week, let's talk about some footprints that can give away your private network. Many of you have your own private network or have thought about creating one. No single one of these footprints will necessarily bring the Google hammer crashing down on your head, but when you start combining them, they can make identifying network sites really easy for Google.

-WordPress. There is no doubt it is a popular platform, and it is the most common platform people use to build their networks on. You do not have to avoid it completely, but if you combine it with these other possible footprints, you might be drawing unwanted attention.

-Text logo/header. Most people just use the default text-style header in WordPress. They do not take the time to design a graphic header instead.

-All posts on the homepage. Owners of private networks get greedy. They want to squeeze as much link equity out of each site as they can, so they put every one of their posts on the homepage of their sites. As a result, all of their external links are also on the homepage. This is also common when someone is selling links on a network.

-Sample Page and Hello World post still exist. This is specific to WordPress, but I cannot count how many times I have stumbled on a network site where the default Sample Page and Hello World post are still published. A real site that someone cares about is not going to have those (usually). That is just carelessness.

-No About page. Common network sites often do not have an "About" page.

-No Privacy page. See above.

-Every post has at least one external link. There are some legitimate websites that follow this pattern; many news sites have links within just about every story. However, when you combine this with the majority of the footprints in this list, it is just one more ding against you.

-No social activity. Most network sites have no Facebook, Twitter, Instagram, etc. profiles attached to them. Think about the real sites you come across and how few lack these things.

-Wide range of topics. There are some legitimate websites that cover all sorts of topics, but many of the public network sites out there do it in an extremely jumbled way with no real organization or reasoning behind it.

-No internal linking. Outside of a navigation menu and things like "recent posts" widgets, network sites commonly have no internal linking between posts. That's a huge mistake.

-The same external link profile. I have uncovered private network sites that are all linking to the exact same sites. How odd would it be to find 25+ sites that all link to the exact same 3-4 websites? Vary your external link profile. Camouflage it.

-Blocking robots. This one is a little more controversial. I know a lot of people who build private networks like to block spiders from places like Semrush and Ahrefs (there is a robots.txt example after this post). To me, this could be a footprint Google could use to identify network sites. There might be a very valid reason for blocking them in some cases, but you show me 20 sites that all link to the same money site and all block bots from common backlink indexes, and I will bet you $1 you just found 20 sites of someone's network.

-Contact information inside cPanel. Here is one most people are not aware of. When you sign up for a hosting account, by default the email address used for signing up gets plugged into the contact information panel inside cPanel. Change that email to something random. This contact information gets published publicly. If you do a search for an email address, inside quotes, that you have used across multiple hosting accounts, all of your domains can be pulled up that way. I once found 300+ network sites owned by the same person this way.

Again, for the most part, these footprints on their own are not a big deal (other than the last one), but if you take 7 WordPress sites with text logos, no About or Privacy page, no internal links, all linking to the same sites, and all blocking Semrush, Ahrefs, Moz, and Majestic... well, you likely found yourself someone's (not so) private network.
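For context on the "blocking robots" footprint, this is roughly what it looks like in practice: a robots.txt that singles out the big backlink crawlers. The user-agent names below are the commonly published ones for those tools (AhrefsBot, SemrushBot, Majestic's MJ12bot, Moz's DotBot); seeing the same block on 20 sites that all link to one money site is exactly the pattern described above.

    # robots.txt blocking common backlink-index crawlers (a typical network footprint)
    User-agent: AhrefsBot
    Disallow: /

    User-agent: SemrushBot
    Disallow: /

    User-agent: MJ12bot
    Disallow: /

    User-agent: DotBot
    Disallow: /

    # Everything else is still allowed to crawl
    User-agent: *
    Disallow: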