Mike's Tuesday Tips:
This week: best practices for URL structure.
First, make URLs descriptive and include keywords.
Many CMSs, like WordPress, will by default spit out URLs that look like this…
These are no help to search engines or to your visitors.
Make them descriptive
A visitor or search engine should be able to take one look at your URL and have a good idea of what the page is about.
Okay, maybe not that descriptive…
It’s over man. She’s not coming back.
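A descriptive URL is usually just a cleaned-up version of the page title. As a rough sketch of how a slug generator works (a made-up illustration, not any particular CMS's actual implementation):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a descriptive, hyphen-separated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become hyphens
    return slug.strip("-")

print(slugify("Best Practices for URL Structure"))
# best-practices-for-url-structure
```

The keywords from the title end up right in the URL, readable at a glance by both visitors and search engines.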
Second, don’t be afraid of using categories and subcategories. I know there is a belief out there that shorter URLs are somehow better. This belief came from nonsense correlation studies with zero actual evidence.
Search engines have no problem processing longer URLs and you will find long URLs at the top of SERPs regularly.
Categories and subcategories can work like breadcrumbs for search engines, giving them even more clues as to what your page is about.
As you can see in the images, search engines will display them in search results, giving potential visitors even more reason to click on your listing.
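Building that kind of breadcrumb-style path is just joining the category slugs in order. A minimal sketch (the domain, categories, and slug here are invented for illustration):

```python
def build_url(domain: str, categories: list[str], slug: str) -> str:
    """Join category and subcategory slugs into a breadcrumb-style URL path."""
    parts = [domain.rstrip("/")] + categories + [slug]
    return "/".join(parts)

print(build_url("https://example.com", ["guitars", "electric"], "fender-stratocaster"))
# https://example.com/guitars/electric/fender-stratocaster
```

Each path segment tells search engines a little more about where the page sits in your site's hierarchy.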
Lastly, use hyphens to separate words.
Do not use underscores. Search engines treat underscores as joiners, so underscore_separated_words get read as one long word instead of separate words.
Do not leave it up to search engines to determine what your page is supposed to be about.
Is that Alanis Morissette’s hits or Alanis Morissette… something else?
One is a long list of award-winning music.
The other… is not.
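The hyphen/underscore difference is easy to see if you tokenize a slug the way a parser that treats hyphens as separators (but not underscores) would. This is a rough illustration, not Google's actual tokenizer:

```python
import re

def tokens(slug: str) -> list[str]:
    # \w matches letters, digits, AND underscores, so hyphens split words
    # while underscore-joined words survive as a single token.
    return re.findall(r"\w+", slug)

print(tokens("alanis-morissette-hits"))  # ['alanis', 'morissette', 'hits']
print(tokens("alanis_morissette_hits"))  # ['alanis_morissette_hits']
```

The hyphenated slug yields three separate keywords; the underscored one collapses into a single unrecognizable token.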
Missed this bit of news last week.
So if you were actually any good at ranking pages and the stuff you had been preaching for nearly 20 years worked, this probably should not happen, right?
This wasn't a build-it-big-and-cash-out-for-a-big-payday type of situation. I do hope, though, that all of those people who invested in Moz over the years finally saw a return on their investment.
This is going to be a long one.
Let's talk about some best practices for anchor text in your link building. Some of this is probably going to go against what you may have heard from some of the "gurus" out there, but stick with me.
First, I'm going to start with my own personal golden rule of anchor text. I have been following this for 12-15 years. If you get nothing else out of this post but this one thing, you will be fine. Your anchor text should either describe why you are linking to a page or it should describe what the page you are linking to is about. That's it.
It's as simple as that. This goes for both internal and external links. In this way, your anchor text is both SEO-optimized and serves web visitors well.
Never intentionally use naked URLs as anchors. It is one of the absolute dumbest ideas I have ever heard. No, it is not "natural". Not one bit.
I think we can all agree that some of the most "natural" links you will find are links within the content of an article. Nobody links with naked URLs in articles. That was not at all common until SEOs started doing it.
If you were writing an article about how to build a PC, you wouldn't say something like:
Looking at their selection and prices, I would not recommend shopping for PC components at hxxp://bestbuy.com.
You would use something like:
Looking at their selection and prices, I would not recommend shopping for "PC components at Best Buy." (Where the words in quotes are the anchor text)
That is much more "natural" and makes way more sense.
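In markup terms, the difference is just what goes between the anchor tags. A quick sketch (the Best Buy path here is invented for illustration, not a real URL):

```python
def anchor(href: str, text: str) -> str:
    """Build an HTML link with the given anchor text."""
    return f'<a href="{href}">{text}</a>'

# Descriptive anchor: tells readers and search engines what the target is about.
good = anchor("https://www.bestbuy.com/pc-components", "PC components at Best Buy")

# Naked URL anchor: the href repeated as text adds nothing about the page.
bad = anchor("https://www.bestbuy.com", "https://www.bestbuy.com")

print(good)
print(bad)
```

Both are valid HTML; only the first one gives anyone a clue what the destination page is about.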
You do not need to use naked URL links to make your link profile look natural. I have been doing SEO since 2003 or 2004. In all of that time, I can tell you exactly how many times I have intentionally built a link with a naked URL as the anchor text. Zero. Exactly zero. The only time I have done it is when I have been forced to in things like press releases or directory listings that had no other option.
I know many people will look at big brands and see that they have a lot of links with naked URLs as anchor text and believe that is proof that you need them, but what you have to remember about big brands is they do actual PR. A lot of that PR work includes things like press releases which only have an option for naked URL links.
If you really believe that you need links with naked URL anchors and I cannot convince you otherwise, there are plenty of directories out there that only allow those types of links. Save them for those.
Do not waste good link opportunities with naked URLs, and especially never do it with internal links. I did a consulting call yesterday with a business owner and their "SEO" who were looking for some extra help. Many of their internal links were using naked URLs for anchors. One of the dumbest things I have ever seen on a site. When I asked why, the SEO stated it was to make their link profile look natural.
If I could have reached through the Zoom call and throat-punched the SEO, I would have.
Should you worry about anchor text ratios? This might be an unpopular opinion, but I'm going to say no. Yes, I know you have probably heard how important anchor text ratios are, again, in making your link profile look "natural".
I personally have never found any solid evidence of this. This notion all came about right after Penguin was released. I know many in this group were not involved in SEO back then, so a little history lesson then...
When Penguin was first released and people saw websites, mostly spammers', tanking all over the place, there was quite a bit of panic in the community. Nobody knew for sure what it had targeted (although the 750,000+ spam warnings that went out just a few months before in Search Console, Webmaster Tools at that time, gave us a pretty good hint). Within a few days, an article was published on a site declaring that they had figured it out, and that one of the main things Penguin was targeting was sites using high ratios of the same anchor text. If I remember correctly, it was a group called Microsite Masters, but don't quote me on that. And I don't think it was the same owners of that site today.
This article spread like wildfire. People were so panicked and desperate for answers at that time, most never bothered to question it.
What this article failed to account for was common practices of most spammers. Back then there were a lot more people using tools like Xrumer, SENuke, Bookmarking Demon, Sick Submitter, etc. to build massive amounts of low quality links. You could shoot out a few hundred thousand links in a matter of days with these tools if you knew what you were doing and had the right setup.
What spammers would do is if they wanted to rank for "lowest rate mortgages" that is what they would use for the anchor text for pretty much all of their links. If not that, then very close variations. It was not uncommon to see pages ranked with spam that had 75-90% of the same anchors in their link profile.
But was it really the anchor text that got them caught in the Penguin filter or was it the low quality links they were using? There were tons of sites hit by Penguin with much more varied anchors in their link profiles.
I still maintain it was the low quality links Penguin picked up on. The anchor ratios were just a byproduct of how spammers did things back then. I tested it many times using more varied anchor text and Penguin would still catch the sites eventually.
Let's think of it another way. What if you created the world's best online mortgage calculator and it went viral? What anchors do you think people are going to use to link back to that page?
I would bet that 90% or more of the anchors are going to be either "online mortgage calculator" or "best online mortgage calculator". Is Google really going to punish a page for that? It would not make any sense.
Now all of that being said (like I said at the beginning, stick with me), I would recommend varying up your anchors. I have always done this (it goes back to my anchor text golden rule above), but not because of some worry about anchor text ratios getting my pages in trouble with Google.
I do it because I'm never trying to rank a page for just a single keyword. There are usually quite a few variations I am targeting. The other reason I recommend varying up your anchors is that by doing so you are giving search engines more clues as to what the page is about.
But I will never tell anyone that they should shoot for X% of naked URLs, X% of brand name anchors, etc. That just makes zero sense. In fact, doesn't shooting for some artificial percentages actually go against everything about your link profile looking "natural" in the first place?
Again, if you get nothing out of this admittedly long rant, just remember my rule for anchor text.
Your anchor text should either describe why you are linking to a page or it should describe what the page you are linking to is about. That's it. If you just do that, you will be fine.
I get asked a lot about what books or courses I recommend for someone to learn more about SEO. The best thing I can recommend is to investigate the SERPs yourself. Find high-ranking pages and sites and tear them apart to see what they are doing. The answers to most SEO questions are already in the SERPs. I'm all for testing things, but if you don't have the resources to do so, Google is already giving you the answers in the search results.
This was an idea I learned a long time ago doing just that and taking a look at how Wikipedia structures its pages. You can learn a lot about SEO by reverse engineering Wikipedia. Yes, they are a highly authoritative site, and yes, they get a ton of backlinks. However, what really takes them to the next level is what they do on their pages and how they structure their site.
As I mentioned in the thread last week, link placement on a page matters. There are a bunch of factors that determine the strength of a link, but a general rule of thumb is that link strength flows down the page like you are reading a book... left-to-right and top-to-bottom. Links at the top of a page are going to be stronger than links at the bottom of a page.
You can use CSS to take advantage of this, and that is exactly what Wikipedia does.
Wikipedia is unusual in that its main navigation is not all that useful for visitors, nor does it provide much real SEO value.
Take a look at any Wikipedia page. https://en.wikipedia.org/wiki/Airplane!
You see their main navigation along the left-hand side as well as some links at the very top of the page for things like logging in or viewing the revision history of the page.
These links are pretty much useless for SEO.
So what does Wikipedia do?
Take a look at the text-only version of the page in the Google cache. The text-only version is how search engine spiders are really reading the page. http://webcache.googleusercontent.com/search?q=cache:https://en.wikipedia.org/wiki/Airplane!&strip=1&vwsrc=0
You will notice that they use CSS to lay out those links where they want them, but in the code of the page and the text-only version, they appear at the end of the page, nowhere near the top.
In this way, they put a stronger emphasis on the links that matter on their pages, the internal links in the content.
You can use CSS to do the same thing on your sites. Think about all the links on a site that are in the header for the user experience but have no SEO benefit: Contact Us, logins, click- or tap-to-call phone number links, sometimes even ToS or Privacy page links... You can put these at the end of the page in the code, but then use CSS styling to position them wherever you want for the user, getting the best of both worlds.
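To see what "end of the page in the code" means in practice, a crawler-style pass over the raw HTML reports links in source order, regardless of where CSS paints them on screen. A minimal sketch using Python's stdlib parser (the markup below is a made-up example, not Wikipedia's actual code):

```python
from html.parser import HTMLParser

# Content links come first in the source; utility links sit last in the code,
# even though CSS could visually position that nav at the top of the page.
PAGE = """
<div id="content"><a href="/topic-a">Topic A</a> <a href="/topic-b">Topic B</a></div>
<div id="utility-nav"><a href="/login">Log in</a> <a href="/contact">Contact Us</a></div>
"""

class LinkOrder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs.append(dict(attrs).get("href"))

parser = LinkOrder()
parser.feed(PAGE)
print(parser.hrefs)  # ['/topic-a', '/topic-b', '/login', '/contact']
```

The content links are encountered first, which is exactly the emphasis Wikipedia's source-order trick creates.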
There are some big misconceptions out there about how nofollow works, so let’s clear it up.
The first image is a simplified version of how a page passes on authority to other pages through its links.
Each page has a set amount of linkjuice that it can pass on through its links.
When nofollow was first introduced, it blocked that linkjuice from passing through links carrying the nofollow tag, and it would instead be redistributed among the remaining links on the page, making them stronger, as shown here:
Many of us used this for what was known as PageRank sculpting. We could control the flow of linkjuice throughout our sites to boost the pages we really wanted to rank.
Of course, Google didn’t like that, so in 2009 they changed how they handled nofollow.
Here is how it is done now:
Linkjuice is still assigned to links tagged as nofollow. It no longer gets redistributed among the remaining links on the page, but it also never gets credited to the target page, so that share of linkjuice simply evaporates.
This is why it is a bad idea to nofollow internal links. You are actually bleeding out linkjuice by doing so.
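The difference is easy to put in numbers. Suppose a page has 100 units of linkjuice to pass and 10 outbound links, 5 of them nofollow (the figures are made up purely for illustration of the simplified model above):

```python
juice, links, nofollow = 100, 10, 5
followed = links - nofollow

# Pre-2009 behavior: nofollow links were skipped entirely, and their share
# was redistributed among the followed links.
old_per_followed = juice / followed   # 20.0 units per followed link

# Post-2009 behavior: every link still consumes an equal share, but the
# share assigned to the nofollow links is simply lost ("evaporates").
new_per_followed = juice / links      # 10.0 units per followed link
lost = new_per_followed * nofollow    # 50.0 units gone

print(old_per_followed, new_per_followed, lost)
```

Under the old model, nofollowing half your links doubled what the rest passed; under the current model it just throws half the juice away, which is why nofollowing internal links is a bad idea.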
For some reason, people still think Google treats nofollow as illustrated in the first image, but that has not been the case since early 2009.
Then there was the update in March of 2020, where Google again changed how they treat the nofollow tag. Up until then, it was treated as a directive. With the latest update, they instead treat it as a hint or request. They make up their own mind whether to treat a link as nofollow or to ignore the tag. They will never tell you whether they are obeying the nofollow tag on any given link, so we have no idea if a link is really treated as nofollow or not.
They also added additional identifiers they want webmasters to use to identify sponsored links, affiliate links, etc.
Can you still sculpt PageRank?
Some of the min/maxers out there who really want to squeeze out every little bit of value they can have found ways to sculpt PageRank even after Google changed how they handle nofollow.
One common way was to use iframes. Googlebot long skipped over iframes, so you could use them to hide less important links and sculpt PageRank that way. For years, the footer and parts of the header of Bruce Clay’s site used iframes to do this.
Google does seem to read the content inside iframes these days, although I have seen some tests where they did so inconsistently. This method could still work, but it’s just not 100% reliable.