Why White Hat Linking Often Fails

By Paul Rone-Clarke | March 1, 2019

White Hat links? An unusual topic, you might think. But a vital one for anyone who does digital marketing gigs or any work for “bricks and mortar” clients.

The issue under discussion here is the need for business transparency with clients, and now it looks like I’m about to pile into tiered linking. Maybe the rumours are true and I’ve gone completely “White Hat”, perhaps even been employed by the Moz brigade to spread their stodgy, barely effective methods and sell their overpriced analysis tools?

Why white hat SEO often fails

Don’t worry, I haven’t sold out. I’m not here to praise White Hat links.

Really, that isn’t what’s going on. Today is just a short post looking at the traditional (well, traditional since 2009) 3-tier pyramid linking that is still popular today, paying close attention to the 3rd tier.

The Purpose of An Indexing Tier In White Hat Tiered Linking

The Trickle-Down Theory Of Tiered Linking

The “trickle down” theory of tiered linking: adding power to your site, or just a dripping tap that is likely to create problems later? How’s that for a tortured metaphor!

Direct Links

Tier 1. The first tier is pretty clearly defined: direct links to your site. These pass both “link juice” (a term I’ve never liked, but let’s go with it for now) and PageRank, should anyone in internet marketing outside of site flipping still think that’s a metric with any real value. So tier one offers a clear and measurable benefit to your website. Check.

Tier 2 is about two things: firstly, getting tier one found, and secondly, boosting those direct links and increasing their authority and ranking potential. Pretty simple really. Check.

Tier 3, more descriptively called the “indexing tier”, is about spraying tier 2 with trash links to ensure that they are found. While there might also be a small ranking and authority benefit from doing this, it really isn’t a concern; these links are too far removed from your site, and often too poor in quality themselves, to add any real authority. The indexing tier is all about gaining quick visibility, nothing else.
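The pyramid described above can be sketched as a simple data model. This is purely illustrative: the site counts and URLs are hypothetical, chosen only to show the typical shape (each tier roughly an order of magnitude larger than the one above it).

```python
# Illustrative model of a classic 3-tier link pyramid.
# All URLs are hypothetical; the point is the shape, not the sites.

tiers = {
    1: [f"https://guest-post-{i}.example" for i in range(2)],    # direct links to the money site
    2: [f"https://web20-prop-{i}.example" for i in range(20)],   # boost tier 1 and get it found
    3: [f"https://spam-wiki-{i}.example" for i in range(200)],   # "indexing tier": volume, not quality
}

def pyramid_summary(tiers):
    """Return the number of linking properties in each tier."""
    return {tier: len(urls) for tier, urls in sorted(tiers.items())}

print(pyramid_summary(tiers))  # → {1: 2, 2: 20, 3: 200}
```

The lopsided ratio is the whole argument of this post: the third tier is by far the largest and the least curated, which is exactly why it leaves the biggest footprint.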

Just as an aside: my own method of offsite promotion has not used an indexing tier for a long time now.

But Might You Just Be Highlighting Your Mistakes?

Now, there’s a catch here. Maybe it’s just me, and the evidence I’m basing the forthcoming ideas on is slim and not as adequately peer reviewed as I’d like. I’m putting this out there as an untested theory based on a small set of observations, but also as food for thought. Let’s take a look at the shaky axioms I’m basing this on first. Ready?

The search engines have been gathering data on what makes a poor link and what makes a good one for some time. Here’s an urban definition of “white hat links” that sums up the situation.

Google in particular have data from sources such as reconsideration requests (now tens of thousands of them, gathered over several years) with which to build a profile of what a poor link structure looks like. I’m sure that by that description there is no such thing as a truly “White Hat” site; every site is a shade of grey.

Link velocity has long been a concern for webmasters looking to rank sites for the long term.

Google are “ban happy”. They don’t really care whether the links pointing to your site that triggered the ban were self-made or were the result of genuine viral content spreading (a false positive penalty; these happen), and they don’t care if those links were made by a third party engaged in negative SEO. To put it as bluntly as possible: they just don’t care.

Bad links pointing to your site = probably a penalty. After that, they wash their hands of every other aspect.

Here’s the question for White Hat linkers:

Are You Certain Every Second Tier Link You Have Created Is “Safe”?

I’m going to assume that you have cherry-picked your direct links. Obviously there is room for error here, but let’s say you are careful creating your first tier. Do you put that same attention into your second tier, bearing in mind that it probably uses at least 10x the number of linking sites?

Do you know if they are all safe, even if you think they are White Hat? Are you happy that, as an overall profile, they look good?
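As a thought experiment, even a crude audit shows how quickly a second tier develops footprints. The heuristics below (an overused domain, an exact-match money anchor) and the threshold are made up for illustration only; a real link audit would check far more than this.

```python
# Toy audit of a second-tier link list. The risk heuristics and the
# threshold of 3 links per domain are illustrative assumptions.
from collections import Counter
from urllib.parse import urlparse

def audit_links(links, money_keyword):
    """links: list of (url, anchor_text) tuples. Returns flagged entries."""
    domain_counts = Counter(urlparse(url).netloc for url, _ in links)
    flags = []
    for url, anchor in links:
        domain = urlparse(url).netloc
        if domain_counts[domain] > 3:              # too many links from one domain
            flags.append((url, "domain overused"))
        elif anchor.lower() == money_keyword:      # exact-match anchor: classic footprint
            flags.append((url, "exact-match anchor"))
    return flags

links = [("https://blog-a.example/post1", "best widgets"),
         ("https://blog-b.example/post2", "click here")]
print(audit_links(links, "best widgets"))
# → [('https://blog-a.example/post1', 'exact-match anchor')]
```

The point is not the script itself but the discipline: if you would never run even this level of check over hundreds of tier-2 links, you probably don’t know what that profile looks like to a search engine.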

Because, and here’s the thing we really all should have learned by now…

The Search Engines Play “Follow the Link”

While some may disagree with this, and it is yet to be 100% proven, from where I’m sat Google in particular have followed link trails ever since the demise of the “link ring” five years ago. At the very least they understand tiered linking and, where large numbers are concerned, will build a meta profile of your site’s link structure to see whether you’ve been “naughty or good”.

Do You Remember Google Caffeine?

In case you don’t, here’s a reminder:

https://www.searchenginejournal.com/google-algorithm-history/caffeine-update/

A couple of years later, Caffeine was given a running partner:

Hummingbird

https://www.wordstream.com/blog/ws/2014/06/23/google-hummingbird

Caffeine is URL indexing on steroids. It would be wrong to think of these updates as isolated events; they are “process drivers”. In other words, they are part of the ongoing drive to index the good stuff more often and to give a more precise and granular set of results to those searching the internet.

In short, if a web page is any good, it is more likely to be crawled and indexed over time, and more likely to be understood contextually.

The first part of that replaces the need for a third tier.

The second part suggests that a poor, non-contextual third tier might do one of two bad things.

The Double Whammy

It can show that the links are out of context. Come on: who here ensures that every indexing link is niche relevant? Who actually administers their third-tier links properly?

If you don’t, you are probably linking from totally irrelevant sites.

If you do, I’m going to stick my neck out and say you are wasting your time.

What To Do Then

Couldn’t be simpler. Let Google do the work on your second-tier links. They will be found over time (natural variation in link indexing is, in my humble opinion, a good thing).

If Google doesn’t index a particular link, there is a good reason for it. Chances are it doesn’t like the site the link is on, or that site has no authority and is considered not worth indexing.

Either way, you lose very little (probably nothing), and you might even benefit from not having a poor-quality second-tier link continually pointed out to the search engines by hammering it with an indexing tier.

The downside? It will likely take a little longer for links to get indexed and for the full benefit of your offsite SEO to take effect. But I’m guessing that, with the speed and thoroughness of search engine crawlers these days, we are talking about a delay of a few days at most.

What do you think? Is it time to completely ditch the indexing tier?

Originally published by Paul Rone-Clarke on demondemon dot com

4 thoughts on “Why White Hat Linking Often Fails”

  1. Kev Genius Marketer

    Hi Scritty. Thought I’d say hello. Remember I did the proofreading for Blue Hat Demon a few years ago? I dropped you an email via the address on the contact page but you didn’t reply. Can you take a look at it and hit me up on Skype? I’ve got an idea for a JV for you.

    1. scritty Post author

      Hi Kev. Found it. Straight to the spam folder (sorry); your address is whitelisted now. I’ll hit you up on Monday 25th if that’s OK? Might have been your email address, “Genius Marketer”. When did you go and buy a vainglorious hat like that?

  2. Devin Abberlot

    I followed White Hat methods for years, for myself and some clients. If you stick to the exact rules that sites like Moz tell you in a competitive market, you will fail. They do not work. Period. Look at your competitors with thousands of high-value links and 20 years of website age: you are not going to catch those by being clean and good and following the rules. Even if your product is massively better, no-one will ever know, because Google will hide you. Age and links play a massive part in SEO ranking even to this day, and if you think following Moz is going to get you anywhere when your competition is well established, you are fooling yourself and wasting a ton of cash on a Moz service that is little more than a rank tracker and content analyser for your trouble. You can make your own Moz for 20 bucks a month in subs and some common sense. It’s Moz that is the con. Selling snake oil. The lie that you can succeed without going hard and taking a risk. You have to go hard against strong established competition. You have to take a risk, and Moz denies this truth.

    1. scritty Post author

      Bravo, and yes, you are absolutely right. In established niches with strong competition you need either:
      a) A massive marketing budget and several years of patience or
      b) To go hard and ignore much of what Moz say

Comments are closed.