I was at a great event hosted by Linkdex yesterday, and I got talking with Kevin Gibbons from Blueglass about how our research into spammers has shaped how we as an agency think and plan around ranking on Google. Talking to Kevin helped articulate the hypothesis, so I figured it was worth sharing with you.

Spammers Prove Hypotheses about Google

You can’t discount spammers, because they rank. So the question is why they rank and how they hold those rankings. There is a consistent pattern:

  • The donor sites usually carry hidden links, and those links don’t have exact match anchor text; typically the anchor is just the URL of the spam site.
  • The linking sites typically have no content relevance, but they are authority sites, e.g. a school gym club.
  • Country location doesn’t seem to matter; one spammer, for example, uses a lot of Indian university sites.
  • To rank on keywords like ‘online casino’ it takes them about 4-6 weeks and about 150 referring domains.
  • The links are usually on the front page of the donor site.
  • The most successful spammers use a vulnerability in a Joomla plugin, so the site owners never know the link is there.
  • The spam sites are usually small; the most successful ones seem to be just one-page sites with tight relevance, ranking for a spread of maybe up to 50 keywords according to SEMrush and SearchMetrics.

Our conclusions:

  • Relevance of the ‘donor’ site is irrelevant.
  • All that seems to matter is the ‘authority’ of the site. A good proxy for authority is Majestic Trust Flow; anything over 30 is good (see the sketch after this list).
  • Avoid exact match anchor text; your URL or brand is fine.
  • Google don’t seem to track the visibility of links, i.e. hidden links work fine.
  • Links closer to the ‘header’ of the page seem to be better.
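To make these heuristics concrete, here is a minimal sketch of how a candidate donor list might be filtered against them. Everything below is hypothetical: the candidate data is invented, the acceptable() helper is ours for illustration, and in practice the Trust Flow scores would come from Majestic or a similar tool.

```python
# Minimal sketch: filter candidate donor sites against the heuristics above.
# Candidate list and helper are hypothetical; Trust Flow scores would come
# from Majestic (or similar), and 30 is the rough cut-off noted above.
candidates = [
    {"domain": "school-gym-club.example", "trust_flow": 42, "anchor": "https://example-casino.com"},
    {"domain": "tiny-blog.example",       "trust_flow": 12, "anchor": "https://example-casino.com"},
    {"domain": "uni-dept.example",        "trust_flow": 55, "anchor": "best online casino"},
]

def acceptable(site, min_trust_flow=30):
    # Authority matters (Trust Flow over ~30); topical relevance does not.
    if site["trust_flow"] < min_trust_flow:
        return False
    # Avoid exact match anchors; a bare URL or the brand name is fine.
    if not site["anchor"].startswith("http") and site["anchor"] != "Example Casino":
        return False
    return True

donors = [s for s in candidates if acceptable(s)]
print([s["domain"] for s in donors])   # ['school-gym-club.example']
```

Note what the filter deliberately ignores: the donor’s topic, country and on-page context never appear in it.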

All of this seems very simple. And it is.

At scale Google is fantastically complex, but on a granular level, when it comes to links, it all seems…too simple (head scratch). But then if you look at complexity theory, the individual parts of an ecosystem are often simple; it just gets super complex when you zoom out.

Look at a flock of birds up close. One bird wants to avoid hitting another bird, but it also wants to fly in the flock; add randomness and wind and you get extraordinary complexity. Conceptually the linking ecosystem is similar.
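This is essentially Craig Reynolds’ ‘boids’ model: each bird follows a few trivial local rules, yet the flock’s behaviour is unpredictably rich. A minimal sketch, with purely illustrative rules and parameters:

```python
# Toy boids: each bird steers by three simple local rules plus noise.
# Parameters are illustrative only; the point is the emergent complexity.
import numpy as np

N = 50
pos = np.random.rand(N, 2) * 100   # random starting positions
vel = np.random.randn(N, 2)        # random starting velocities

def step(pos, vel, dt=0.1):
    new_vel = vel.copy()
    for i in range(N):
        diff = pos - pos[i]                     # vectors to every other bird
        dist = np.linalg.norm(diff, axis=1)
        near = (dist > 0) & (dist < 10)         # local neighbourhood
        if near.any():
            cohesion  = diff[near].mean(axis=0)            # steer towards neighbours
            alignment = vel[near].mean(axis=0) - vel[i]    # match neighbours' heading
            close = (dist > 0) & (dist < 3)
            separation = -diff[close].sum(axis=0) if close.any() else 0  # avoid collisions
            new_vel[i] += 0.01 * cohesion + 0.05 * alignment + 0.1 * separation
        new_vel[i] += 0.05 * np.random.randn(2)  # randomness ("wind")
    return pos + new_vel * dt, new_vel

for _ in range(100):
    pos, vel = step(pos, vel)
```

Each rule on its own is trivial; the complexity is entirely emergent, which is the point of the analogy.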

Anyhow, moving on from flying birds… As an aside, we think the base of links that provide useful ‘votes’ is slowly shrinking. There are so many instances of sites that have spammed in the past gradually losing rankings over time, and this can be put down to a gradual reduction of the PageRank flowing from those spam donor sites. (To be clear, we’re not talking about sites hit with Penguin for exact match anchor text.)
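To make that mechanism concrete, here is a toy PageRank calculation on an invented five-page graph. When a spam donor’s ‘vote’ stops being counted (modelled here by deleting its outbound link), the target’s score falls even though nothing on the target itself changed. The graph, function and numbers are purely illustrative, not how Google actually computes things.

```python
# Toy PageRank by power iteration, showing how removing a spam donor's
# "vote" lowers a target's score. Graph and numbers are invented.
import numpy as np

def pagerank(links, n, d=0.85, iters=100):
    """links: dict {page: [pages it links to]}; dangling pages spread evenly."""
    pr = np.full(n, 1.0 / n)
    for _ in range(iters):
        new = np.full(n, (1 - d) / n)
        for page in range(n):
            outs = links.get(page, [])
            if outs:
                for t in outs:
                    new[t] += d * pr[page] / len(outs)
            else:
                new += d * pr[page] / n   # dangling page: distribute evenly
        pr = new
    return pr

# Pages 0-2 are normal sites, 3 is a spam donor, 4 is the spam target.
before = {0: [1, 2], 1: [0], 2: [0, 3], 3: [4]}
after  = {0: [1, 2], 1: [0], 2: [0, 3]}   # donor's link no longer counted

print("target score with donor link:   ", round(pagerank(before, 5)[4], 3))
print("target score without donor link:", round(pagerank(after, 5)[4], 3))
```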

So we think Google looks at links in a very simple way. But that doesn’t make things simple.

What Complicates Rankings is… Humans

Websites with sufficient authority have human curatorship, i.e. someone is responsible for allowing content onto the site. If there is a high authority site on nappies, the owner is not going to accept a link to an online casino site. Therefore you have a naturally self-organising ecosystem around relevance: site custodians on ‘authority’ sites will only accept content relevant to them.

You then get into this whole thing of great content being a calling card to get placements on great websites, leading to great rankings. Of course the spammers shortcut all of this by hacking into sites.

By discounting the ‘voting capability’ of sites without ‘human curatorship’, Google can let the human ecosystem do its job and serve a less gamed set of search results.

Once you break apart ‘the algo’ and the human stuff, it all becomes far clearer.

How We Uncomplicate Things

The consequence for our clients is that we can look at the data and prescribe a plan of action, i.e.

  • You need this number of links
  • Of that metric
  • Over this period of time
  • To these pages

Then set up an onsite plan, i.e.

  • This text
  • In this way
  • On this site structure

and then get really creative working out how to get the best placements on the best sites possible.
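For illustration only, here is a hypothetical sketch of how such a prescription might be written down as plain data. The field names and figures are invented; this is not our actual tooling.

```python
# Hypothetical sketch of a link + onsite prescription as plain data.
# Field names and figures are invented for illustration.
from dataclasses import dataclass

@dataclass
class LinkPlan:
    target_page: str
    links_needed: int       # "this number of links"
    min_trust_flow: int     # "of that metric"
    weeks: int              # "over this period of time"

@dataclass
class OnsitePlan:
    page: str
    copy_brief: str         # "this text, in this way"
    parent_section: str     # "on this site structure"

link_plan = [
    LinkPlan(target_page="/live-roulette", links_needed=60, min_trust_flow=30, weeks=12),
    LinkPlan(target_page="/blackjack",     links_needed=40, min_trust_flow=30, weeks=8),
]
onsite = OnsitePlan(page="/live-roulette",
                    copy_brief="800 words on live roulette, written for players",
                    parent_section="/casino-games/")
```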

Since relevance isn’t an issue, it’s about thinking laterally. Take ‘online casino’: at heart it’s about odds and probability. Who likes this kind of thing but isn’t in gambling? A university? Maybe someone analysing probability. A PhD student doing a thesis on probability. Maybe we sponsor the student to blog about the probability variances between online and offline casinos, or perhaps do a thread around ‘how long does £20 last in an offline vs. online casino?’, or maybe just do a story about an underwater casino (ref: http://news.bbc.co.uk/1/hi/business/4559531.stm). It really doesn’t matter what the content is about, as long as it’s ‘socially acceptable’.

The main point is that site relevance is irrelevant, anchor text is irrelevant, and content on the donor page is irrelevant. The only things that count are whether the site owner accepts your link and whether Google accepts that you’re not spamming them. Herein lies the most challenging part of hardcore SEO.

Nick Garner

Nick is founder of 90 Digital. Previously he was head of search at Unibet and, prior to that, search manager at Betfair.