(Updated 24 August 2015) 

Preface

I’ve been talking about engagement for quite a while, and I’ve never actually put all of my thoughts together in one place. Our clients who pay attention to this stuff are prospering from it, whilst the ones that don’t just keep shovelling money into links and never seem to get very far.

The odd thing about engagement as a ranking factor is that it’s so intuitive and makes sense on so many levels. By rewarding websites that are engaging, it improves the whole Internet and improves the user experience for Google. At last, we’re seeing the end of profligate link acquisition and the rise of great websites being rewarded with rankings.

With Google, there are hundreds of ranking factors, so it’s difficult to talk about all of them in this context. However, broadly speaking, there are three buckets:

  • On-site optimisation – making a site easily understood by Google
  • PageRank – the patented link-based algorithm that Google use
  • Trust – the aggregated confidence Google has in your website

Engagement is part of the trust ‘bucket’. By aggregating engagement feedback to Google over time, a domain accrues trust and therefore is more likely to rank because Google has confidence in its ability to satisfy users.

Pillars of this hypothesis

  • the research done by Dan Petrovic on engagement and rankings
  • domains that acquire very few links, have virtually no on-site SEO and are ranking on money phrases
  • examples of successful click-through rate manipulation from 2014
  • domains that are aggressively acquiring links but not ranking
  • data from Searchmetrics’ Ranking Factors 2015 report, showing the correlation between click-through rate and search position
  • domains that acquire very few links and are ranking
  • four different 90 Digital clients that have followed exactly the same sequence of events and subsequently ranked on competitive phrases:
    • fresh trusted domain
    • very strong / moderate quantities of links
    • tight focus on target phrases
    • engagement optimisation (making the site more clickable and decreasing bounce rate)
    • rankings on competitive phrases within a relatively short period of time.

Spanner in the works

On August 12, 2015, Bartosz Góralewicz wrote an extremely well-researched post on CTR manipulation failing for him. (If you care about the engagement argument, it’s worth reading the post in full.)

He meticulously set up his test bed to create very ‘real’ fake traffic, which subsequently got picked up by Google Analytics, Trends, Keyword Planner and Search Console. Despite all of his work, once he released the click bot onto its targeted search results, there was no effect on rankings over a two-month period.

This is a very different result from a test he did in 2014, where he managed to manipulate search results very easily.

Upshot from him: CTR manipulation doesn’t work any longer.

“More and more data seem to support this experiment’s conclusion that click-through rate does not impact search engine rankings. In fact, at SMX Advanced, Gary Illyes from Google confirmed that ‘Google uses clicks made in the search results in two different ways — for evaluation and for experimentation — but not for ranking.’”

So what does this prove?

To me, it says Google have made it very, very difficult for spammers to manipulate rankings through artificial click-through rates. But it doesn’t say to me that click-through rate as a ranking signal is dead. Why? Because in June 2015 Rand Fishkin ran a crowd-sourced experiment which moved the search results as a direct consequence of click-through rate.

Here:


So before reading on: yes, CTR manipulation is not going to work. And yes, it’s still a meaningful feedback signal, albeit one that is very carefully managed by Google.

Engagement as a ranking mechanism. 

Hypothesis:

Natural search rankings are affected by click through rates and subsequent engagement on site.

Google expect to see a certain click-through rate per position for a search result. If the click-through rate is higher than expected, the page in that search position will go up a ranking. Conversely, if it gets a lower-than-expected click-through rate and subsequent engagement, it will go down a ranking.

This chart explains the cycle, showing search results for ‘best casino’ from June 10th to date; what you will notice is how much flux there is in rankings outside of the top three.
image07

 

The formula seems to be:

Click-through rate for a ranking position, measured against the expected click-through rate for that position.

i.e. 6th position = expected CTR of 3%. If a page gets 5% of overall CTR, Google raises it up a ranking position. If the CTR stabilises, the position will hold.
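To make the formula concrete, here is a minimal Python sketch of the hypothesised adjustment. The expected-CTR-per-position table is purely illustrative — only the 3% figure for position 6 comes from the text above, and the one-position-at-a-time movement is my own simplifying assumption, not Google’s actual mechanism:

```python
# Hypothetical sketch of the CTR-vs-expectation adjustment described above.
# The table of expected CTRs per position is illustrative, not real data.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.03, 7: 0.025, 8: 0.02, 9: 0.015, 10: 0.01}

def adjust_position(position: int, observed_ctr: float) -> int:
    """Move a result up or down one slot depending on whether its
    observed CTR beats or misses the expectation for its position."""
    expected = EXPECTED_CTR.get(position)
    if expected is None:
        return position
    if observed_ctr > expected:
        return max(1, position - 1)   # better than expected: move up
    if observed_ctr < expected:
        return min(10, position + 1)  # worse than expected: move down
    return position                   # CTR matches expectation: hold

# The example from the text: position 6 expects ~3% CTR,
# a page getting 5% is promoted one slot.
print(adjust_position(6, 0.05))  # 5
```

If the page keeps beating the expectation at each new position, this loop would walk it up the results until its CTR stabilises at the expected level for its slot.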
To help visualise this, Google give us an interface within Search Console showing click-through rates per search position for particular key phrases.
image08
In order to get engagement data, Google has to audition websites in search results where there is adequate traffic for them to make a judgement on whether a page is engaging or not.
This is where links and trust in a domain play their part.

Links.

If a domain has no history, Google can neither trust nor distrust the website; therefore, if it acquires good links (high PageRank from trusted domains), Google will rank that website on relevant key phrases.

Once the site has been ‘auditioned’, i.e. ranked, and has passed the ‘engagement test’, it may well rise in rankings even without links.
A domain accumulates trust or distrust over time. This is why fresh domains are so much better than older domains which have a chequered history and are therefore distrusted.

If the domain is distrusted, it has to acquire far more page rank to be re-included in search results and effectively re-auditioned.

Where there are competitive key phrases and little brand differentiation, the differentiator appears to be volume and quality of links. In other words, Google appears to have ranking plans ‘a’, ‘b’, ‘c’. So if it can’t rank on relevance and engagement differentials, it will backstop to links.

Background.

Over the last year, I’ve seen too many sites which are climbing rankings on what appear to be few or no good links.
Here are three examples where search results have been manipulated by affecting click through rates to particular pages.
Two of the experiments involved Rand Fishkin asking followers to seek out a particular page on a given search result.

This is Rand talking about engagement as a core factor in his whiteboard Friday session:

Another very good roundup of engagement affecting rankings:

and Bartosz Goralewicz doing some negative click through rate manipulation with a bot that he had built:

And there was this very interesting experiment in 2014 by Cesarino Morellato and Andrea Scarpetta.

Other supporting information

Every year a German company called Searchmetrics publish an extensive report analysing their dataset. They have one of the biggest SEO-related datasets of any non-search-engine business.
Searchmetrics base their insights on correlations within their datasets. Therefore, what they say carries a lot of weight.

Highlights are:

  • Click-through rate and search result position – the highest correlation they’ve ever seen on any metric they have analysed
  • Links still correlate highly with rankings, but good links are now far more important than ever before.
  • Google is getting far more clever at understanding content.

You can download the full report here

Their comment on CTR:

“If the correlation for “click-through rate“ was part of the overall bar chart, it would be right at the top. This is the highest correlation that has ever been calculated for a factor of ranking studies. The basis for this analysis is the average CTR and we had to process the data differently than we do in the Searchmetrics Suite.

For the calculation of SEO Visibility and other metrics in the Searchmetrics Suite a dynamic CTR curve is used, which is quite complex. For the present study, the average of all CTRs was analyzed, even if the CTR per keyword and search intention were quite different.”

In detail, the curve looks like this: (2015 report)

2015-08-14_185115

Searchmetrics also looked at another important engagement factor, which is time on site. (2015 report)

2015-08-14_185347

I see time on site as a fairly important factor because it represents engagement. From Google’s point of view it’s relatively easy to track because they can see from their search results data which pages people have spent most time on.

In my opinion, the question is more about how fast they bounce in and out of the page. If they jump into a page and come out within three seconds, that’s a negative signal. But if they dwell on a page for more than 20 seconds, it’s an indication that a user has read a page and moved on. In other words they are satisfied with the content.
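The dwell-time reasoning above can be sketched as a toy heuristic. The 3-second and 20-second thresholds come from the text; treating the range in between as neutral, and the labels themselves, are my own illustrative assumptions:

```python
# Toy sketch of the bounce/dwell heuristic described in the text.
# Thresholds: under 3 seconds reads as a bounce (negative signal),
# over 20 seconds reads as satisfied reading (positive signal).
def dwell_signal(seconds_on_page: float) -> str:
    if seconds_on_page < 3:
        return "negative"   # jumped in and straight back out
    if seconds_on_page > 20:
        return "positive"   # read the page and moved on, satisfied
    return "neutral"        # ambiguous dwell time (assumption)
```

In practice such a signal would presumably be aggregated across many visits rather than judged per user, but the thresholds capture the bounce-in/bounce-out distinction made above.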

Links

Searchmetrics say:
The lower correlation for the number of backlinks is immediately noticeable. This had fallen from 2012 – 2013, however, link quantity still has a high correlation with good ranking. An examination of how the other correlations have developed will form part of the following sub-chapters. (2014 report)
image00

This correlates with my own findings, where I’m seeing a massive disconnection between the number of links and rankings.

Searchmetrics say:

The source of any given link is highly important and relies on the following factors shown in the graph below: (2014 report)
image03

Again, this correlates with my experience: where we have focused on acquiring links with very good page-level metrics, rankings have subsequently improved dramatically. By the same measure, in the past where I’ve acquired links that just weren’t particularly strong, rankings wouldn’t move.

Just to qualify what I mean by a good link:

  • The page on which the link sits should have Trust Flow; this is a Majestic metric that is commonly used in SEO and correlates fairly well with rankings. It’s also a good proxy for Google PageRank.
  • Since link acquisition is mostly about PageRank aggregation, it’s important the domain has at least 10 referring domains which are linking with follow links.
  • If a link comes from a massive domain like the Huffington Post, that isn’t necessarily a good thing, because the Trust Flow on a given page may be tiny. That said, I would always value a link from a strong domain more highly than one from a weak domain, even if the Trust Flow of the page on which the link sits is relatively small, i.e. TF 2.
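These criteria can be sketched as a simple checklist function. The field names and the minimum Trust Flow threshold are my own illustrative assumptions (only the 10-referring-domains figure comes from the text), and it deliberately ignores the strong-domain exception discussed in the last bullet:

```python
# Rough sketch of the "good link" checklist above. Thresholds and
# field names are illustrative assumptions; Trust Flow is Majestic's metric.
from dataclasses import dataclass

@dataclass
class Link:
    page_trust_flow: int    # Majestic Trust Flow of the linking page
    referring_domains: int  # follow-linking domains pointing at the linking domain
    is_follow: bool         # nofollow links pass no PageRank

def is_good_link(link: Link, min_tf: int = 5, min_ref_domains: int = 10) -> bool:
    """Apply the three criteria: a follow link, from a page with some
    Trust Flow, on a domain with enough follow referring domains."""
    return (link.is_follow
            and link.page_trust_flow >= min_tf
            and link.referring_domains >= min_ref_domains)
```

A fuller version would also weigh domain-level strength, per the Huffington Post caveat; this is just the baseline filter.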

Average ranking over number of backlinks (2014 report)

image02


My comment on the number of backlinks: this is just a headline statistic; one has to dig deeper into the data and look at the ratio of follow to nofollow links. One also has to account for links that have been disavowed. If you don’t have access to a domain’s disavow file, it’s impossible to say exactly what link equity it has.

Therefore, if I’m doing link analysis, I’ll look at all backlinks and filter out nofollow links and links that were acquired more than six months ago, to get some idea of how active a domain is on link acquisition.
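That filtering step can be sketched in a few lines of Python. The record fields are assumptions rather than any particular tool’s export format, and six months is approximated as 183 days:

```python
# Sketch of the backlink filter described above: keep only follow links
# first seen within the last six months, to gauge how active a domain's
# link acquisition is. Field names are illustrative assumptions.
from datetime import date, timedelta

def recent_follow_links(backlinks, today, window_days=183):
    """backlinks: iterable of dicts with 'first_seen' (date) and 'follow' (bool)."""
    cutoff = today - timedelta(days=window_days)
    return [b for b in backlinks
            if b["follow"] and b["first_seen"] >= cutoff]

links = [
    {"url": "a.example", "follow": True,  "first_seen": date(2015, 7, 1)},
    {"url": "b.example", "follow": False, "first_seen": date(2015, 7, 1)},  # nofollow: drop
    {"url": "c.example", "follow": True,  "first_seen": date(2014, 1, 1)},  # too old: drop
]
print(len(recent_follow_links(links, today=date(2015, 8, 24))))  # 1
```

Cross-referencing the survivors against a disavow file, where available, would complete the picture of real link equity.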

Share of backlinks from the same country  (2014 report)

image04


Based on my experience with black-hat SEOers, there seems to be very little correlation between the location of links and rankings per country. Spammers often use a link service called Sape. Sites which use Sape links are often very indiscriminate when it comes to geolocation, and yet they rank.

Also, to support this idea of link location being irrelevant, I did a big piece of research, scraping 53 English-speaking territories on Google, looking at 60 different phrases, going 20 search results deep.

An overwhelming pattern emerged: if a domain ranks well in a competitive territory like the UK, it will rank equally well in every other country as long as the language is the same, i.e. English.

If a phrase is very competitive, there will be greater variation per country, but largely I’m seeing the location of links not affecting rankings.

Link quality  (2014 report)
image05

Average Link Age  (2014 report)
image06
Searchmetrics say :

Examining the average link age results in a continuously decreasing curve with the y-axis describing the age of the links in days. Better ranked URLs have, on average, older links.

 This is an indication that better-ranked pages in SERPs have been in existence longer. This correlation is related to the correlation of the number of backlinks. Those sites with more links also have more old links.

I also believe this. I have been looking very closely at expired domains and how certain ones seem to have far more ranking power than others. One of the common factors is the average age of the live links to that expired domain.

Pages which have virtually no ‘SEO’ and yet rank extremely well…
https://bitcasino.io/ ranks all the way across the top bitcoin casino phrases. The only accessible content is the title tag. The domain has never aggressively acquired links.
https://www.matchbook.com/ ranks across most of the core exchange betting key phrases, and yet the only accessible content there is the title tag; again, this domain has never aggressively acquired links.

https://casino.betfair.com/ ranks on some of the top online casino phrases in the UK, yet it doesn’t have any content on the homepage to speak of except for its title tag. They have been acquiring links, but not in the usual out-and-out way that you see so often in casino.

In e-commerce, I’ve seen single pages rank for up to 320 phrases. Yet it’s just a plain old e-commerce page with a grid of items for sale.

This page ranks for around 190 different phrases in the US according to SEMrush: http://www.rarepink.com/diamond-engagement-rings

And this page ranks on 358 phrases in the US according to SEMrush : http://www.gemvara.com/b/mens-wedding-bands/

Another site, http://bigfreechiplist.com ranks across a large number of competitive affiliate centric phrases in online casino and yet the page is just a flow of new bonuses… It has very few follow links of any quality and yet ranks.

Having studied all these domains very carefully, a couple of things have come through very clearly:

  • Pages with the most links don’t necessarily rank on the most important key phrases. This tells me the effect of links has changed from being a direct ranking signal to an indirect one.
  • Users have voted with their engagement, and Google has decided to trust these pages and rank them for whatever they can, despite almost no on-page or off-page SEO. (Betfair is the only domain where there appears to be a very structured link acquisition process.)

If engagement is a deciding factor in rankings, then what I’m seeing are pages which fulfil their purpose perfectly in relation to the keywords they rank for. All of these sites are great examples of this idea.

Google quality rater guidelines

Another supporting point for this whole idea around engagement is how Google have been training their algorithm to identify domains that are trustworthy. The Google Quality Rater Guidelines 2014 contain about 70 pages of instructions on how quality raters should score pages and websites.

Incidentally, even this is a great example of engagement being a deciding factor in rankings. Number two on Google.co.uk for ‘Google quality rater guidelines 2014’ is the direct link to the PDF document itself. PDFs are not known for being SEO-friendly… yet why does it rank? Can I suggest it’s because users seek this out, and Google has recognised its importance.

And for an ‘experiential’ validation of engagement and rankings,

Linkdex have launched a fun little game where you choose between one site and another to guess which ranks the highest. I’ve played the game several times and on average I’m getting around an 80% success rate; in other words, 80% of the time I’m guessing the page which gets the highest rankings. What’s very interesting is how the better page seems to get the rankings.

It is just another validation, in my view, that Google have finally got to a point where they understand the search results people actually want.

It’s fun to play, so do have a go…Linkdex SEO instinct

2015-08-26_102949

 

Wrapping up…

Even though this Searchmetrics report is from 2014, broadly the correlations fit very closely to my experience.
In my opinion…

  • indiscriminate bulk link acquisition doesn’t work anymore (Penguin)
  • links have to be carefully acquired from trusted sources
  • a site has to be engaging, otherwise it will tank in rankings
  • once ‘engagement’ kicks in, links become relatively unimportant

Finally, I’m seeing Google become a search engine which rewards the quality and usefulness of content above the volume of links. Of course, links are important but today if you ignore user experience and the usefulness of your website, you will fail.

The irony is when you do everything in the right sequence, ranking is so much cheaper than it has been for years, simply because you will need fewer links when you get engagement right.
