There’s been a lot of discussion about CTR as a ranking factor, with Searchmetrics’ 2014 ranking correlations and factors study listing CTR as a top factor.
Many SEOs have believed this for some time, and although some disagree, a number of studies suggest it is the case, such as Rand’s successful crowdsourcing case study.
When this happens, you can bet there will be SEOs who try to game the system using click bots, micro workers, crowdsourcing, or more creative strategies (years ago I even received a free chocolate bar).
The problem with automation
Bartosz Góralewicz recently published a case study disproving the effect of CTR through the use of click bots. While Bartosz’s experiment was thorough, in that he excluded all other factors that could affect rankings and even managed to successfully fake a Google Trends spike in search volume, I don’t agree with the test’s conclusion.
He came to the conclusion that CTR is not a ranking factor because the rankings stayed the same and, after two months, started to drop. What his test showed me instead is that Google is very good at spotting fake CTR.
Increasing the search volume of a keyword by almost 4,000% in a single month is not natural, and I think Google is able to spot this. If search volume had increased gradually over time, the results might have been different.
Philip Petrescu did an in-depth study of users’ CTR behaviour in organic search. I would assume that Google have similar but more comprehensive statistics, and it would be very suspect if, in one month, a single page ranking in second position started getting a massive number of clicks while all other results remained the same.
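We can only speculate about how Google might flag such a pattern, but the intuition is easy to sketch. The Python below flags a result whose observed CTR deviates wildly from a baseline CTR-by-position curve; the baseline figures are illustrative placeholders (not Petrescu’s actual numbers) and the rule is deliberately naive:

```python
# Toy anomaly check: flag organic results whose observed CTR is far above
# a baseline CTR-by-position curve. The baseline numbers below are
# illustrative placeholders, not figures from any real study.

BASELINE_CTR = {1: 0.31, 2: 0.14, 3: 0.10, 4: 0.07, 5: 0.05}

def is_suspicious(position, impressions, clicks, tolerance=3.0):
    """Return True if the observed CTR exceeds `tolerance` times the
    baseline CTR expected for that ranking position."""
    expected = BASELINE_CTR.get(position)
    if expected is None or impressions == 0:
        return False
    observed = clicks / impressions
    return observed > expected * tolerance

# A page in position 2 suddenly receiving clicks on 60% of its
# impressions stands out against a ~14% baseline:
print(is_suspicious(2, impressions=1000, clicks=600))  # True
print(is_suspicious(2, impressions=1000, clicks=150))  # False
```

A real system would of course work from far richer signals (click timing, user history, the behaviour of the other results on the page), which is presumably why the bot-driven spike in Bartosz’s test was caught.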
To me, this study proved that you simply cannot game Google: if you want to improve your CTR, you need to do it the natural way.
Star rating vulnerability
While I’m confident Google’s algorithm is advanced enough to spot manipulation, there are still limitations.
Razvan Gavrilas last year outlined the many ways you can gain CTR with fake rich snippet data. Many of the examples there directly defied Google’s guidelines for using rich snippets and were consequently penalised.
Over the past year the number of SERP results with fake markup seems to have decreased (especially since Google removed author images). However, Google is still not perfect at spotting fake reviews.
The problem is that the markup has to be added manually, as outlined in Google’s guidelines, but as long as you don’t outright disobey their policies you can game the reviews, because Google don’t seem to validate whether the information you enter is correct.
What you need to stick to is:
- Only add markup to pages that show the reviews or the total aggregate rating
- Don’t include content that isn’t human readable
If you’re a service provider you can use ratings from your own site, Trustpilot, Feefo, or any other review site. An example of the markup you can add as a service provider is:
```html
<div class="promo trustpilot">
  <div class="review container" itemscope itemtype="http://schema.org/LocalBusiness">
    <a href="http://www.trustpilot.com/review/www.site.com" target="_blank" rel="nofollow">
      <div class="sbox" itemref="support-box-companyinfo">
        <span itemprop="name">Company</span> client reviews
      </div>
      <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating"
           class="rating" title="Company's TrustScore is based on 195 reviews.">
        <span class="average" itemprop="ratingValue">9</span> out of
        <span itemprop="bestRating">10</span>
      </div>
      <div class="stats">
        Based on <meta itemprop="reviewCount" content="300" />
        <span class="reviewCount">300</span> Trustpilot reviews
      </div>
    </a>
  </div>
</div>
```
If you’re using reviews from a third party, you would add a link to the review page, as I did above for Trustpilot. Like I said, Google doesn’t seem to validate this, even with that link.
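For illustration, here’s a sketch in Python of the kind of check Google could run but apparently doesn’t: extract the `itemprop="ratingValue"` from the business’s own markup and compare it against the score reported on the linked review page. The regex-based extraction and the HTML snippets are simplified assumptions, not a real validator:

```python
import re

def extract_rating(html):
    """Pull the first itemprop="ratingValue" out of a microdata snippet.
    Handles both <span ...>9</span> and <meta ... content="9"> forms.
    (Naive regex extraction -- a real validator would parse the DOM.)"""
    meta = re.search(r'itemprop="ratingValue"[^>]*content="([\d.]+)"', html)
    if meta:
        return float(meta.group(1))
    span = re.search(r'itemprop="ratingValue"[^>]*>([\d.]+)<', html)
    return float(span.group(1)) if span else None

# Markup embedded on the business's own site claims a 9...
site_html = '<span class="average" itemprop="ratingValue">9</span>'
# ...while the review site it links to actually reports 7.6:
review_site_html = '<meta itemprop="ratingValue" content="7.6">'

claimed = extract_rating(site_html)
actual = extract_rating(review_site_html)
print(claimed == actual)  # False: the embedded markup overstates the rating
```

Since the markup links directly to the review page, a comparison like this would be cheap to automate, which is what makes the lack of validation surprising.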
I’ve seen this manipulated in different ways.
Not entering the correct information (or not updating it when reviews take a turn for the worse)
Slater Gordon boast a great 9/10 rating from 255 results:
Here is the markup they’re using:
As you can see they reference Trustpilot.
When checking the actual Trustpilot reviews, you see they have a 7.6 rating:
You can either manually enter a good rating value in the markup, or, if you start with a genuinely good rating that later drops, simply never update the markup to the new, worse score.
Manage your own reviews
Aviva have a great 4.6 review rating based on over 3000 results:
And in fact they really do:
Reviews hosted on a third-party site are more trustworthy than those hosted on a business’ own site. With self-hosted reviews, you have to wonder which customers are asked to leave a review, at what point, and whether the reviews are moderated. Third-party sites don’t let the business moderate its reviews, and there are no restrictions on who leaves a review or when.
This is what Trustpilot says about Aviva (albeit only 61 people):
Pick the 3rd party site with the best reviews
Go Compare has great reviews from just a few votes:
At the bottom of the page they tell you where they’re getting this information from:
Which adds up:
But they don’t want to show you this:
Don’t base it on user reviews
RAC have a perfect score:
This is according to Defaqto:
Defaqto are an independent researcher of financial products, not RAC’s customers:
> At our heart is the UK’s largest retail financial product and fund database – we maintain it by collecting data from across the whole market, and using our expertise and insight to analyse this data and make it comparable.
>
> From this, we create a range of products and services – ratings, software solutions, consultancy services, data services, and publications and events – to deliver this information in a meaningful way.
>
> Our intelligence facilitates better financial decisions and greater effectiveness in the creation, management and distribution of financial products.
In fact, many of RAC’s direct competitors share the same Defaqto rating.
RAC’s customers don’t agree with Defaqto:
In competitive verticals, brands need to stand out from their competitors, and having the highest customer approval rating is a strong trust signal. The brand with the best rating is likely to attract more customers, but it can be frustrating when your genuine star rating is outdone by someone who isn’t being fully truthful.
Google’s algorithm is constantly improving to ensure searchers get the best results. While the examples above all technically comply with Google’s rich snippet guidelines, they nonetheless mislead consumers about real customer satisfaction.
Much like the first round of review spam, there may come a time when some of the examples above catch Google’s eye.
To me, the real issue goes beyond Google potentially penalising a site; it’s what local governments have to say about the matter that could be the real threat to businesses.
The US and UK have been cracking down on astroturfing – the practice of generating fake reviews online – serving hefty fines to organisations caught doing it. While none of the examples above involve outright fake reviews, they still mislead potential customers with inflated star ratings.