Google are finally turning off the supply of keyword referral data on natural search.

 

[Chart: the percentage of 'not provided' search traffic climbing towards 100%. Source: http://www.notprovidedcount.com/]

 

Weakened Business Case

If you have worked in a large organisation, you will know that proving your case is critical to winning more budget. With 100% not provided, making that case is going to be harder.

Looking at this from a business perspective, it's relatively serious. You can still attribute a conversion to organic search, but you can't definitively say which keyword drove it. That makes arguing for budget harder, because you can't say to a HiPPO (Highest Paid Person's Opinion), 'SEO/this keyword 100% got you that sale'.

Typically we are seeing a shift towards SEO being assessed on rankings. Ironically, Google have been getting far tougher on scraping rankings, so even that is a challenge if you're doing 'do it yourself' rank checking at any scale.

Background

Fortunately, there are ways around the problem of solid attribution. No matter how you look at it, there will always be a few pieces of the puzzle missing. So let's go into how to set up a framework that can handle not provided.

In the past we had three main data points:

  • Rankings: rank checking and Google Webmaster Tools (WMT)

  • Traffic to pages: the specific searches landing on a given page, which let you track which keywords converted, so you could decide which words to prioritise

  • Estimated search volume / click-through rate per position


[Diagram: the three data points as the sides of a triangle]

 

But now things have changed. We're missing one side of the triangle. We're left with rankings, estimated traffic volumes, and analytics data showing 'organic traffic' by page.

 

[Diagram: the same triangle with the keyword-referral side missing]

 

Logically, you are now dependent on rank tracking. There are two sources:

  • Google Webmaster Tools, which is moderately comprehensive
  • Your own rank checking (more detail later)

 

Webmaster Tools

Webmaster Tools will give you keywords, pages and click-through rates, so there is enough information to more or less get by on external data. But for internal attribution, i.e. your own on-site analytics, you will not be able to 100% attribute a conversion to a keyword.

 

[Screenshot: Webmaster Tools search queries report]

With Google, you are dependent on what they give you, which is fine when you are working at a low level, but if you are working at any real scale you need your own rank checking.

Google have an API which allows you to mass export WMT data into CSV etc. – more here on using PHP to extract the data.
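If you want to work with that exported data without writing PHP, here is a minimal Python sketch that rolls an export up into per-keyword, per-page totals. The column names ('query', 'page', 'impressions', 'clicks') are assumptions about the export format, not a documented schema, so adjust them to whatever your CSV actually contains:

```python
import csv
from collections import defaultdict

def aggregate_wmt_export(path):
    """Roll up clicks and impressions per (query, page) from an exported CSV.
    Column names are assumptions; adjust them to match your actual export."""
    totals = defaultdict(lambda: {"impressions": 0, "clicks": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["query"], row["page"])
            totals[key]["impressions"] += int(row["impressions"])
            totals[key]["clicks"] += int(row["clicks"])
    return totals

for (query, page), t in aggregate_wmt_export("wmt_export.csv").items():
    ctr = t["clicks"] / t["impressions"] if t["impressions"] else 0.0
    print(f"{query} -> {page}: {t['impressions']} impressions, CTR {ctr:.1%}")
```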

And whilst there are arguments that rank checking is dead because rankings move about so much, the fact is that you don't really know which keyword has won you traffic. So even though rankings are a weak metric, in this situation they are your only metric to show what success looks like.

So for the purposes of this post, let's assume there is no such thing as Webmaster Tools data.

How to Think of Pages

The first thing is to think of pages as containers. Each container has a content theme, and by having many containers linked together, you have structure and separation.

 

[Diagram: pages as linked containers, each with its own content theme]
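If it helps to make the container idea concrete, here is a tiny Python sketch of that mental model; the URLs, themes and keywords are purely illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Page:
    """A page as a container: one content theme, the keywords clustering
    around it, and links to related containers for structure."""
    url: str
    theme: str
    keywords: list = field(default_factory=list)
    linked_pages: list = field(default_factory=list)

# Purely illustrative example of two linked containers.
hub = Page("/atlanta-hotels/", "atlanta hotels",
           keywords=["atlanta hotels", "hotels in atlanta"])
deals = Page("/atlanta-hotels/deals/", "atlanta hotel deals",
             keywords=["cheap atlanta hotels"], linked_pages=[hub])
```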

Search Impression Data

Because you're planning ahead, you will need to know which keywords to target and what sort of search impression volumes you will get when ranking for a phrase. With WMT data you have an idea of impressions per phrase, but not for phrases you don't already rank for. So you are reliant on Google AdWords keyword data.

In the past you could get search volumes by broad and phrase match. No more. And you used to be able to separate desktop and mobile search traffic. Not any more.

Google say:

In general, you’ll notice that the average search volume data is higher in Keyword Planner as compared to the exact match search volume data you got with Keyword Tool. That’s because we’ll show you the average number of searches for a keyword idea on all devices (desktop and laptop computers, tablets, and mobile phones). With Keyword Tool, we showed you average search volume for desktop and laptop computers by default.

There is an excellent Moz post that goes into all of this in some detail. 

Did I already say Google were evil!?

Nevertheless, if you assume all keyword search volume data is out, but all equally out, then you have some sort of relative benchmark to work from.
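One practical way to use an 'all equally out' data set is to normalise it: express every keyword's volume relative to the biggest one and rank on the ratios rather than the absolute numbers. A quick sketch with made-up volumes:

```python
# Hypothetical Keyword Planner volumes. The absolute numbers may be inflated,
# but if they are all inflated roughly equally, the ratios still rank keywords.
volumes = {"atlanta hotels": 40500,
           "cheap atlanta hotels": 5400,
           "atlanta hotel deals": 1900}

top = max(volumes.values())
for kw, v in sorted(volumes.items(), key=lambda kv: -kv[1]):
    print(f"{kw}: {v / top:.2f}")  # relative benchmark, 1.00 = biggest keyword
```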

Moving on.

The Formula for Getting Potential Search Volumes for a Keyword

It is:

ESTIMATED SEARCH VOLUME × RANKING POSITION CLICK-THROUGH RATE

Click-through rate per position is an accounting question. You can assume one search results page and the click-through rate for those results. Optify have taken this approach.

 

[Chart: Optify organic CTR by ranking position]

 

Source: http://www.optify.net/wp-content/uploads/2011/04/Changing-Face-oof-SERPS-Organic-CTR.pdf
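Put as a rough Python sketch, with an illustrative CTR table (placeholder figures shaped like the Optify curve, not their exact numbers) and a made-up search volume:

```python
# Illustrative per-position CTRs for the "one impression, one click" model.
# Placeholder figures shaped like the Optify curve, NOT their exact data.
CTR_BY_POSITION = {1: 0.36, 2: 0.13, 3: 0.09, 4: 0.08, 5: 0.06,
                   6: 0.04, 7: 0.04, 8: 0.03, 9: 0.03, 10: 0.02}

def estimated_monthly_traffic(search_volume, position):
    """Estimated search volume multiplied by the CTR for the ranking position."""
    return search_volume * CTR_BY_POSITION.get(position, 0.0)

# e.g. a keyword with ~5,000 estimated monthly searches ranking at position 3
print(estimated_monthly_traffic(5000, 3))  # 450.0
```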

There are two ways of looking at CTR. Either you count one impression and one click, or you assume the idea of a session: you do a search, click on a page, click back, click on another page, click back again, and so on.

Roman Bębenista wrote a really good post on Moz.com about this, where he took 14,507 search queries from Webmaster Tools and interrogated them to come up with this idea of a session-based CTR. As you can see, the CTRs are far more evenly distributed than in the 'one page, one click' model. If you look at brand CTR, it's 306%! This totally goes against the usual idea of brand getting 80% of CTR.

This also suggests that when Google say you have, say, 1,000 impressions for a keyword, those are not unique search queries, but lots of repeat impressions of the same query.

Honestly, I don't know the answer to this puzzle. When you reconcile the two ways of accounting for CTR (80% CTR on the impression model against 45% CTR on the session model), does this suggest that about 50% of impressions for a brand search are repeats of the same query? I don't know the answer.
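What the puzzle comes down to is that the two models divide the same clicks by different denominators: impressions in one case, search sessions in the other, and one session can generate several impressions and several clicks through the click-back-click pattern described above. A sketch with made-up numbers shows how a session CTR can sail past 100%:

```python
# Made-up numbers to show how the two CTR definitions relate.
sessions = 1000              # distinct "search and click around" sessions
impressions_per_session = 3  # each click-back-and-return reloads the results page
clicks = 2400                # users can click more than one result per session

impressions = sessions * impressions_per_session

print(f"Impression CTR: {clicks / impressions:.0%}")  # 80%
print(f"Session CTR:    {clicks / sessions:.0%}")     # 240%, over 100% is possible
```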

Nevertheless, you will have to have a click-through rate per ranking position, and the Optify CTRs using the impression model will have to do.

Session-based CTRs per position by type of query:

 

[Tables: session-based CTR per position, by query type]

Source: http://moz.com/ugc/click-through-rates-in-google-serps-for-different-types-of-queries

We now have relatively shaky search volume data and CTR per position, but hey, we are beggars and we take what we're given.

Gather Your Keywords


A good way to work out which phrases are most useful (apart from the AdWords keyword tool) is to use tools like Semrush or Searchmetrics. They give you the main keywords a site ranks for and the pages that rank for those keywords. Whilst I love Searchmetrics, Semrush data is also good and far, far cheaper, and their API is easy to use and powerful.

Going back to the idea of a page as a container for words, Semrush helps you build up a picture of how keywords cluster around pages. From that you can get an idea of how to organise pages around tight content themes so they naturally centre on certain phrases.
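As a sketch of that clustering step, assuming a Semrush-style CSV export with keyword, URL and position columns (the exact column names here are assumptions, so adjust to your export):

```python
import csv
from collections import defaultdict

def keywords_by_page(path):
    """Group ranking keywords under the page (container) that ranks for them.
    The 'Keyword', 'Url' and 'Position' column names are assumptions."""
    pages = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pages[row["Url"]].append((row["Keyword"], int(row["Position"])))
    return pages

for url, kws in keywords_by_page("semrush_export.csv").items():
    print(url)
    for kw, pos in sorted(kws, key=lambda kp: kp[1]):
        print(f"  #{pos}  {kw}")
```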

Here you can do comprehensive content ‘theme’ analysis on as many winning competitors as you like and create benchmarks for your own project.

The main idea in a world of not provided is accountability, so you need to know that those pages rank for tight groupings of phrases.

So it's really important to track lots of keywords so you can get a clear picture of which keywords rank for which pages. If you believe in keyword tracking, then we have entered the age of bulk rank checking.

Organising Your Campaign

I am a massive fan of Linkdex. I love what they are doing and I love the way they are handling 100% not provided.

In their tool you can organise pages around keywords. Much like in WMT, you can see which keywords you are tracking and which pages they rank for. The screengrab shows how it's done.

[Screenshot: Linkdex keywords mapped to pages]


 

 

You can then tag up phrases around themes you are interested in; in this case you might tag up phrases that relate to a geographical area like Atlanta. Now you know how all similar content is ranking for a page or group of pages.

[Screenshot: Linkdex phrases tagged by theme]

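You can reproduce the tagging idea outside the tool, too. A minimal sketch, assuming simple substring matching against theme terms you define yourself:

```python
# Hypothetical theme definitions: tag -> trigger terms you choose yourself.
THEMES = {
    "atlanta": ["atlanta"],
    "deals": ["cheap", "deal", "discount"],
}

def tag_phrase(phrase):
    """Return every theme tag whose trigger terms appear in the phrase."""
    p = phrase.lower()
    return [tag for tag, terms in THEMES.items()
            if any(term in p for term in terms)]

print(tag_phrase("cheap atlanta hotels"))  # ['atlanta', 'deals']
```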

 

 

And you can look at a single page and see the tag groups for that page. If you have lots of different themes ranking on one page, then maybe you should reorganise your pages and make them more 'theme dense'.

[Screenshot: Linkdex tag groups for a single page]


 

You can also look at individual keywords and see what pages they rank for.

[Screenshot: Linkdex pages ranking for an individual keyword]


 

 

And by doing lots of rank checking, i.e. daily or at least weekly, you can get a good idea of ranking stability over time.

[Screenshot: Linkdex ranking positions tracked over time]

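Once you have daily positions stored per keyword, stability is just the spread of those positions over time. A minimal sketch using made-up rank-check data:

```python
from statistics import mean, pstdev

# Hypothetical daily positions from your own rank checks, most recent week.
daily_positions = {
    "atlanta hotels": [3, 3, 4, 3, 5, 4, 3],
    "cheap atlanta hotels": [8, 12, 7, 15, 9, 11, 6],
}

for kw, positions in daily_positions.items():
    print(f"{kw}: mean position {mean(positions):.1f}, "
          f"volatility (std dev) {pstdev(positions):.1f}")
```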

 

Side note on rank checking: we have done a lot of research into rank checking. If you want rank checking only, I like AWR Cloud, and I've also been looking at getstat.com. If you are interested in a package, then searchmetrics.com is good, but for value for money I like analyticsSEO.com, who incidentally are working on massive-scale, accurate rank-checking technology at a reasonable price, so watch them if you want huge-scale rank checking. And of course there is linkdex.com, who have really solid rank checking and allowances of around 5,000 keywords per account.

And when you have lots of other data points around a page, it's far easier to come up with a clear performance assessment of your page.

[Screenshot: multiple data points around a single page]


 

All of this means you can start to work in a structured way, mapping out pages, keywords, themes and even geographies.

Looking at the Distribution of Not Provided

Econsultancy did a useful post on how to work out the distribution of not provided per page in Google Analytics. This is important because, when you triangulate it with your other ranking data, you can get a good idea of how much traffic is actually coming in from your ranking keywords per page.

[Screenshot: 'not provided' traffic split by landing page in Google Analytics]

Source: http://econsultancy.com/uk/blog/8342-how-to-steal-some-not-provided-data-back-from-google
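Once you know how many not provided visits land on a page, one hedged way to do that triangulation is to apportion those visits across the keywords the page ranks for, weighted by each keyword's estimated potential traffic (search volume × CTR for its ranking position). A sketch with made-up inputs:

```python
# Made-up inputs: a page received 900 "not provided" organic visits and ranks
# for three keywords, with estimated potential traffic per keyword
# (search volume x CTR for its ranking position).
not_provided_visits = 900
potential = {"atlanta hotels": 450.0,
             "cheap atlanta hotels": 160.0,
             "atlanta hotel deals": 40.0}

total = sum(potential.values())
for kw, p in potential.items():
    share = p / total
    print(f"{kw}: ~{not_provided_visits * share:.0f} visits ({share:.0%})")
```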

Conclusion 

If you follow this sort of methodology, you will have a framework which allows you to plan out and organise your money pages so they work properly for you. In effect it's like putting a jigsaw in place and being left with one piece missing. That's OK, because you have a strong data set surrounding the missing piece.

Of course, you will have trouble identifying long-tail phrases, but something is better than nothing.

If you look at the attribution scale, we have shifted from 'very high' pre not provided, to 'reasonable', and now to 'OK' with a lot of extra work. Thanks for the extra overhead, Google 🙁

[Image: the attribution scale, sliding from 'very high' to 'OK']

I leave you with one final thought on not provided…

[Image: a final thought on not provided]

Nick Garner


Nick is founder of 90 Digital. Previously he was head of search at Unibet and, prior to that, search manager at Betfair.
