Jan 4, 2016

Using Data to Prioritize SEO Activities

You can never predict what someone might type into Google to find you! Instead of selecting an arbitrary number of primary keywords, a modern SEO will factor in an unlimited number of search terms and prioritize them based on opportunity. The last remaining place with keyword data is Google’s Search Console, which allows you to export search terms, impressions, CTR, and average position for your pages.

[Figure: CTR data for dejanseo.com.au]

In its raw form this data is not that useful, but it can be processed in such a way that it offers truly actionable insights.
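The first step is getting the export into a workable shape. Here’s a minimal sketch in Python using pandas; the filename and column names are assumptions, so adjust them to match your own export:

```python
import pandas as pd

# Load a Search Console query export. The filename and column names
# ("Query", "Clicks", "Impressions", "CTR", "Position") are assumed;
# adjust them to match your actual export.
df = pd.read_csv("queries.csv")

# Normalise CTR from a "12.3%" string into a 0.123 float so it can be averaged.
df["CTR"] = df["CTR"].astype(str).str.rstrip("%").astype(float) / 100

print(df[["Query", "Clicks", "Impressions", "CTR", "Position"]].head())
```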

Prioritizing Work

One of the most challenging things about being an SEO is that you’re expected to deliver results, which are often equated with ranking increases for a number of search terms. This is tough because an SEO doesn’t control the search results – a search engine does. But we can do one thing: structure our campaigns in such a way that they focus on realistic goals.

Working With Data

If we take the Search Console data export and filter queries to show only average position 1 (select the range from 1 to 1.4), we start to notice a common CTR range for position one. This could be anything from 50% to 90%.

What we want to do is get the average CTR for each position range on our site. Knowing those values would allow us to model the data and create scenarios.
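As a rough sketch, bucketing queries by rounded average position is one simple way to get those values from the export loaded earlier:

```python
# Bucket each query by its rounded average position, then take the mean CTR
# per bucket; this gives the site-specific CTR curve used for modelling.
df["PositionBucket"] = df["Position"].round().astype(int)
avg_ctr = df.groupby("PositionBucket")["CTR"].mean()
print(avg_ctr.head(10))
```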

For example:

  • What would happen if all my keywords rise by one position in Google?
  • What would happen if keyword XYZ moved to position one in Google?
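The first scenario could be modelled roughly like this, using the per-position averages computed above. Note that holding impressions constant is a simplification, since impression counts also shift with rank:

```python
# Scenario: every query improves by one position. Expected clicks are
# impressions multiplied by the site's average CTR at the new position.
df["NewBucket"] = (df["PositionBucket"] - 1).clip(lower=1)
df["ProjectedClicks"] = df["Impressions"] * df["NewBucket"].map(avg_ctr)
uplift = df["ProjectedClicks"].sum() - df["Clicks"].sum()
print(f"Projected extra clicks if everything moves up one spot: {uplift:.0f}")
```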

This type of modeling is useful during campaign design, particularly while prioritizing work and deciding what to focus on first!

Remove The Brand

When determining the average CTR for each position on your site, filter out branded queries. They often have significantly higher values than other types of queries.
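A simple substring filter is usually enough here; the brand terms below are placeholders for your own brand variants:

```python
# Drop branded queries before computing position averages. The terms in
# brand_terms are placeholders; substitute your own brand variants.
brand_terms = ["dejan", "dejanseo"]
pattern = "|".join(brand_terms)
df = df[~df["Query"].str.contains(pattern, case=False, na=False)]
```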

Remove Low Click Instances

In order to improve the accuracy of your predictions, it’s also wise to remove any low-click instances (e.g. 1 click, 100% CTR), as they’re not statistically valid and can distort otherwise meaningful CTR values as much as the inclusion of branded terms would.
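In code this can be a couple of simple thresholds; the cut-off values below are arbitrary examples, not recommendations:

```python
# Drop statistically weak rows so single-click flukes (1 click, 100% CTR)
# don't distort the averages. Thresholds are arbitrary examples.
MIN_CLICKS = 5
MIN_IMPRESSIONS = 50
df = df[(df["Clicks"] >= MIN_CLICKS) & (df["Impressions"] >= MIN_IMPRESSIONS)]

# Recompute the per-position averages now that the data is cleaned.
avg_ctr = df.groupby("PositionBucket")["CTR"].mean()
```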

High Gravity SERP Features

One other factor that tends to bias user choice in search results is the presence of “high gravity” SERP features such as star ratings, answer boxes, knowledge graph panels, images, videos and local results. Plain results will see significantly lower CTRs when placed near visually more interesting competing results. At other times, low CTR could be due to another plain snippet, but one that belongs to a bigger, better-known brand. In situations like that there isn’t much that can be done, but it’s still a good thing to know, as you’d be less inclined to attack a high-difficulty term and waste precious time and money on content and off-site tactics which aren’t likely to bring results.

CTR Based Snippet Optimization Opportunities

Comparing phrase-specific CTR values with the site-average CTR values can provide valuable insights into snippet optimization opportunities. For example, if a keyword’s CTR underperforms at a certain ranking position (e.g. 10% at position 1 against an expected average of 30%), this could present an opportunity to implement schema and trigger rich snippet behavior, or simply to improve the title and meta description in order to increase clicks for that page. Similarly, you could discover one of your results that outperforms the rest and learn what it is about that snippet that attracted so many more clicks than expected.
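As a sketch, underperformers can be surfaced by comparing each query’s CTR with the site average for its position bucket; the 0.5 ratio cut-off below is an arbitrary example:

```python
# Flag queries whose CTR falls well below the site average for their
# position bucket; these are candidates for snippet or markup improvements.
df["ExpectedCTR"] = df["PositionBucket"].map(avg_ctr)
df["CTRRatio"] = df["CTR"] / df["ExpectedCTR"]
underperformers = df[df["CTRRatio"] < 0.5].sort_values("Impressions", ascending=False)
print(underperformers[["Query", "Position", "CTR", "ExpectedCTR"]].head())
```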

Calculating Keyword Potential

Knowing your CTRs means you can predict traffic at each position of Google’s results, not by some vague industry standard, but with figures specific to your own domain.

Working out the exact number of clicks that a certain phrase would attract at a certain position is not as simple as applying the average CTR to various positions. The deviation from the norm mentioned earlier must be considered to avoid overly optimistic or pessimistic predictions.

For example, if a keyword at position #10 gets only 1 click instead of the expected 3 clicks, then our prediction for the top spot can’t be 33 clicks anymore.

Scaling the expectation down to ⅓ of its original value would be a sensible thing to do in such a scenario. Likewise, if we have a page whose CTR outperforms the expected value, then we can apply a more optimistic calculation to it.
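Here is the arithmetic of that example as a small sketch; the key idea is scaling the naive forecast by the observed ratio:

```python
# Worked example from the text: a keyword at position 10 gets 1 click where
# the site average predicts 3, so the naive position-1 forecast of 33 clicks
# is scaled by the observed ratio (1/3) rather than taken at face value.
expected_at_10 = 3
actual_at_10 = 1
naive_forecast_at_1 = 33

ratio = actual_at_10 / expected_at_10
adjusted_forecast = naive_forecast_at_1 * ratio
print(f"Adjusted position-1 forecast: {adjusted_forecast:.0f} clicks")  # ~11
```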

This whole exercise is, of course, only meaningful when performed on a large data set, and it can save many hours of work and decision making. Now let’s talk about money.

Financial Impact

Knowing how many extra clicks each keyword would bring is great, but in most cases management only cares about the bottom line. So the next logical step is to work out the traffic increase and apply your conversion rate and goal value to it.
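As a sketch, the calculation is a couple of multiplications. The conversion rate and goal value below are assumptions chosen to reproduce the figures quoted next:

```python
# Translate a click uplift into revenue. The 5% conversion rate and $150
# goal value are assumed figures, not benchmarks.
extra_clicks = 200
conversion_rate = 0.05
goal_value = 150.0

conversions = extra_clicks * conversion_rate   # 10 extra conversions
extra_revenue = conversions * goal_value       # $1500 per month
print(f"{conversions:.0f} extra conversions -> ${extra_revenue:,.0f} in monthly revenue")
```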

“Ten extra conversions will bring $1500 in additional monthly revenue” sounds much better than “we’ll get 200 extra clicks next month”!

Competition

Using the above-mentioned method you may produce a list of the most lucrative search terms, and the impulsive thing to do would be to attack them top-down in order of priority: implementing keywords in page titles, content areas and internal links, and earning links for that content. This, however, could be a complete waste of time unless you consider term difficulty first. If you’re the #2 result for “online books” and #1 is Amazon, can you really beat them to position #1 to earn that extra revenue? Probably not.

I use Majestic’s API to draw FlowMetrics into my spreadsheet in order to understand the challenge in addition to the opportunity. FlowMetrics are basically a form of PageRank: a metric which reflects inbound citations/links and serves as a measure of a page’s strength and authority (let’s say it ranges from 0 to 100). If you’re a 30 and your competitor is a 40, then there may be a chance there. These are rough calculated values, after all. But if you’re a 20 and the competition above you is at, say, 80, then maybe you should pick some other target!

A good practice is to balance the click potential or revenue against difficulty to produce a single score (call it phrase potential or something like that), sort your spreadsheet by the highest value, and prioritize your content and technical optimization work based on that.
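One possible way to express such a score, as a sketch only (the formula below is illustrative, not a prescribed method), assuming projected revenue per phrase and FlowMetrics-style authority values in the 0 to 100 range:

```python
# A hypothetical "phrase potential" score: discount projected revenue by the
# authority gap between you and the competitor ranked above you.
def phrase_potential(projected_revenue: float, own_flow: float, competitor_flow: float) -> float:
    gap = competitor_flow - own_flow
    difficulty = min(max(gap / 100.0, 0.0), 1.0)  # clamp to the 0..1 range
    return projected_revenue * (1.0 - difficulty)

print(phrase_potential(1500, 30, 40))  # small gap: most of the value survives
print(phrase_potential(1500, 20, 80))  # huge gap: barely worth pursuing
```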

Phrases to Pages

The next step is critical when it comes to prioritization. Map your keywords to their corresponding pages on your site, and add the potential of each keyword that leads to a given URL to that URL’s page potential score.
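Assuming each query row has already been mapped to a URL and scored, the roll-up is a simple group-and-sum; both column names below are assumptions:

```python
# Roll keyword-level potential up to the page level. The "LandingPage" and
# "PhrasePotential" columns are assumed to exist from the previous steps.
page_scores = (
    df.groupby("LandingPage")["PhrasePotential"]
      .sum()
      .sort_values(ascending=False)
)
print(page_scores.head(10))
```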

Of course, data can only take you so far. Once you produce your high-priority list, manually check how those pages appear for each significant keyword and ensure that there aren’t any surprises or situations that weren’t predicted in your data model.

Process Automation

All this can be done in Excel, but there is also this free little tool that automates much of this process while allowing the injection of non-ranking keywords from Google’s Keyword Planner into your reports.

Final Thoughts

It’s not good enough to be “good enough” anymore. In order to earn its links and rank well, a page must be truly outstanding in every way – content, form and function included. Creating this rich, high quality experience takes time and most certainly a lot of money.

The above framework will be your final step in prioritizing your SEO work and will allow you to focus on pages that truly matter.

About Dan Petrovic

Dan Petrovic, the managing director of DEJAN, is Australia’s best-known name in the field of search engine optimization. Dan is a web author, innovator and a highly regarded search industry event speaker. In addition to industry leadership, Dan also maintains an active academic life as an adjunct lecturer and the chairman of the Industry Advisory Board for the School of Marketing at Griffith University. Connect with Dan on Twitter, Google+ or LinkedIn.
