US-based Aaron Wall is quite probably my favourite SEO blogger, and has been since long before I started blogging myself (back in 2007).
Back in October 2009 I pinged Aaron and asked if he would take me through the basics of keyword research for those visitors to my blog who are just starting out, and help me create a quick beginner’s guide to keyword research, as it’s one area I’ve not really touched on in my SEO blog.
I’ve been tidying up my blog post-Panda in 2015, but I kept Aaron’s advice as it is an excellent start for beginners and still as relevant today as it was then.
The first step in any SEO campaign is to do some keyword research….
Keyword Research: Definition
In the offline world companies spend millions of dollars doing market research to try to understand market demand and market opportunities.
Well, with search, consumers are telling marketers exactly what they are looking for – through the use of keywords.
And there are tons of free and paid SEO tools out there to help you understand your market. In addition to those types of tools, lots of other data points can be included as part of your strategy, including:
- data from your web analytics tools (if you already have a website)
- running test AdWords campaigns (to buy data and exposure directly from Google…and they have a broad match option where if you bid on auto insurance it will match queries like cheap detroit auto insurance)
- competitive research tools (to get a basic idea of what is working for the established competition)
- customer interactions and feedback
- mining and monitoring forums and question and answer type websites (to find common issues and areas of opportunity)
Just how important is it in the mix for a successful SEO campaign?
Keywords are huge.
Without doing keyword research most projects don’t create much traction (unless they happen to be something surprisingly original and/or viral and/or of amazing value).
If you are one of those remarkable businesses that to some degree creates a new category (say an iPhone) then SEO is not critical to success.
But those types of businesses are rare.
The truth is most businesses are pretty much average, or a little bit away from it, with a couple unique specialties and/or marketing hooks.
SEO helps you discover and cater to existing market demand and helps you attach your business to growing trends through linguistics.
You can think of SEO implications for everything from what you name your business, which domain names you buy, how you name your content, which page titles you use, and the on page variation you work into page content.
Keyword research is not a destination, but an iterative process.
For large authority sites that are well trusted you do not have to be perfect to compete and build a business, but if your site is thin or new in a saturated field then keyword research is absolutely crucial.
And even if your website is well trusted then using effective keyword strategy helps create what essentially amounts to free profits and expanded business margins because the cost of additional relevant search exposure is cheap, but the returns can be great because the traffic is so targeted.
And since a number one ranking in Google gets many multiples of traffic that a #3 or #4 ranking would get, the additional returns of improving rankings typically far exceed the cost of doing so (at least for now, but as more people figure this out the margins will shrink to more normal levels.)
Where do you start?
What kind of ambitions do you have? Are they matched by an equivalent budget?
How can you differentiate yourself from competing businesses?
Are there any other assets (market data, domain names, business contacts, etc.) you can leverage to help build your new project?
Does it make sense to start out overtly commercial, or is there an informational approach that can help you gain traction quicker?
I recently saw a new credit card site launching off of the idea of an industry hate site “credit cards will ruin your life”.
After they build link equity they can add the same stuff that all the other thin affiliate websites have, but remain different AND well linked to.
Once you get the basic business and marketing strategy down then you can start to feel out the market for ideas as to how broad or narrow to make your website and start mapping out some of your keyword strategy against URLs.
And if you are uncertain about an idea I am a big fan of launching a blog and participating in the market and seeing what you can do to find holes in the market, build social capital, build links, and build an audience – in short, build leverage…once you have link equity you can spend it any way you like (well almost).
And (especially if you are in a small market with limited search data) before you spend lots of money on building your full site and link building it makes sense to run a test campaign on AdWords and build from that data.
Doing so can save you a lot of money in the long run, and that is one of the reasons my wife was so eager to start a blog about PPC.
Her first few campaigns really informed the SEO strategy and she fell in love with the near instantaneous feedback that AdWords offers.
What does keyword research involve?
Keyword research can be done in many different stages of the SEO process – everything from domain name selection, to planning out an initial site map, to working it into the day to day content creation process for editorial staff of periodical content producers.
And you can use your conversion, ranking, and traffic data to help you discover
- new topics to write about
- ways to better optimize your existing site and strategy
- anchor text to target when link building
Google has the best overall keyword data because of its huge search market share. But Google filters that data selectively (and somewhat randomly), which makes it important to use other keyword research tools to help fill in the gaps.
In addition, many search engines recommend search queries to searchers via search suggest options that drop down from the search box.
Such search suggest tools are typically driven by search query volume with popular keyword variations rising to the top.
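Those search suggest drop-downs can be mined programmatically. As a rough sketch: Google exposes an unofficial suggest endpoint (an assumption worth flagging – it is undocumented, can change at any time, and may be rate limited), and its response is a small JSON array you can parse for keyword variations.

```python
import json
import urllib.parse

# Unofficial Google Suggest endpoint - undocumented, so treat this URL
# and its response format as an assumption that may break without notice.
SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q="

def build_suggest_url(seed):
    """Build the request URL for a seed keyword."""
    return SUGGEST_URL + urllib.parse.quote(seed)

def parse_suggest_response(raw_json):
    """The endpoint returns JSON shaped like ["seed", ["suggestion 1", ...]];
    pull out just the list of suggested queries."""
    payload = json.loads(raw_json)
    return payload[1]

# To fetch live suggestions you would do something like:
#   import urllib.request
#   raw = urllib.request.urlopen(build_suggest_url("keyword research")).read()
#   print(parse_suggest_response(raw))
```

Because popular variations rise to the top of the response, even a handful of seed terms fed through this can surface the phrasing real searchers use.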
How much keyword research do you do on a project?
It depends on how serious a commitment the project is. Sometimes we put up test sites that start off far from perfect and then iteratively improve and reinvest in the ones that perform the best.
If we know a site is going to be core to our overall strategy I wouldn’t be against using 10 different keyword tools to create a monster list of terms, and then run that list through the Google AdWords API to get a bit more data about each. (EDIT – Shaun – That’s how I do it, too.)
On one site I know we ended up making a list of 100,000+ keywords, sorted by value, then started checking off the relevant keywords.
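The "monster list" step Aaron describes – merge output from many tools, de-duplicate it, then sort by value – can be sketched in a few lines. The volume and CPC figures below are purely illustrative stand-ins, not real AdWords API data, and the value formula (searches × CPC) is one simple assumption about what "sorted by value" means.

```python
def merge_keyword_lists(*lists):
    """Combine keyword lists from different tools, normalising
    case and whitespace, and de-duplicating while preserving order."""
    seen = set()
    merged = []
    for kw_list in lists:
        for kw in kw_list:
            norm = " ".join(kw.lower().split())
            if norm not in seen:
                seen.add(norm)
                merged.append(norm)
    return merged

def sort_by_value(keywords, volume, cpc):
    """Rank keywords by estimated value = search volume * CPC."""
    return sorted(
        keywords,
        key=lambda kw: volume.get(kw, 0) * cpc.get(kw, 0.0),
        reverse=True,
    )

# Illustrative exports from two hypothetical keyword tools.
tool_a = ["Credit Cards", "best credit cards"]
tool_b = ["credit cards", "credit card comparison"]
master = merge_keyword_lists(tool_a, tool_b)

# Made-up volume and cost-per-click estimates for the sketch.
volume = {"credit cards": 90000, "best credit cards": 12000, "credit card comparison": 4000}
cpc = {"credit cards": 5.0, "best credit cards": 7.5, "credit card comparison": 6.0}
ranked = sort_by_value(master, volume, cpc)
```

From there, "checking off the relevant keywords" is manual judgement – the code only gets the 100,000-term list into a priority order worth reading top-down.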
Do the keywords change as the project progresses?
Yes and no.
Meaning as your site gains more link authority you will be able to rank better for broader keywords that you might not have ranked for right off the start.
BUT that does not mean that you should have avoided targeting those keywords off the start.
Instead I look for ways to target easier keywords that also help me target harder higher traffic keywords. For example, if I aim to rank a page for “best credit cards” then that page should also be able to rank well (eventually) for broader keywords like “credit cards.”
You can think of your keyword traffic profile as a bit of a curve (by how competitive the query is and the number of keywords in the search query).
This type of traffic distribution curve starts off for new sites far to the right (low competition keywords with few matches in the search index that are thus easy to rank for based on on-page optimization, often with many words in each search query) and then as a site builds link authority and domain authority that curve moves left, because you are able to compete for some of the more competitive keywords … which often have their results determined more based on links-based metrics.
Can you give me an example how you would keyword research a specific niche – the steps you’d normally take?
Everything is custom, based on whether a site exists or is brand new.
And if it exists, how much authority does the site have? How much revenue does it have?
There is not really a set normal.
Sometimes while doing keyword research I come to the conclusion that the costs of ranking are beyond the potential risk-adjusted returns.
Other times there is a hole in the market that is somewhat small or large. Depending on assets and resources (and how the project compares to our other sites) we might have vastly different approaches.
How would you deploy your keyword research in 3 areas – on page, on site, and in links
As far as on page optimization goes, in the past it was all about repetition. That changed around the end of 2003, with the Google Florida update.
Now it is more about making sure the page has the keyword in the title and maybe sprinkled a bit about the page content, but also that there is adequate usage of keyword modifiers, variation in word order, and variation in plurality. Rather than worrying about the perfect keyword density try to write a fairly natural sounding page (as though you knew nothing about SEO), and then maybe go back to some keyword research tools and look at some competing pages for keyword modifiers and alternate word forms that you can try to naturally work into the copy of the page and headings on the page.
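Checking a page for those modifiers, word-order variations, and plurality variations is easy to automate. A minimal sketch (the naive pluralisation – just toggling a trailing "s" – is an assumption for illustration; real morphology is messier):

```python
from itertools import permutations

def phrase_variations(phrase):
    """Generate simple word-order and plurality variations of a phrase."""
    words = phrase.lower().split()
    variants = set()
    # every word ordering of the phrase
    for order in permutations(words):
        variants.add(" ".join(order))
    # crude singular/plural toggle on the last word
    last = words[-1]
    toggled = last[:-1] if last.endswith("s") else last + "s"
    variants.add(" ".join(words[:-1] + [toggled]))
    return variants

def missing_from_copy(phrase, page_text):
    """Which variations never appear in the page copy?"""
    text = page_text.lower()
    return {v for v in phrase_variations(phrase) if v not in text}
```

Anything the second function returns is a candidate to work naturally into the copy or headings – not to stuff in mechanically.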
As far as links go, it is smart to use some variation in those as well, though the exact amount needed depends in part on site authority (large authoritative sites can generally be far more aggressive than smaller, less trusted websites can).
The risk of being too aggressive is that you can get your page filtered out (if, say, you have thousands of links and they all use the exact same link anchor text). (see Google Penguin Update)
There is not a yes/no exact science that says do xyz across the board, but generally if you want to improve the ranking of a specific page then pointing targeted link anchor text at that page is generally one of the best ways to do so. (EDIT – Shaun – More risky in 2015 because of ‘unnatural link‘ penalties).
But there is also a rising tides lift all boats effect to where if you get lots of links into any section of your website that will also help other parts of your site rank better – so sometimes it makes sense to create content around linking opportunities rather than just trying to build links into an unremarkable commercial web page.
Is there anything people should avoid when compiling keyword research?
I already sorta brushed off keyword density.
In addition, many people worry about KEI or other such measures of competition, but as stated above, even if a keyword is competitive it is often still worth creating a page about it which happens to target it AND keywords that contain it + other modifiers (i.e. best credit cards for credit cards).
Don’t look for any keyword tool to be perfect or exact. Be willing to accept rough ranges and relative volumes rather than expecting anything to be exactly perfect.
A huge portion of search queries (over 25%) are quite rare and have few searches.
Many such words do not appear on public keyword tools (in part due to limited sampling size for 3rd party tools and in part because search engines want advertisers to get into bidding wars on the top keywords rather than buying cheap clicks that nobody else is bidding on).
So there you have it. Keyword research for beginners from somebody who knows what he’s on about.
The Long Tail
Here’s some interesting analysis from one of Aaron’s links above:
Looking at other long tail studies, I already know broad keyword terms appear to receive the most traffic.
However, the really important point to note here is that we’ve got ~70% of total keyword traffic going to phrases 4 terms or greater, with the remainder going to one to three term phrases.
So really, it’s more accurate to say that the one and two word head terms a website receives don’t actually have the most volume, as the traditional long tail chart depicts – it’s actually phrases of around 3-5 words that make up the sweet spot in terms of highest volume per website.
Of course results may vary depending on the type of website and size, but I’m betting on average you’d get close to the results I came up with.
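You can run this analysis on your own referral data by bucketing keyword traffic by phrase length. The sample figures below are invented for the sketch – the shape of your own curve is the point, not these numbers.

```python
from collections import defaultdict

def traffic_share_by_length(keyword_visits):
    """Bucket search-referral traffic by the number of words in the
    query and return each bucket's share of total visits."""
    buckets = defaultdict(int)
    for phrase, visits in keyword_visits.items():
        buckets[len(phrase.split())] += visits
    total = sum(buckets.values())
    return {length: visits / total for length, visits in sorted(buckets.items())}

# Illustrative analytics export - not real data.
sample = {
    "seo": 100,
    "seo company": 150,
    "seo company scotland": 200,
    "best seo company in scotland": 250,
}
shares = traffic_share_by_length(sample)
```

If, like the study above, most of your share sits at 3-5 words, that is where the optimisation sweet spot is for your site.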
Keyword Research Is VITAL
Well, site authority aside, a large part of optimising ANY page simply involves putting keywords on it. Specific keywords. In specific places. In specific order.
Even on a granular level – NOT having one exact LETTER on the page can mean you are 100% irrelevant for Google – and it’s always been that way.
It is how Google filters and orders results.
Here’s an observation that more accurately describes this:
Hobo is a SEO Company in Scotland. We’re on-site seo largely. We try and make sites better. We operate in Greenock. We don’t promise the earth. Passionate about seo.
That text above is 30 words. Cut and paste the above into Google (or just go here). You’ll see, we have a 100% relevance with this search as all the words are on the page.
Now do the same search but add a letter to the search, or a word that is not on this page (or in links pointing to this page like this). Here I added one letter.
….Google fails to return this page, even though it has thirty of the keywords, all lined up next to each other – surely a possible relevant match that Google clearly fails to return.
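The test above – every word of the query must be present for a 100% match – can be sketched as a crude check. This is a deliberate simplification of whatever Google actually does (it ignores links, synonyms, and everything else), but it illustrates the all-or-nothing nature of the observation:

```python
import re

def tokenize(text):
    """Lowercase and split on non-word characters, dropping punctuation."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def all_terms_on_page(query, page_text):
    """Crude version of the test above: every query word must appear
    somewhere in the page copy for a '100% match'."""
    page_words = tokenize(page_text)
    return all(term in page_words for term in tokenize(query))
```

Add one word to the query that is nowhere in the copy and the function flips to `False` – the same cliff-edge the search example demonstrates.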
Here’s another observation:
NEAR RELEVANCY and EXACT MATCH RELEVANCY
This is an observation about traffic declines and ranking fluctuations in Google SERPS. This is an opportunity that can be detected on your site with some rank data to hand.
Traffic can decline and rankings can fluctuate for a ton of reasons – here’s one reason that I haven’t seen anyone talk about.
It is no secret that Google creates Everflux in its SERPs in various ways, from the obvious (high topical relevance and, for instance, multiple indexes served by multiple datacentres) to the less obvious (location and site quality).
Google also rotates through data centres where your site does not feature AT ALL because you do not have one word on the page. That is – ‘data centre 1′ has you at no2 – and ‘data centre 2′ has you not in the top 100.
Look at this image:
ABOVE is a snapshot of the performance of a 4 word keyword phrase, one of a selection of ‘secondary’ terms, I am interested in.
I track these types of keywords to understand a little about what’s going on at Google on a granular level (note – I don’t test for these, I find them).
If you understand what is happening, you’ll see in the IMAGE ABOVE that one day the page ranks for the term – and the next day it does not. It goes on like that for some months….. sometimes on… sometimes off.
How can you be number 2 in Google in one particular rankings report and nowhere in the next?
Crap rank checker aside, of course, it’s because Google regularly makes small modifications to how it publishes its search engine results pages (at a very granular level) – and in this instance, in one set of results I rank (as high as) no2, and in other samplings I took the next day, another VERY SIMILAR index is served where my page does not rank at all.
But – how can that be?
How can you be RELEVANT and NOT RELEVANT if you’ve made a quality page with a few thousand words on a very clean domain? Especially when the actual phrase Google won’t rank you for is EXACTLY what the page is about?
The answer seems to lie in the NEAR RELEVANCY and EXACT MATCH RELEVANCY results that Google serves us – another way Google shakes up its results.
A NEAR MATCH RESULT for a keyword phrase is when the ACTUAL EXACT MATCH KEYWORD PHRASE used in the search query is NOT present on the page but Google has worked out (looking at the UNIQUE keywords used on the page) that the page in question is VERY PROBABLY a 100% match for this query.
SO – the page could be 100% relevant – but because it does not have the EXACT QUERY used by the searcher in the EXACT SAME WORD ORDERING, it becomes 100% NOT RELEVANT on one, or many of Google’s set of results.
That’s pretty harsh! But… if Google didn’t do this – sites with incredible domain authority would rank for even more.
In the past – Google would send publishers a lot of NEAR MATCH longtail traffic – but now that traffic has been squeezed and is constantly squeezed every day – and it looks as though this is one way Google achieves this.
One could say Google is CONSTANTLY eating away at traffic levels that at one time in the past it would have sent you – but also – it’s sharing that traffic out amongst other sites – so it’s nothing personal – and Google has a lot of competing pages these days to choose from.
We all need to remember that Google has users to satisfy – and delivering pages with the exact match term on them (regardless of your domain authority and very relevant page) could be seen as the safe bet.
Of course – you are never supposed to know about this because Google makes it impossible to find this sort of information in Analytics with most keywords ‘not provided’.
The ONLY way you can check this is by physically checking where you rank – and regularly.
Most people only track the main keywords they want to rank for – rather than the longer tail variations of those keyword phrases – which can be where the most likely transactional customer is hiding.
It’s an easy fix on a granular level.
Just add the actual keyword phrase to that page AT LEAST ONCE and you stabilise rankings across these indexes and double the traffic to that page alone. It’s NEVER been more important or MORE REWARDING (in relative terms!) to do PROPER EXTENSIVE keyword research and know exactly what you want to rank for (and close NEAR MATCHES you SHOULD RANK FOR) and fill these relevancy gaps in Google results.
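The fix above – scan pages for target phrases that never appear verbatim in the copy – is simple to sketch as a gap-finder. The inputs are assumptions: `target_phrases` would come from your rank tracker, and `pages` from a crawl of your own site.

```python
def relevancy_gaps(target_phrases, pages):
    """Find target keyword phrases that never appear verbatim in a
    page's copy - candidates for the 'add the phrase at least once'
    fix described above.

    `pages` maps URL -> page copy; both inputs are assumed exports
    from your own rank tracker and crawler."""
    gaps = {}
    for url, text in pages.items():
        # collapse whitespace so multi-word phrases match across line breaks
        lowered = " ".join(text.lower().split())
        missing = [p for p in target_phrases if p.lower() not in lowered]
        if missing:
            gaps[url] = missing
    return gaps
```

Run over a big site, each (URL, missing phrase) pair is one candidate edit – exactly the kind of granular fix that stabilises rankings across those flip-flopping indexes.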
Of course – I have tested the fix for this on the many occasions I’ve seen this NEAR RELEVANCY issue impact rankings I can influence…..
Just adding ‘original content’ is not enough, and it won’t ever be. As long as a search engine displays organic results based on KEYWORDS and LINKS, and the search engine is intent on manipulating its results on a very granular level like this (and MANY other ways), a search engine optimiser can always get more traffic out of any page.
Does it work this way all the time? Who knows. What makes a page encounter ‘NEAR RELEVANCY’ problems? Well, Google, depending on how it wants to sort results by RELEVANCY or AUTHORITY – and YOU, depending on what keywords you have placed in the text copy on that page. I *think* the example above illustrates a site losing a little domain authority, taking it below a threshold where, if I wanted to stabilise rankings, I needed the EXACT term featuring on the page in question. It may be a totally unrelated issue – but as with many things in SEO, I’m comfortable just having the fix for now, as most understanding about how Google works is abstract at best (on a very granular level).
When I first encountered this years ago I was surprised at the extent to which a page can be RELEVANT and NOT RELEVANT over such a granular issue, flipping between data centres, day to day or minute to minute.
Finding these near relevancy gaps is a best practice for me, as it is the surest way to drive more relevant traffic to the site, and it is all 100% white hat with no risk (unless of course you are keyword stuffing the sh!t out of a low quality page to begin with). It’s why I’ve built a tool to specifically identify these kinds of opportunities and manage deployment of fixes. I can tell you now – the impact of this treatment over a 25 page site is not immediately apparent – but applied over many pages on a much bigger site, it’s a guaranteed way of pushing your relevant traffic north by some way.
Believe it or not – you get the best out of this strategy by following Google’s guidelines to the letter, so we all don’t need to go blackhat just yet (unless of course it’s a point of principle to take that route in the first place).
How you respond to this kind of activity by Google is to do PROPER IN-DEPTH keyword research and ensure that if you REALLY think you should rank for ANY term, the exact term is on the page – it really is getting that granular. Don’t rely on your domain authority or near relevancy (even if it is 100%!) to be enough – and certainly don’t just pump out unfocused text content for content’s sake.
Adding One Word To Your Page Can Make All The Difference
Somebody asked me about this simple white hat tactic, and I think it is probably the simplest thing anyone can do that guarantees results.
The chart above (from last year) illustrates a reasonably valuable 4 word term that I noticed a page of mine didn’t rank highly in Google for, but that I thought probably should and could rank for with this simple technique.
I thought it as simple an example as any to illustrate an aspect of on-page SEO, or ‘rank modification’, that’s white hat, 100% Google friendly and never, ever going to cause you a problem with Google. This ‘trick’ works with any keyword phrase, on any site, with obviously differing results based on the availability of competing pages in SERPs, and the availability of content on your site.
The keyword phrase I am testing rankings for isn’t ON the page, and I did NOT add the key phrase itself…. or put it in incoming links, or use any technical tricks like redirects or any hidden technique, but as you can see from the chart, rankings seem to be going in the right direction.
You can profit from it if you know a little about how Google works (or seems to work, in many observations, over years, excluding when Google throws you a bone on synonyms – you can’t ever be 100% certain you know how Google works on any level, unless it’s data showing you’re wrong, of course).
What did I do to rank number 1 from nowhere for that key phrase?
I added one keyword to the page in plain text, because adding the actual ‘keyword phrase’ itself would have made my text read a bit keyword stuffed for other variations of the main term. It gets interesting if you do that to a lot of pages, and a lot of keyword phrases. The important thing is keyword research – and knowing which unique keywords to add.
This illustrates a key to ‘relevance’ is…. a keyword. The right keyword.
Yes – plenty of other things can be happening at the same time. It’s hard to identify EXACTLY why Google ranks pages all the time…but you can COUNT on other things happening and just get on with what you can see works for you.
In a time of light optimisation, it’s useful to earn a few terms you SHOULD rank for in simple ways that leave others wondering how you got it.
Of course you can still keyword stuff a page, or still spam your link profile – but it is ‘light’ optimisation I am genuinely interested in testing on this site – how to get more with less – I think that’s the key to not tripping Google’s aggressive algorithms.
You can use these keyword research tools to quickly identify opportunities to get more traffic to a page.
Google Analytics Keyword ‘Not Provided’
Google Analytics was the very best place to look at keyword opportunity for some (especially older) sites, but that all changed a few years back.
Google stopped telling us which keywords are sending traffic to our sites from the search engine back in October 2011, as part of privacy concerns for its users.
Google will now begin encrypting searches that people do by default, if they are logged into Google.com already through a secure connection. The change to SSL search also means that sites people visit after clicking on results at Google will no longer receive “referrer” data that reveals what those people searched for, except in the case of ads.
Google Analytics now displays keyword “not provided“ instead.
In Google’s new system, referrer data will be blocked. This means site owners will begin to lose valuable data that they depend on, to understand how their sites are found through Google. They’ll still be able to tell that someone came from a Google search. They won’t, however, know what that search was. SearchEngineLand
You can still get some of this data if you sign up for Google Webmaster Tools (and you can combine this in Google Analytics), but the data even there is limited and often not entirely accurate. The keyword data can be useful though – and access to backlink data is essential these days.
This is another example of Google making ranking in organic listings HARDER – a change for ‘users’ that seems to have the most impact on ‘marketers’ outside of Google’s ecosystem – yes – search engine optimisers.
Now, consultants need to be page centric (abstract, I know), instead of just keyword centric when optimising a web page for Google. There are now plenty of third party tools that help when researching keywords but most of us miss the kind of keyword intelligence we used to have access to.
Proper keyword research is important because getting a site to the top of Google eventually comes down to your text content on a page and the keywords in external and internal links. Altogether, Google uses these signals to determine where you rank, if you rank at all.
Example: How Simple Keyword Research Works To Drive Traffic
Scenario – I had a post a few years old on a blog that got a decent amount of search traffic every month. It received around 200 on-topic, relevant visits every month for a year. It would be nice, I thought, if this page (if still relevant advice) got 500 visitors a month. So I had a look at the post and had a think about how I could get more out of it without a manipulative link building campaign to increase rankings for a specific keyword.
There were 8,521 words on the page (including comments) – the actual article was 5,683 words long – perfect for the long tail of search and an ideal article to test this out on.
Comments were closed for the entire duration of the graph below:
Here are the simple steps I took to get a post that averaged around 200 visitors a month to one that last month got around 700 visitors from organic (natural) search (mostly from Google).
It was a blog post, so the title was kind of short and snappy, with the keyword once.
Title length was under 70 characters so it could be read easily in Google. It had picked up some good relevant links, and was on a mildly trusted domain, and continues to pick up natural links from forums etc.
This is the perfect type of post to do this with.
- I have the content, but what keywords do folk use to search for that content? The Google Keyword Tool might not be 100% accurate, but I used it to quickly identify what Google says are the most popular searches relevant to my post. I usually use SEMRUSH for more in-depth analysis.
- What keyword traffic am I already getting to that page? I looked at the traffic the page was already getting, using Google Analytics, and identified the keyword phrases searchers were using to find this post. (Not that you can do this at all in 2015, with most keywords ‘not provided’.)
- Looking for keyword phrases that drove traffic that I might rank for. I took all these keywords and put them in a rank checker, and over some time identified the top 50-100 referring keyword phrases from both the above sources THAT THE PAGE WASN’T ALREADY NO1 for. At the same time I could see terms I was no1 for that got very little traffic.
- Important keyword Phrases In The Page Title. I optimised the front of my title tag for the two top performing keyword phrases
- Long Page Titles. I used a LONG TITLE ELEMENT and basically didn’t write ‘best practice’ for search engines – I wrote a massive title with all the best performing keywords I could find in Google Analytics (THAT I WASN’T ALREADY NUMBER 1 FOR). I did make the beginning of the title readable, but heavily weighted towards the actual keyword phrases Google said were popular. Title length ended up being over 150 characters – but relevant, and plain English, so not too spammy either IMO.
- Content Tweaking. I modified the content, making it more ‘relevant’ and introducing the odd word here and there that my analysis hinted might bring more traffic if I ranked higher (ensuring, for instance, that the exact phrases I am targeting are on the page).
- Internal Links. I made sure in my internal navigation I linked to the page with the best performing key phrase opportunities from my analysis to introduce them into my link profile (once – just to get them in there – old habit).
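The keyword-selection part of the steps above can be sketched as a small filter: take referring phrases with their visits and current rank, drop anything already at no.1, and keep the best performers for the title and copy. The inputs are assumed exports from an analytics package and a rank checker – not real data.

```python
def title_opportunities(keyword_visits, keyword_ranks, top_n=10):
    """Combine traffic data and rank data: return the highest-traffic
    referring phrases the page is NOT already no.1 for - the phrases
    worth working into the title, copy, and internal anchor text."""
    candidates = [
        (phrase, visits)
        for phrase, visits in keyword_visits.items()
        if keyword_ranks.get(phrase, 100) > 1   # skip anything already no.1
    ]
    # best-performing phrases first
    candidates.sort(key=lambda pv: pv[1], reverse=True)
    return [phrase for phrase, _ in candidates[:top_n]]
```

Phrases already at no.1 are excluded because there is no extra traffic to win there; the remainder, sorted by visits, is the priority list for the tweaks above.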
Simple stuff for more traffic.
More Keyword Research Tips
Here are some good articles for beginners to learn more: