QUOTE: “keyword density, in general, is something I wouldn’t focus on. Search engines have kind of moved on from there.” John Mueller, Google 2014
There is no “best” keyword density. Write naturally and include the keyword phrase once or twice to rank in Google and avoid demotion. If you find you are repeating keyword phrases you are probably keyword stuffing your text.
No one has ever publicly demonstrated a one-size-fits-all, optimal ‘keyword density’ percentage that directly improves rankings.
I certainly do not believe there is a particular percentage of keywords in a page’s text that will get it to number 1 in Google. While the key to success in many niches is often simple SEO, search engines are not that easy to fool.
I do think you run the risk of tripping keyword stuffing penalty filters if you, for instance, were to keyword stuff a page and every element on it with your focus terms.
I write natural page copy that is always focused on the key phrases and related key phrases. I never calculate density to identify the best keyword density percentage, as there are far too many other things to work on (though I did look into this, a long time ago).
Normally I will try to get related terms on the page, and I might have the keywords I am focused on in just a few elements and in the page text.
QUOTE: “KEYWORD DENSITY’ is simply a PERCENTAGE value of the NUMBER OF TIMES a KEYWORD or KEY PHRASE appears on a WEB PAGE compared to the TOTAL AMOUNT OF WORDS on the page.” WIKI
The THINKING is that GOOGLE will order SEARCH ENGINE RESULTS PAGES and reward with HIGHER RANKINGS a page that has a ‘specific’ TARGET keyword density percentage value.
In simple terms – this theory would mean that – if you took TWO WEBSITE PAGES – a page with a keyword density score of say, 2% would OUTRANK a page with a score of 1%. OR – that there is actually a specific keyword percentage that if you score a BULLSEYE on, always results in HIGHER RANKINGS in Google.
We know Google has 100s of potential (and SECRET) search engine ranking factors.
A claim often made is that keyword density is one of these factors.
I do not think that is the case, and there are many other areas of SEO that are far more important than keyword density.
How to Calculate Keyword Density?
There is, of course, a FORMULA to work out the local keyword density of any page. Actually, there are a few different variations out there on the web:
- Density = ( Nkr / ( Tkn -( Nkr x ( Nwp-1 ) ) ) ) x 100
- Density = your keyword density
- Nkr = how many times you repeated a specific key-phrase
- Nwp = number of words in your key-phrase
- Tkn = total words in the analysed text
The keyword density score for a keyword is calculated looking at how many times a specific key-phrase is repeated in a document, the number of words in that key-phrase and the total number of words in the analysed text.
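That formula is easy to sketch in a few lines of Python. The function name and the crude tokenisation below are my own illustration, not taken from any SEO tool:

```python
import re

def keyword_density(text, phrase):
    """Density = (Nkr / (Tkn - (Nkr * (Nwp - 1)))) * 100,
    the multi-word variant of the formula quoted above."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    nwp = len(phrase_words)   # Nwp: words in the key phrase
    tkn = len(words)          # Tkn: total words in the analysed text
    # Nkr: how many times the key phrase occurs in the word sequence
    nkr = sum(1 for i in range(tkn - nwp + 1)
              if words[i:i + nwp] == phrase_words)
    return nkr / (tkn - nkr * (nwp - 1)) * 100

text = "Best seo tools for beginners: a guide to seo tools and more seo advice."
print(round(keyword_density(text, "seo tools"), 1))  # → 16.7
```

Note that even this toy example shows why the metric is slippery: the score changes depending on how you tokenise punctuation and whether you count single words or whole phrases.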
It’s natural to think it’s then important to try to identify that ’SWEET SPOT’ to achieve high rankings in Google. The question is: what is the optimal keyword density? Is it 3%, 0.3% or 33%?
If you look online – you’ll find a LOT of conflicting OPINION on the ideal keyword density %.
It all sounds very scientific.
What Does Google Say About Keyword Density As A Ranking Signal?
John Mueller of Google stated in 2014:
QUOTE: “keyword density, in general, is something I wouldn’t focus on. Search engines have kind of moved on from there.”
Before that, in 2011, Matt Cutts went on record to point out the ideal keyword density is a misconception.
QUOTE: “That’s just not the way it works….. Continue to repeat stuff over and over again then you are in danger of getting into ‘keyword stuffing’.”
Google Webmaster Guidelines state:
QUOTE: “Keyword Stuffing…. results in a negative user experience, and can harm your site’s ranking.”
Google gives advice that can have a few different interpretations.
It’s useful to know what experienced search marketers say about this, too:
What Do SEOs Think About Keyword Density As A Ranking Signal?
Aaron Wall of SEOBOOK called keyword density an:
QUOTE: “overrated concept”
Jim Boykin noted:
QUOTE: “Using a RATIO of keywords to the total text on a page is not a good metric for SEO.”
Bill Slawski said, after seeing very little mention of keyword density in search engine patents over many years:
QUOTE: “I’ve always considered keyword density to be more likely FOLKLORE than fact.”
Rand Fishkin of Moz said that:
QUOTE: “the TRUTH is simply that modern search engines have never used keyword density”.
Dr. Edel Garcia (one of the few information retrieval SCIENTISTS who crossed over into SEO) made clear back in 2005 that:
QUOTE: “(a keyword density ratio) tells us nothing about:
- 1. the relative DISTANCE between keywords in documents (proximity)
- 2. where in a document the terms occur (DISTRIBUTION)
- 3. the co-citation frequency between terms (CO-OCCURRENCE)
- 4. the main theme, TOPIC, and sub-topics (on-topic issues) of the documents
Garcia states this would imply that ‘KD is divorced from content quality’.”
In his article “The Keyword Density of Nonsense”, Garcia summed up:
QUOTE: “the assumption that keyword density values could be taken for estimates of term weights or that these values could be used for optimization purposes amounts to the Keyword Density of Non-Sense.“
Top SEOs have been saying that there is NO optimal keyword density for a long time before Google confirmed it.
Is Keyword Density Of Any Use?
It is useful for a copy editor to be aware of the keyword density of a particular phrase on a page, to avoid keyword stuffing the text. Google, quite possibly, is not using keyword density % values even to identify spam or to apply keyword stuffing filters. Quality raters are asked to examine text for keyword stuffing, though.
What Should Your Keyword Density Be?
Despite what many SEO tools would indicate, there is no “best” keyword density. Write naturally and include the keyword phrase once or twice to rank in Google and avoid demotion. If you find you are repeating keyword phrases, you are probably keyword stuffing your text.
And that’s probably going to ‘hurt a little’, at some point, just like Google’s Matt Cutts said it will.
When copy is limited, why repeat a 3-word keyphrase 10 times, and risk keyword stuffing penalties for a bad user experience, when there are possibly 10 variations and synonyms of the same key phrase that, when added to the same page, make the page more relevant, better quality, and able to rank higher for lots of similar keywords?
As Aaron Wall said:
QUOTE: “Each piece of duplication in your on-page SEO strategy is ***at best*** wasted opportunity. Worse yet, if you are aggressive with aligning your on page heading, your page title, and your internal + external link anchor text the page becomes more likely to get filtered out of the search results (which is quite common in some aggressive spaces).
SO – the sensible thing to do would be to avoid keyword stuffing your PRIMARY CONTENT text.
Also – it’s probably wise to invest a little time in making your page RELEVANT but all the time keeping it simple:
I focus on keyword stemming opportunities… with a focus on the LONG TAIL of search as well as the HEAD:
- seo tools
- best seo tools
- best seo tools for beginners
I also focus on the relative prominence of the term in the document, for instance:
- Is the key phrase in the <TITLE> element
- Is the key phrase in <p> tags
- Is the key phrase in image alt attribute text
- Is the key phrase in the URL
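That prominence checklist can be automated with a rough sketch like the one below, using only Python’s standard-library HTML parser. The class and function names are my own, and this is a simplified illustration (real pages need more robust parsing), not any official tool:

```python
from html.parser import HTMLParser

class PhraseChecker(HTMLParser):
    """Record where a key phrase appears: <title>, <p> text, image alt text."""
    def __init__(self, phrase):
        super().__init__()
        self.phrase = phrase.lower()
        self.stack = []  # open tags enclosing the current text
        self.found = {"title": False, "p": False, "alt": False}

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            # img is a void element: check its alt attribute, don't push it
            alt = dict(attrs).get("alt", "") or ""
            if self.phrase in alt.lower():
                self.found["alt"] = True
        else:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        for element in ("title", "p"):
            if element in self.stack and self.phrase in data.lower():
                self.found[element] = True

def check_prominence(html, url, phrase):
    checker = PhraseChecker(phrase)
    checker.feed(html)
    # hyphenated form is the usual URL slug convention
    checker.found["url"] = phrase.lower().replace(" ", "-") in url.lower()
    return checker.found

html = "<title>Best SEO tools</title><p>A guide to seo tools.</p><img alt='seo tools chart'>"
print(check_prominence(html, "https://example.com/seo-tools/", "seo tools"))
```

A quick presence/absence check like this is about as far as automation usefully goes; it tells you whether the phrase appears in the key elements, not what density to aim for.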
I focus on introducing more unique words, single and plural, abbreviations, synonyms and co-occurring phrases relevant to the topic of a page.
This is a better use of time than calculating the keyword density percentage of one keyword phrase.
There is no magic one-size-fits-all SEO tactic. It’s not that easy, outside of black-hat tactics. Not that black-hat SEO is always easy, either.
Google’s trending towards rating and ranking your pages based on the quality and reputation of your website, and the quality and reputation of content on individual pages.
Google’s interested in the expertise of the actual PERSON WRITING the text it is rating, and is working out whether USERS actually LIKE your page RELATIVE to COMPETING PAGES on the web.
They are also more interested in detecting if users are actually seeking out your page amongst the competition.
Google’s more interested in user satisfaction signals, and you should be too.
More so than keyword density, for sure.
Keyword Stuffing (Irrelevant Keywords)
QUOTE: “Keyword Stuffed” Main Content Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.” Google Search Quality Evaluator Guidelines 2017
Keyword stuffing is simply the process of repeating the same keyword or key phrases over and over in a page. It’s counterproductive. It is a signpost of a very low-quality spam site and is something Google clearly recommends you avoid.
QUOTE: ““Keyword stuffing” refers to the practice of loading a webpage with keywords or numbers in an attempt to manipulate a site’s ranking in Google search results.“ Google Webmaster Guidelines 2020
Keyword-stuffed text often makes your copy unreadable and is, as such, a bad user experience. It can get a page demoted in Google, though that depends on the intent and on the trust and authority of the site. It is sloppy SEO.
It is not a tactic you want to employ in search of long-term rankings.
Just because someone else is successfully doing it, do not automatically assume you will get away with it.
Don’t do it – there are better ways of ranking in Google without resorting to it.
John Mueller said in a 2015 hangout: “if we see that things like keyword stuffing are happening on a page, then we’ll try to ignore that, and just focus on the rest of the page”.
Does that imply that what we call a keyword stuffing “penalty” for a page, Google simply calls ‘ignoring that‘?
From what I’ve observed, pages can seem to perform badly after sloppy keyword phrase stuffing, although they can still rank for long-tail variations of it.
QUOTE: “The bottom line is using more relevant keyword variations = more traffic”. Aaron Wall, 2009
He went further, with a piece of advice that is still excellent today:
QUOTE: Each piece of duplication in your on-page SEO strategy is ***at best*** wasted opportunity. Worse yet, if you are aggressive with aligning your on page heading, your page title, and your internal + external link anchor text the page becomes more likely to get filtered out of the search results (which is quite common in some aggressive spaces). Aaron Wall, 2009
Google’s advice in the past about keyword stuffing was to:
QUOTE: “Just find the hidden text or the keyword stuffing and remove it.” Matt Cutts, Nelson, Google 2013
Focus On The User
As Google says in their manifesto:
QUOTE: “Focus on the user and all else will follow.” Google, 2020
It is time to focus on the user when it comes to content marketing, and the bottom line is you need to publish unique content free from any low-quality signals if you expect some sort of traction in Google SERPs (Search Engine Result Pages).
QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2015
Those in every organisation with the responsibility of adding content to a website should understand these fundamental aspects about satisfying web content because the door is firmly closing on unsatisfying web content.
Low-quality content can severely impact the success of SEO.
SEO copywriting is a bit of a dirty word, but the text on a page still requires optimisation, using familiar methods, albeit published in a markedly different way than we, as SEOs, used to get away with.
Opinions: What Is The Ideal Keyword Density For A Page?
Almost ten years ago, I asked some of the world’s top Google SEO people and bloggers what they thought about keyword density (KD) in SEO after talking privately about the subject with Tedster, of Webmasterworld (RIP). The discussion revolved around the question of ‘is there an ideal, a perfect or safe amount to aim for to improve rankings?’
- Keyword density (SEO) is the percentage (%) of times a keyword or key phrase appears on a web page in comparison with the total number of words on the page. …
QUOTE: “Hi Shaun, Did you catch my little provocation in the SEOmoz interview? My point of view may not be the majority opinion among webmasters, but I came to it by studying data from the SERPs (Search Engine Results Pages) (there’s quite a wide variation in KD) and by reading the search engine patents of recent years. That especially includes Google’s six phrase-based indexing patents, as we discussed on WebmasterWorld
And now for some history. In the 90s this idea caught fire that there was a movable “sweet spot” in the ranking algorithms for KD. The idea was that the dial would get turned all the time, especially at AltaVista – which was the “do or die” place to rank in those days. Some early SEO software attempted to reverse engineer the various theoretical sweet spots in the algorithms on a monthly basis – for density, prominence, occurrence and other factors.
That was the 90s, with search engine algorithms that were dumb as a doorpost. Whether any of them really used KD as a direct metric I can’t say with certainty – but I even doubt that. At any rate, today’s algorithms handle keyword stuffing abuses almost as a side effect of the many elements they are processing. They don’t even NEED to take a direct measurement.
This doesn’t mean that a density tool can’t give a Webmaster some useful feedback. It can alert you when you go way overboard and don’t realize it. Likewise, you’ll get a wake-up call if you overlook having even a single use of your target keyword in text.
With so many tools online to attract eyeballs, this idea seems to be a myth that will not die. Many Webmasters swear by it and just assume that density is somehow a sophisticated SEO tool that they must use to succeed online.
But among professional SEOs, you won’t usually hear such talk. For example, Rand Fishkin and I see eye to eye on this. Check out his article on Moz, where he surveyed 37 prominent SEOs about search engine ranking factors. The word “density” is not even on the page!”
QUOTE: Like everything in search – it has evolved. I think the old kw density calc is the new proximity calc.
- If the keyword isn’t on the page – it isn’t going to rank well (or at all) for that keyword.
- If the keyword isn’t in the title of the page, it is going to be tougher to rank for that keyword.
- If the keyword isn’t in the url, the task becomes more difficult.
- What about in a big header on the page?
- What about high on the page, or strategically spaced throughout the document?
- Offsite density? Anchor text is another type of density.
I think KD needs to be changed to proximity density. It is closer to a heat map today than the pure numbers game of old.
Aaron Wall; ‘An Over-rated Concept’
QUOTE: I think KD is an over-rated concept. Even with similar keyword densities one page may rank while another does not. And that’s true even if they have the same link profile. That in and of itself should show the (lack of) value of KD.
To explain how that concept works, consider a page that uses the exact same keywords at the start of the page title, at the start of their h1 tag, and in all their inbound anchor text. It may get filtered for being too closely aligned with the target keyword. Now imagine that the same page is redone, shifting word order in some spots, shifting singular to plural in some spots. Now the same page may not get filtered even if it has the same or similar KD.
KD also has two toxic side effects. Some people write what ends up sounding like robotic copy. Others, in an attempt to increase KD, end up editing out important keyword modifiers and semantically related phrases, which not only lowers their traffic (since they took many relevant words off the page), but also makes their page look less like other top ranked pages.
Ruud Hein; ‘The idea of KD has the attractiveness of the flat earth argument: it “just makes sense” and “everyone can see it for himself!’
QUOTE: It seems common sense that a document about Google will use the word Google more often while a document about Yahoo will use the word Yahoo more often. It also seems common sense that there should be some kind of cut-off point after which things don’t become more relevant upon repetition but instead become spam.
In other words: there must be an optimum ratio of keywords:words. KD! Ta-da!
The idea: if you are within a certain range, the “sweet spot”, you’re relevant. Under it and you’re irrelevant. Over it and it’s spam.
There are some clues we can use to figure out if our “well, it must be so” observations are correct or not.
A very compelling clue is that search engines are in the science of information retrieval — and that in the science of information retrieval KD doesn’t play a role. Apart from academic “proof of (non) concept” models, there are no information retrieval models based on KD, certainly not commercial ones. This should be more than a clue to us. It should be an annoyingly loud alarm bell: if I reason with theory of KD but the very science behind search engines doesn’t give that theory any credibility … am I still on the right path?
Another clue comes from thinking about the words we use. One document has a KD of 3.25%, another a KD of 0.05%. Which one would be in the relevant KD range? … Now what if I were to tell you that the 0.05% keyword is mataeotechny (an unprofitable art or science… like KD), a word that appears 55 times on the web (56 times now…)? Some words “weigh” more, “mean” more simply because they’re less used than others. The theory of KD as a prediction model of relevancy fails terribly here, giving enormous weight to commonly used words and hardly any to rare words.
Yet another clue is the formula to arrive at “relevant” KD. That formula goes “number of keywords on words” then some magic happens “is relevant or not”.
If KD were to be used to provide some kind of cut-off point, some kind of spam filter…. how would the cut-off point be calculated? By calculating the KD of every document, then taking the means of that? But what about our mataeotechny example? Oh, you would like to account for words that appear less often in the index? You just left the KD building and crossed the street into term weights.
If your gut keeps telling you this just has to be true, I recommend reading and rereading the articles by Dr. E. Garcia until you either “get it” or can show for yourself where he blunders.
QUOTE: Shaun – Repetition of keywords seems to have at least some effect on the rankings for those terms, particularly when combined with other factors such as the use of heading tags and title tag. However the effect is quickly lost if you stuff the keywords.
If you imagine that the glass can only contain a finite amount of liquid and your keywords are separate glasses, the more keywords the more glasses how you divvy up the liquid is almost irrelevant as you still have only a certain amount of liquid to start with.
Lyndon Antcliff; ‘I don’t do it mathematically’
QUOTE: Yes and no. I don’t do it mathematically, but I make sure the keyword is there, and in the title and h1 tags ect.
I guess I have done it long enough I don’t really think about. I think the antonyms and synonyms are more important than density, in fact there are a number of factors which are.
But I think it’s best not to obsess and to concentrate on a natural feel; if that is achieved, correct KD will come naturally.
Sebastian; ‘Optimal percent is a myth’
QUOTE: Oh well, I thought that thingy was beaten to death already. “Optimal KD” is a myth.
Today’s search engines are way too smart to fall for such poor optimization methods.
Even a single inbound link with a good anchor text can boost a page lacking the keyword in question so that it outranks every page with tuned KD.
QUOTE: Focus more on writing good, relevant content, proper page structure and decent link building than KD. I remember when I first started in SEO, I had a Desktop Software Application checking my pages and telling me that I was short in my KD. So I stuffed more keywords in till the application was happy.
Then I released it into the search engines. The page never really ranked that highly. What was worse the client wasn’t too happy that his page read crap as well. I’ve never looked at KD since.
Barry Welford; ‘KD gets less and less relevant all the time…’
QUOTE: Hi Shaun – Happy to get involved. KD gets less and less relevant all the time, at least for Google with Latent Semantic Analysis, Personalized Search, etc., etc.
Most results come from the ‘long tail’ of combinations of keywords. What counts is conversions to sales, if that’s your real business objective. Poorly executed SEO may even work against conversions if it turns off human prospects.
John Carcutt; ‘Natural language seems to fare just as well if not better’
QUOTE: Ask around; what is the best keyword density for a web page to rank well for a given term? Searching on the internet I found answers ranging from 2% to 12% and one as high as 20%. The interesting thing is they could all be right.
The one thing many people fail to take into consideration when looking for this magical number is the idea that it changes based on factors related to the page or search term. Additionally, its importance in the algorithm may also fluctuate based on external influences. Instead of hunting for that perfect density, it may help to better understand what part keywords play in getting a page ranked.
I shouldn’t have to say it, but unfortunately I do; a keyword or phrase needs to be on the page in order to rank well for the term. Can a page rank if the term is not on the page? Sure if it has inbound links using the terms, but it’s not going to rank very well on those alone. Using the keyword or phrase in a variety of ways throughout a page will greatly increase the chances of showing up higher in the rankings for that term.
Now back to density… Proper KD is a moving target. Two main factors are the total amount of words on a page and the competitiveness of the phrase in the engines.
When there are very few words on a page 6% density is a tough target to hit and make the copy readable. However, when the page has a large amount of copy 6% is much more manageable. When analyzing a page 6% of 1000 words may seem much less “spammy” than 6% of 100 words. The optimal KD of a page will change based on how many total words are on the page.
If a keyword phrase is unique and the competition in the search engines is low, a much lower or much higher KD may work just fine. The overall effect density has on search results is much broader when there is little or no competition. As the competition for a phrase increases, the KD target becomes more critical. Ironically, the density also plays a smaller and smaller part in ranking as the competition for a phrase increases.
To be fair, I tell people on a regular basis to target a 4% KD on a page. I do this primarily to get them thinking about how to use keywords on a page. I find having a set target is a good motivator and really helps a Webmaster or site owner to understand the importance of targeting a page to a specific phrase or set of words.
The hunt for the perfect KD is slowing down as more people realize natural language seems to fare just as well if not better in the search engine results. If you understand the fundamentals of targeting a page for a phrase, there is no reason to worry about KD. Just write good copy.
QUOTE: KD plays a significant role in ranking but like your meta data, domain age, backlinks, anchor text, or any other aspect of your page and domain, how your page ranks is always determined by the sum of its parts. Surrounding content and the amount of times in which your keyword phrase appears says a lot about the page and what it means. In fact, it should be obvious that its one of the most important indicators.
Using simple techniques such as bolding your phrase or placing it within H2 tags will stress the importance of this phrase when your page is being crawled. Other things to consider would be placement within the page URL, title, description, and linking your phrase to a site that also speaks to the content you’re creating. Be sure not to overdo it, however. If you’re keyword stuffing and it looks spammy to you, then the chances of it looking spammy to a bot are probably pretty high. After you create your page you can use a simple density checking tool like http://www.ranks.nl/tools/spider.html to see how often your phrase is showing up.
Bill Slawski; ‘more likely folklore than fact’
QUOTE: Shaun – Just for a different perspective, I took a look at the USPTO database, which only goes back to the early 2000s, and at Google Scholar.
There are 15 granted patents and 48 patent applications that use the phrase “keyword density.” None of those are from Google or Yahoo, and only a very few are from Microsoft and IBM, which also work in enterprise search. A number of the patent filings were applied for by Overture around the time of their acquisition by Yahoo, but focus upon paid search, referring to KD as something that non paid search may be using.
Google Scholar reveals 208 instances of the phrase KD and none of the documents listed appear to come from anyone working at a major search engine, though a 2006 paper from a Lycos researcher suggests the use of KD.
I’ve always considered KD to be more likely folklore than fact. I don’t think that will change.
Jim Boykin; ‘not a good metric for SEO’
QUOTE: Using a ratio of keywords to the total text on a page is not a good metric for SEO anymore.
Yes, your keywords should be on the page…but beyond that, writing “naturally” is better SEO than worrying about KD.
Shana Albert; ‘I don’t use a calculator’
QUOTE: Personally, I don’t use a calculator… nor do I count the words in my post, but I am careful about the keywords I choose and I do eyeball my posts to see roughly how long they are. I’ve been a Webmaster enough years now that I don’t need to calculate the amount of words in my articles to know roughly how many keywords I would need to make the KD about right.
I have found that if I worry about the amount of times that a keyword or key phrase needs to appear throughout one of my posts or articles then my writing doesn’t flow very well. And, if my articles don’t flow well…. I’m going to lose my readers. If the people arriving on one of my sites don’t enjoy reading my work it doesn’t really matter if readers can find my in the SERPs or not….. they won’t be sticking around long enough to finish reading my choppy, non-flowing article. So, I try to worry less about keywords and more about content.
Don’t get me wrong…. I still think about KD. It’s just not my main focus….the content is. I come up with the keyword(s) I want to focus on in my post and then write. If I need to tweak my post with more or less keywords once it’s written…. I do so then.
Tad Chef; ‘I stopped “measuring” KD years ago’
QUOTE: I stopped “measuring” KD years ago. Instead I concentrated on keyword placement on the page using a rule of thumb stating that 3 instances of a keyword in the page copy is the minimum, plus one for each 100 additional words you write makes sense. So I focused on the “where in the copy”, using the keyword in the first sentence of the first paragraph etc. A year ago Google introduced the “Google bomb filter” which in practice checks if a page that is linked with a certain anchor text also contains this keyword.
At the end of 2007 I could test this as a client of mine was unable to grant me access to his site for internal reasons and I had to start with off site optimization first. He did not rank at all for the keywords I did link building for as long as the pages I linked did not contain the keywords.
So it is obvious that you still have to tell Google on the page what it is actually about. So you might want to check out which terms or words are the most used on your page. On the other hand you should always think of the user first as some terms just aren’t suitable to be repeated too much. Google does an increasingly good job at identifying synonyms, acronyms and different spellings as one and the same term. So try to sound natural above all as otherwise the engine will find you but your visitors will bounce. Btw. Yahoo does not like high KD at all.
Matt Ridout; ‘I never calculate KD’
QUOTE: This is a topic I’ve heard a lot about from all corners of the web and everyone seems to have a varied opinion on it. I can only base my answer on my personal experience and my clients’ experience.
Is it a myth – no. If you want to rank for a keyword it obviously needs to be visible on the page, this should be a common understanding. Not just in the body copy but tagged appropriately and in the page title, description etc.
I never calculate the KD at all, it’s like saying to an artist you have too much red on your canvas, use a calculator to work out how much more to add or subtract from the painting. If you follow simple SEO guidelines and do good keyword research you should be fine. At the end of the day it’s about the user experience on your site that you should be concentrating on, and stuffing a page full of keywords will just take something away from their experience and could harm your brand.
Bill Hartzer; ‘I don’t spend a lot of time measuring KD’
QUOTE: At this point in the game, in 2008, I don’t spend a lot of time measuring KD. I believe that, overall, there are a lot of other factors that weight in just as much–if not more–than KD.
If you feel that you need to measure it, I would take a look at the current search results pages: measure the KD of the top 5-10 pages that are ranking well and get an average. I wouldn’t go too much higher or too much lower than what the average KD is on those pages that are already ranking well.
But again, I recently overheard a search engineer say, “KD is the biggest myth out there right now.”
Hamlet Batista; ‘two fundamental flaws’
QUOTE: I don’t believe modern search engines use KD as one of their query-dependent ranking factors. It, as we know it, has two fundamental flaws:
- KD is only a local weight. The fact that a word appears many times on a specific page doesn’t help much in telling what the page is about when comparing it to other pages in the index. For example, what if the word that repeats the most is “www”? Google counts 21,940,000,000 documents with that word. That is probably not what most of those pages are about.
- Keyword density is easily manipulated by enough repetition.
I believe, as explained by Dr Garcia, that what search engines really use is term/keyword weights. Term weights don’t have the same flaws KD has.
Keyword weights are computed by: KW = Local × Global × Normalization.
- Keyword weights consider both local and global weights. A phrase that appears many times in a document but also appears in many other documents should have less weight than one that doesn’t appear as often. We can call this “rarity”. The only way search engines can tell documents apart is by paying attention to what words make them different. This is possible thanks to the Global component of that equation.
- Keyword weights are normalized. In order to avoid the difference in document sizes and repetition issues, weights are normalized. That is, their values are replaced by corresponding (directly proportional) values between 0 and 1.
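The formula and the two bullet points above can be sketched as classic TF-IDF with unit-length normalisation. This is a toy illustration under my own assumptions (the corpus, tokenisation and function names are invented), not a claim about any engine’s real implementation:

```python
# Hedged sketch of the weighting quoted above (KW = Local * Global *
# Normalization) as classic TF-IDF with unit-length normalisation.
# Corpus and numbers are invented purely for illustration.
import math
from collections import Counter

docs = [
    "cheap flights to rome cheap flights deals",
    "rome travel guide ancient rome history",
    "flights hotels and travel insurance",
]

def tf_idf_weights(docs):
    tokenised = [d.split() for d in docs]
    n = len(tokenised)
    df = Counter()                      # global component: document frequency
    for doc in tokenised:
        df.update(set(doc))
    weights = []
    for doc in tokenised:
        tf = Counter(doc)               # local component: term frequency in this doc
        raw = {t: tf[t] * math.log(n / df[t]) for t in tf}
        # normalisation: scale each vector to unit length, so values sit in 0..1
        norm = math.sqrt(sum(v * v for v in raw.values())) or 1.0
        weights.append({t: v / norm for t, v in raw.items()})
    return weights

w = tf_idf_weights(docs)
# "cheap" (rare in the corpus, repeated locally) outweighs "rome" (shared
# with another document) in the first document's vector:
print(w[0]["cheap"] > w[0]["rome"])
```

Real IR systems use more elaborate variants (BM25, for example), but the Local * Global * Normalization structure is the same: repetition alone inflates only the local component, and normalisation caps what that repetition can buy you.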
The vector space model is one approach that has been described as a way to measure term weights. Cosine similarity is a very interesting concept: if/when search engines implement it, we will see search results where the keywords do not appear in the content of the page or in the text of the links pointing to it. I personally don’t think the vector space model is currently in use in modern search engines; the size of the vectors needed to make such computations at query time is simply too big. PageRank computation uses matrices of massive size, but PageRank is query-independent and is pre-computed before any query is performed.
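The cosine similarity mentioned above fits in a few lines. The term vectors below are invented toy numbers, not real engine output; the sketch only shows the scoring step the vector space model would use:

```python
# Illustrative only: cosine similarity between two sparse term-weight
# vectors, the scoring step a vector space model would use. All weights
# here are made-up toy values.
import math

def cosine(a: dict, b: dict) -> float:
    """Angle-based similarity of two sparse vectors, in [0, 1] for non-negative weights."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)

query = {"rome": 1.0, "flights": 1.0}
page_a = {"rome": 0.6, "flights": 0.5, "deals": 0.3}
page_b = {"rome": 0.2, "history": 0.9}
print(cosine(query, page_a) > cosine(query, page_b))  # page_a matches the query better
```

Because the score depends on the angle between vectors rather than raw counts, padding a page with extra repetitions of one term changes its weight profile, not its length advantage, which is one more reason a raw density percentage is the wrong lens.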
From the SEO point of view, I do see some limited use for KD, though. Let’s say for example, when you are simply comparing a single page to another for a very specific keyword you are targeting. Remember that when search engines compute the weights they are trying to determine the relevance of each page; but when we see the page ranking we already know that. So, we only need to determine why the search engine deemed that page important for that particular phrase. Assuming off-page factors are the same/similar, the KD can be useful in figuring that out as the term weight will be directly proportional.
Comparing top ten pages, averaging their values and thinking about a perfect KD of x% is definitely a waste of time.
QUOTE: hmmmm….. Is ‘KD’ sh|te? I love that term. Write sh|te in exactly 2.5 percent of the total words in your post and you “rank number one for sh|te”.
That is the basis for KD. Obviously, every document has a specific KD for any given keyword. That doesn’t mean that Google has weighted each word so that tweaks in word number would always improve rankings.
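To make the point above concrete: KD is trivially computable for any document and any phrase, which is exactly why the number carries so little signal. A minimal sketch of the percentage formula (the tokeniser and sample text are my own, purely illustrative):

```python
# Minimal sketch of the KD formula: occurrences of a keyword as a
# percentage of total words. Tokeniser and sample text are illustrative.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return occurrences of `keyword` as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

text = "seo tips: write naturally, and mention seo only where it helps the reader"
print(f"{keyword_density(text, 'seo'):.1f}%")  # ~15.4% for this 13-word snippet
```

Every page scores *something* on this formula for every word it contains; nothing in the calculation connects that number to rankings.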
The best any SEO can do? Test, make a hypothesis, test again. Even after multiple iterations, the test would still only provide anecdotal evidence that KD matters. Plus, there is no way to isolate KD, or any single element, from all the variables of a test, or from the dynamic natural search landscape. Do your SEO competitors have zero impact on your SEO tactics?
The effectiveness of optimizing metadata elements always stimulates great debates. What is unique about KD? It ostensibly has an ‘optimal’ percentage. That is one reason why the KD theory is so often ridiculed.
The most-cited debunking of the myth, The Keyword Density of Non-Sense, was written by Dr. Edel Garcia (Orion), whose good friend, Mike Grehan, asked him after SES New York (2005) to do something about the unproven KD theories swirling around.
You can find the study in Mike Grehan’s newsletter, then co-authored with Christine Churchill, CEO of Key Relevance.
Garcia wrote an analysis combining IR (information retrieval), semantics and math, but offered “no conclusion so readers could draw their own”.
Nacho Hernandez brought this article to Rand Fishkin’s attention in the Search Engine Watch Forums. Rand was 90 days into developing a tool to measure on-page term weight. After reading Orion’s article, he concluded “only an extraordinary budget and very talented programmer could build such a thing.”
There’s a grain of truth in KD theory: Google does look at KD in spam reduction, setting an undefined upper limit for keyword stuffing. Michael Gray has even debunked that concept with anecdotal evidence, showing how pages with insanely high KD can still rank well.
Sexy SEO; ‘Snake Oil SEO’
QUOTE: KD? Why don’t you ask about meta tags or submit robots instead? Do you think I am ancient enough to remember that mouldy question of the early 90s? Well, believe me I am not! But I have something to say, but only if you ask. Honestly, it’s a great gimmick of all those snake oil SEOs who hit their customers and run away with their dollars. Yes, the concept is easy to grasp and even the dumbest of the dumb will see that you are doing some work on their site. Yes, it might possibly push the page in question 10 positions up in SERPs from page 2000 to page 1999. Your customer will even see the result this way. Ugly, dirty, but it works. Great concept.
Now seriously, if 10 years ago it might have been one of just about a dozen factors counted by search engines in their ranking algorithms, nowadays they’ve become way more sophisticated. Certainly it never harms to have your target keywords in the text of your page, and preferably not all in one sentence, but that’s the ABC of SEO. It sometimes helps to have one keyword of a pair repeated much more often than the second one. And no doubt you should use your target keywords in the URL, title, meta tags etc, but that’s not even KD proper.
How far would you say on-site optimization has declined in importance over the past 10 years? Now how about on-page optimization? Well, the importance of KD as a ranking factor has decreased proportionally, and even if some might find that fact a bit inconvenient for their sales tactics, it still stands as a fact.
No time-wasters next time please!
Wiep Knoll; ‘better to focus on keyword presence’
QUOTE: Instead of looking at KD, I think it’s better to focus on keyword presence. Make sure that you’ve put the keyword(s) you’re targeting in your page’s title tag, meta description and in the content part.
Don’t stuff in extra keywords just to get that magic 3.22% or 7.08% KD (or whatever percentage you’re aiming for), but make it read naturally instead.
If you let someone else read the text and he or she thinks it’s a good read and can explain exactly what the page is about, you’re probably ok. The anchor texts of incoming links and the surrounding text of those links will do the rest…
Brian Clark; ‘KD a non-factor’
QUOTE: As far as I’m concerned for Google KD is a non-factor. I’m not saying the algorithms don’t take it into account at all, but I am saying it’s a bit fruitless to even worry about. Plus, in this day and age of the link and conversion mattering most, worrying about KD when you should be focusing on clear, actionable copy seems to be beside the point.
Keyword frequency matters to a certain degree, one would think. But again, if your writing comes off stilted and awkward, you’re shooting yourself in the foot. Pay attention to titles and subheads, and creatively make the keywords and copy flow at those crucial points. Then go back and read the rest afterwards.
If your keywords and a few choice synonyms didn’t show up naturally in the body text, you’re probably not covering the topic all that well.
Brian Turner; ‘Do I use KD? No’
QUOTE: It’s always important to properly utilise keywords on a page in such a way as to describe
- the meaning of the page,
- the uniqueness of the page, and
- the action required for users (if any) on the page.
Google & co have published various pages over the years that show that:
- they understand that there are linguistic relationships between certain types of words, whether between individual keywords or even acronyms, and
- block analysis should be presumed to be already in play, so work as though search engines can determine the meaning not simply of paragraphs, but also of individual blocks of text.
Page copy should ideally look to justify the keywords in the titles, headers, and further links by directly referencing these in the text, plus related keywords as required, and all in a format that enhances readability for human users in the relevant text areas of a page.
Do I use KD? No – I think the aim is to write intelligent copy and it’s important to bear in mind the impact of major ranking factors such as domain authority, page titles, and links (on-page and off-page).
If non-SEOs try to focus on KD, I think they are more likely to overlook these, and to treat KD as nothing more than a way to reduce useful pages into unreadable spam that degrades the user experience, has little or no ranking impact, and prevents the page from converting as intended.
However, if a really good SEO copywriter uses any particular method in their craft, I’m not going to denigrate it as the most important thing in my opinion is simply a successful outcome, regardless if any part of the process may seem esoteric to outsiders.
QUOTE: I never aim for a ‘good’ KD for Google – I firmly believe the word only needs to be featured once on a good quality web page. If a keyword phrase is in the links, anchor text, title and on the page, that’s good enough. I never calculate KD, and I never use KD tools or density checkers to try and measure an optimum KD – if I have time to calculate KD, I should have time to look at more rewarding areas of site optimisation or authority building – all well within Google guidelines of course.
Does KD matter? I don’t think so. Hmmmm I wonder if I used ‘KD’ enough in this blog post (Insert%) :)
QUOTE: What is the correct KD for Google? It is not really my thing because with blogs, if you have a keyword in the title, your KD changes depending on comments and trackbacks including the words.
If you don’t use a description, and even when you do, you quite often end up with the text for a trackback appearing in the snippet.
If you really want to maintain density, you can use a commenting system such as Disqus, but then your comments are hosted on a different domain, and you lose the benefit of the long tail and update frequency.
Rand Fishkin: ‘Modern Search Engines Have Never Used KD’
QUOTE: Shaun – the truth is simply that modern search engines have never used KD. Look through any intro to information retrieval course in any university on the planet and you’ll see that it’s been debunked as a high-cost, low-return metric. Instead, they use term weight – TF*IDF – check out some good work on the subject from Dr. Edel Garcia (one of the few information retrieval scientists who’s crossed over into SEO):
The vast majority of SEOs here think there’s no optimal KD percentage. It does not matter if you use WordPress, Joomla or Drupal (or any CMS): most modern search engines probably do not count KD when analysing an HTML web page, blog post, title, headers or links for ranking purposes. Instead of using tools to measure KD, think about keyword optimisation in terms of keyword prominence, keyword proximity and co-occurring keywords in a document.
It’s worth noting what Google says on keyword stuffing.
There is a lot of SEO ‘advice’ out there, ranging from keeping KD under a number like 10% to keeping it above 1%. The truth is, keyword density is probably an SEO myth, according to most professionals.