QUOTE: “Creating compelling and useful content will likely influence your website more than any of the other factors.” Google, 2017
There is no one particular way to create web pages that successfully rank in Google, but you must ensure:
QUOTE: “that your content kind of stands on its own” John Mueller, Google 2015
If you have an optimised platform on which to publish it, high-quality content is the number one user experience area to focus on to earn traffic from Google in 2018.
If you need help with optimising your website content, I can offer this as a service. If you want to learn how to achieve it yourself, read on.
Content quality is also the number one area to focus on if you are to avoid Google Panda demotion. Panda, we have been told, is part of Google’s core site quality rating algorithms in 2018.
Whatever that means, of course, to those unfamiliar with Google quality algorithm updates.
This article aims to cover the most significant challenges of writing ‘SEO-friendly’ text and web page copy for Google in 2018. High-quality content is one aspect of a high-quality site.
I discuss what’s in play, regarding what types of pages Google is ranking and the types of activity around this content that helps it rank in the first place.
It is NO LONGER about adding keywords to pages with 300 words of text. In fact, that practice can be toxic to a site.
Google clearly says that the practice of making your text more ‘unique’, using low-quality techniques like adding synonyms and related words is:
QUOTE: “probably more counter productive than actually helping your website” John Mueller, Google
Focus On The User
As Google says in their manifesto:
QUOTE: “Focus on the user and all else will follow.” Google
It is time to focus on the user when it comes to content marketing, and the bottom line is you need to publish unique content free from any low-quality signals if you expect some sort of traction in Google SERPs (Search Engine Result Pages).
QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2016
Those in every organisation with the responsibility of adding content to a website should understand these fundamental aspects of satisfying web content, because the door is firmly closing on unsatisfying web content.
Low-quality content can severely impact the success of SEO in 2018.
SEO copywriting is a bit of a dirty word – but the text on a page still requires optimising, using familiar methods, albeit published in a markedly different way than we, as SEOs, used to get away with.
Optimise For User Intent & Satisfaction
QUOTE: “Basically you want to create high-quality sites following our webmaster guidelines, and focus on the user, try to answer the user, try to satisfy the user, and all eyes will follow.” Gary Illyes, Google 2016
When it comes to writing SEO-friendly text for Google, we must optimise for user intent, not simply what a user typed into Google.
Google will send people looking for information on a topic to the highest quality, relevant pages it knows about, often BEFORE it relies on how Google ‘used‘ to work e.g. relying on finding near or exact match instances of a keyword phrase on any one page, regardless of the actual ‘quality’ of that page.
Google is constantly evolving to better understand the context and intent of user behaviour, and it doesn’t mind rewriting the query used to serve high-quality pages to users that more comprehensively deliver on user satisfaction e.g. explore topics and concepts in a unique and satisfying way.
Focus on ‘Things’, Not ‘Strings’
QUOTE: “we’ve been working on an intelligent model—in geek-speak, a “graph”—that understands real-world entities and their relationships to one another: things, not strings” Google 2012
Google is better at working out what a page is about, and what it should be about to satisfy the intent of a searcher, and it isn’t relying only on keyword phrases on a page to do that anymore.
Google has a Knowledge Graph populated with NAMED ENTITIES and in certain circumstances, Google relies on such information to create SERPs.
Google has plenty of options when rewriting the query in a contextual way, based on what you searched for previously, who you are, how you searched and where you are at the time of the search.
Google Is Not Going To Rank Low-Quality Pages When It Has Better Options
QUOTE: “Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly.” Google
If you have exact match instances of key phrases on low-quality pages, these pages mostly won't have the compound ingredients it takes to rank high in Google in 2018.
I was working this way long before I understood it well enough to write anything about it.
Here is an example of taking a standard page that did not rank for years and then turning it into a topic-oriented resource page designed around a user’s intent and purpose:
Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.
Google Has Evolved and Content Marketing With It
Google does not work only the way it used to work, and as a result, this impacts a lot of websites built a certain way to rank high in Google – and Google is a lot less forgiving these days.
Is the user going to be more satisfied with an exact match query on a low-quality website, OR a high-quality page closely related to the search query used, published by an entity Google trusts and rates highly?
Google is deciding more and more to go with the latter.
Optimisation, in 2018, must not get in the way of the text or user experience.
Do not optimise for irrelevant keyword searches.
Do not keyword stuff.
The Importance of Unique Content For Your Website
QUOTE: “Duplicated content is often not manipulative and is commonplace on many websites and often free from malicious intent. Copied content can often be penalised algorithmically or manually. Duplicate content is not penalised, but this is often not an optimal set-up for pages, either. Be VERY careful ‘spinning’ ‘copied’ text to make it unique!” Hobo, 2018
From a quality page point of view, duplicate content (or rather, copied content) can be a low-quality indicator.
Boilerplate (especially spun) text can be another low-quality indicator.
If your website is tarnished with these practices – it is going to be classed ‘low-quality’ by some part of the Google algorithm:
- If all you have on your page are indicators of low-quality – you have a low-quality page in 2018 – full stop.
- If your entire website is made up of pages like that, you have a low-quality website.
- If you have manipulative backlinks, then that’s a recipe for disaster.
What Are The High-Quality Characteristics of a Web Page?
The following are examples of what Google calls ‘high-quality characteristics’ of a page and should be remembered:
- “A satisfying or comprehensive amount of very high-quality” main content (MC)
- Copyright notifications up to date
- Functional page design
- Page author has Topical Authority
- High-Quality Main Content
- Positive Reputation or expertise of website or author (Google yourself)
- Very helpful SUPPLEMENTARY content “which improves the user experience.“
- Google wants to reward ‘expertise’ and ‘everyday expertise’ or experience so make this clear (Author Box?)
- Accurate information
- Ads can be at the top of your page as long as they do not distract from the main content on the page
- Highly satisfying website contact information
- Customised and very helpful 404 error pages
- Evidence of expertise
- Attention to detail
If Google can detect investment in time and labour on your site – there are indications that they will reward you for this (or at least – you won’t be affected when others are, meaning you rise in Google SERPs when others fall).
What Characteristics Do The Highest Quality Pages Exhibit?
You obviously want the highest quality ‘score’ but looking at the guide that is a lot of work to achieve.
Google wants to rate you on the effort you put into your website, and how satisfying a visit is to your pages.
- QUOTE: “Very high or highest quality MC, with demonstrated expertise, talent, and/or skill.“
- QUOTE: “Very high level of expertise, authoritativeness, and trustworthiness (page and website) on the topic of the page.”
- QUOTE: “Very good reputation (website or author) on the topic of the page.”
At least for competitive niches where Google intends to police this quality recommendation, Google wants to reward high-quality pages and “the Highest rating may be justified for pages with a satisfying or comprehensive amount of very high-quality” main content.
If your main content is very poor, with “grammar, spelling, capitalization, and punctuation errors“, or not helpful or trustworthy – ANYTHING that can be interpreted as a bad user experience – you can expect to get a low rating.
QUOTE: “We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low-quality (main content) do not achieve their purpose well.”
Note – not ALL thin pages are low-quality.
If you can satisfy the user with a page thin on content – you are ok (but probably susceptible to someone building a better page than yours, more easily, I’d say).
This is a good article about long clicks.
Google expects more from big brands than they do from your store (but that does not mean you shouldn’t be aiming to meet ALL these high-quality guidelines above).
If you violate Google Webmaster recommendations for performance in their indexes of the web – you automatically get a low-quality rating.
Poor page design and poor main content and too many ads = you are toast.
If a rater is subject to a sneaky redirect – they are instructed to rate your site low.
What Are The Low-Quality Signals Google Looks For?
These include but are not limited to:
- Lots of spammy comments
- Low-quality content that lacks E-A-T signals (Expertise, Authoritativeness, Trustworthiness)
- NO Added Value for users
- Poor page design
- Malicious harmful or deceptive practices detected
- Negative reputation
- Auto-generated content
- No website contact information
- Fakery or INACCURATE information
- Website not maintained
- Pages just created to link to others
- Pages lack purpose
- Keyword stuffing
- Inadequate customer service pages
- Sites that use practices Google doesn’t want you to use
Pages can get a neutral rating too.
Pages that have “Nothing wrong, but nothing special” about them don’t “display characteristics associated with a High rating”, which puts you in the middle ground – probably not a sensible place to be a year or so down the line.
Pages Can Be Rated ‘Medium Quality’
Quality raters will give content a Medium rating when the author or entity responsible for it is unknown.
If you have multiple editors contributing to your site, you had better have a HIGH EDITORIAL STANDARD.
One could take from all this that Google Quality raters are out to get you if you manage to get past the algorithms, but equally, Google quality raters could be friends you just haven’t met yet.
Somebody must be getting rated highly, right?
Impress a Google Quality rater and get a high rating.
If you are a spammer you’ll be pulling out the stops to fake this, naturally, but this is a chance for real businesses to put their best foot forward and HELP quality raters correctly judge the size and relative quality of your business and website.
Real reputation is hard to fake – so if you have it – make sure it’s on your website and is EASY to access from contact and about pages.
The quality raters handbook is a good training guide for looking for links to disavow, too.
It’s pretty clear.
Google organic listings are reserved for ‘remarkable’ and ‘reputable’ content, expertise and trusted businesses.
A high bar to meet – and one that is designed for you to never quite meet unless you are serious about competing, as there is so much work involved.
I think the inferred message is to call your Adwords rep if you are an unremarkable business.
Long Tail Traffic Is Hiding In Long Form Content
Google didn’t kill the long tail of traffic, though since 2010 they have been reducing the amount of such traffic they will send to certain sites.
In part, they shifted a lot of long tail visitors to pages that Google thought may satisfy their user query, RATHER than just rely on particular exact and near match keywords repeated on a particular page.
At the same time, Google was hitting old school SEO tactics and particularly thin or overlapping pages. So – an obvious strategy and one I took was to identify the thin content on a site and merge it into long-form content and then rework that to bring it all up-to-date.
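To make the merge-thin-content strategy concrete, here is a minimal sketch of how you might flag candidate pages from a crawl. The 300-word threshold, the URLs and the `find_thin_pages` helper are all illustrative assumptions, not Google figures or a real tool.

```python
# Hypothetical sketch: flag 'thin' pages from a crawl so they can be
# reviewed and merged into a single long-form resource.
# The 300-word cut-off is an assumed, illustrative threshold.

def find_thin_pages(pages, min_words=300):
    """pages: dict mapping URL -> extracted main-content text.
    Returns (url, word_count) pairs, thinnest first."""
    thin = {}
    for url, text in pages.items():
        word_count = len(text.split())
        if word_count < min_words:
            thin[url] = word_count
    return sorted(thin.items(), key=lambda item: item[1])

# Example: three stub pages, two of them thin
crawl = {
    "/widgets-red": "Red widgets. " * 20,         # 40 words
    "/widgets-blue": "Blue widgets. " * 30,       # 60 words
    "/widgets-guide": "A detailed guide. " * 150, # 450 words
}
print(find_thin_pages(crawl))
```

In practice you would feed this from a crawler export rather than a hard-coded dict, and review the flagged URLs manually before merging anything.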
Long-form content is a magnet for long tail searches and helps you rank for much more popular (head) keywords. The more searchers and visitors you attract, the more you can ‘satisfy’ and the better chance you can rank higher in the long run.
Do you NEED long-form pages to rank?
No – but it can be very useful as a base to start a content marketing strategy if you are looking to pick up links and social media shares.
And be careful. The longer a page is, the more you can dilute it for a specific keyword phrase, and it’s sometimes a challenge to keep it updated.
Google seems to penalise stale or unsatisfying content.
Google Demotion Algorithms Target Low-Quality Content
Optimising (without improving) low-quality content springs traps set by ever-improving core quality algorithms.
What this means is that ‘optimising’ low-quality pages is very much swimming upstream in 2018.
Optimising low-quality pages without value-add is self-defeating, now that the algorithms – and manual quality rating efforts – have got that stuff nailed down.
If you optimise low-quality pages using old school SEO techniques, you will be hit with a low-quality algorithm (like the Quality Update or Google Panda).
You must avoid boilerplate text, spun text or duplicate content when creating pages – or you are Panda Bamboo – as Google hinted at in the 2015 Quality Rater’s Guide.
QUOTE: “6.1 Low Quality Main Content One of the most important considerations in PQ rating is the quality of the MC. The quality of the MC is determined by how much time, effort, expertise, and talent/skill have gone into the creation of the page. Consider this example: Most students have to write papers for high school or college. Many students take shortcuts to save time and effort by doing one or more of the following:
- Buying papers online or getting someone else to write for them
- Making things up.
- Writing quickly with no drafts or editing.
- Filling the report with large pictures or other distracting content.
- Copying the entire report from an encyclopedia, or paraphrasing content by changing words or sentence structure here and there.
- Using commonly known facts, for example, “Argentina is a country. People live in Argentina. Argentina has borders. Some people like Argentina.”
- Using a lot of words to communicate only basic ideas or facts, for example, “Pandas eat bamboo. Pandas eat a lot of bamboo. It’s the best food for a Panda bear.”
Unfortunately, the content of some webpages is similarly created. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well. Important: Low quality MC is a sufficient reason to give a page a Low quality rating.”
Google rewards uniqueness or punishes the lack of it.
The number 1 way to do ‘SEO copywriting‘ in 2018 will be to edit the actual page copy to continually add unique content and improve its accuracy, uniqueness, relevance, succinctness, and use.
Low-Quality Content Is Not Meant To Rank High in Google.
A Google spokesman said not that long ago that Google Panda was about preventing types of sites that shouldn’t rank for particular keywords from ranking for them.
QUOTE: “(Google Panda) measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages. So essentially, if you want a blunt answer, it will not devalue, it will actually demote. Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.” Gary Illyes – Search Engine Land
When Google demotes your page for duplicate content practices, and there’s nothing left in the way of unique content to continue ranking you for – your web pages will mostly be ignored by Google.
The way I look at it – once Google strips away all the stuff it ignores (duplicate text) – what’s left? In effect, that’s what you can expect Google to reward you for. If what is left is boilerplate synonymised text content – that’s now being classed as web spam – or ‘spinning’.
NOTE – The ratio of duplicate content on any page is going to hurt you if you have more duplicate text than unique content. A simple check of the pages, page to page, on the site is all that’s needed to ensure each page is DIFFERENT (regarding text) page-to-page.
If you have large sections of duplicate text page-to-page – that is a problem that should be targeted and removed.
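That page-to-page check can be roughed out in a few lines. This is a sketch, not how Google measures duplication: it uses Python's `difflib.SequenceMatcher` as a crude similarity score, and the 0.5 threshold for "more duplicate than unique" is an assumption for illustration.

```python
# Hypothetical sketch: rough page-to-page duplicate-text check.
# SequenceMatcher.ratio() returns 0..1 similarity; autojunk is
# disabled because it distorts ratios on longer prose strings.
from difflib import SequenceMatcher
from itertools import combinations

def duplicate_pairs(pages, threshold=0.5):
    """pages: dict of URL -> main-content text.
    Returns (url_a, url_b, ratio) for suspiciously similar pairs."""
    suspects = []
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b, autojunk=False).ratio()
        if ratio >= threshold:
            suspects.append((url_a, url_b, round(ratio, 2)))
    return suspects

boiler = "Welcome to our store. We ship worldwide. Great prices guaranteed. "
pages = {
    "/red": boiler * 5 + "Red widgets are durable.",
    "/blue": boiler * 5 + "Blue widgets are light.",
    "/about": "Our company was founded in 1999 by two engineers in Glasgow.",
}
for pair in duplicate_pairs(pages):
    print(pair)  # only /red vs /blue should be flagged
```

Pairs the check flags are candidates for rewriting the boilerplate into unique main content, or consolidating the pages.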
It is important to note:
- The main text content on the page must be unique to avoid Google’s page quality algorithms.
- Verbose text must NOT be created or spun automatically
- Text should NOT be optimised to a template as this just creates a footprint across many pages that can be interpreted as redundant or manipulative boilerplate text.
- Text should be HIGHLY descriptive, unique and concise
- If you have a lot of pages to address, the main priority is to create a UNIQUE couple of paragraphs of text, at least, for the MC (Main Content). Pages do not need thousands of words to rank. They just need to MEET A SPECIFIC USER INTENT and not TRIP ‘LOW-QUALITY’ FILTERS. A page with just a few sentences of unique text still meets this requirement (150-300 words) – for now.
- When it comes to out-competing competitor pages, you are going to have to look at what the top competing page is doing when it comes to main content text. Chances are – they have some unique text on the page. If they rank with duplicate text, either their SUPPLEMENTARY CONTENT is better, or the competitor domain has more RANKING ABILITY because of either GOOD BACKLINKS or BETTER USER EXPERIENCE.
- Updating content on a site should be a priority as Google rewards fresher content for certain searches.
Are Keywords – and Keyword Research – Dead?
No. Not while we use keywords to communicate with visitors. At this end of the user journey, we still need to build keyword ‘rich’ pages around topic and concepts. Instead of repeating keywords like we used to do to a ‘formula’, we now need to find keywords that make this page relevant for as many searches around a concept as possible.
You still need keywords to build a concept with, but it is now more important to include terms with relationships to the main topic. In short – when someone lands on your page about a topic – are they satisfied with the scope and breadth of the information presented? If users are satisfied with a page regarding how it performs against competing pages, then Google will probably be satisfied.
Keyword research is still going to be important in optimising high-quality pages or planning a content marketing strategy.
My favourite keyword research tool is SEMRush.
Editorial & Informative In-depth Content
From a strategic point of view, if you can explore a topic or concept in an in-depth way you must do it before your competition. Especially if this is one of the only areas you can compete with them on.
Here are some things to remember about creating topic oriented in-depth content:
- In-depth content needs to be kept updated. Every six months, at least. If you can update it a lot more often than that – it should be updated more
- In-depth content can reach tens of thousands of words, but the aim should always be to make the page as concise as possible, over time
- In-depth content can be ‘optimised’ in much the same way as content has always been optimised
- In-depth content can give you authority in your topical niche
- Pages must MEET THEIR PURPOSE WITHOUT DISTRACTING ADS OR CALLS TO ACTION. If you are competing with an information page – put the information FRONT AND CENTRE. Yes – this impacts negatively on conversions in the short term. BUT – these are the pages Google will rank high. That is – pages that help users first and foremost complete WHY they are on the page (what you want them to do once you get them there needs to be of secondary consideration when it comes to Google organic traffic).
- You need to balance conversions with user satisfaction if you want to rank high in Google in 2018.
Here is the bad news about ‘quality content’: even it, including in-depth articles, decays over time.
This is, however, natural.
You cannot expect content to stay relevant and perform at the same levels six months or a year down the line.
Even high-quality content can lose positions to better sites and better, more up-to-date content.
In any competitive niche, you are going to be up against this kind of content warfare – and this is only going to get more competitive.
You must, in 2018, be investing in professionally created website content. You must create and keep updated unique, informative, trustworthy and authoritative editorial content.
Once you have keywords in place, you must focus on improving user experience and conversion optimisation.
How Much Text Do I Need To Write For Google?
How much text do you put on a page to rank for a certain keyword?
Well, as in so much of SEO theory and strategy, there is no optimal amount of text per page, and it is going to differ, based on the topic, and content type, and SERP you are competing in.
Instead of thinking about the quantity of the Main Content (MC) text, you should think more about the quality of the content on the page. Optimise this with searcher intent in mind.
There is no minimum amount of words or text to rank in Google. I have seen pages with 50 words out-rank pages with 100, 250, 500 or 1000 words. Then again I have seen pages with no text rank on nothing but inbound links or other ‘strategy’. Google is a lot better at hiding away those pages in 2018, though.
At the moment, I prefer long-form pages and a lot of text, still focused on a few related keywords and keyphrases to a page. Useful for long tail key phrases and easier to explore related terms.
Every site is different. Some pages, for example, can get away with 50 words because of a good link profile and the domain it is hosted on. For me, the important thing is to make a page relevant to a user’s search query.
I don’t care how many words I achieve this with and often I need to experiment on a site I am unfamiliar with. After a while, you get an idea of how much text you need to use to get a page on a certain domain into Google.
One thing to note – the more text you add to the page, as long as it is unique, keyword rich and relevant to the topic, the more that page will be rewarded with more visitors from Google.
There is no optimal number of words on a page for placement in Google.
Every website – every page – is different from what I can see. Don’t worry too much about word count if your content is original and informative. Google will probably reward you on some level – at some point – if there is lots of unique text on all your pages.
Google said there is no minimum word count when it comes to gauging content quality.
QUOTE: “There’s no minimum length, and there’s no minimum number of articles a day that you have to post, nor even a minimum number of pages on a website. In most cases, quality is better than quantity. Our algorithms explicitly try to find and recommend websites that provide content that’s of high quality, unique, and compelling to users. Don’t fill your site with low-quality content, instead work on making sure that your site is the absolute best of its kind.” John Mueller Google, 2014
However, the quality rater’s guide does state:
6.2 Unsatisfying Amount of Main Content
Some Low quality pages are unsatisfying because they have a small amount of MC for the purpose of the page. For example, imagine an encyclopedia article with just a few paragraphs on a very broad topic such as World War II. Important: An unsatisfying amount of MC is a sufficient reason to give a page a Low quality rating.
How Many Words Can I Optimise A Page For?
This is going to depend on too many things to give you a general number.
You should probably not think like that too much, though. A page should rank for the Head Terms (which are in the Title Tag), and the page copy itself should be optimised for as many other closely related keyword phrases as possible (all part of the natural long tail of the Head Term).
How Often Should Important Keywords Appear On Your Page?
You only need to mention related terms, and sometimes even Head Terms, ONCE in page content to have an impact on relevance and rank for lots of different long-tail key phrases. I still repeat phrases a few times on any page.
A perfect keyword density is a myth, but a demotion for stuffing keywords irresponsibly into text is not.
Keyword Stuffing is clearly against Google guidelines – so I avoid it in 2018.
Instead of thinking how many times to repeat keyword phrases, I’ve had success thinking the opposite over the last few years. That is, how few times can I repeat a primary key phrase on a page to rank it in Google.
It’s not how often to repeat keyword phrases; it is where they are placed throughout the document, and how prominent they are.
What Is Keyword Stuffing?
Keyword stuffing is simply the process of repeating the same keyword or key phrases over and over in a page. It’s counterproductive. It is a signpost of a very low-quality spam site and is something Google clearly recommends you avoid.
QUOTE: “the practice of loading a webpage with keywords in an attempt to manipulate a site’s ranking in Google’s search results“. Google
Keyword stuffing text makes your copy often unreadable and so, a bad user experience. It often gets a page booted out of Google but it depends on the intent and the trust and authority of a site. It’s sloppy SEO in 2018.
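For a rough sense of what "stuffed" copy looks like in numbers, here is a small sketch that counts how often a phrase repeats and what share of the words it accounts for. There is no real 'safe' density figure from Google; the helper and the sample text are purely illustrative.

```python
# Hypothetical sketch: measure how much of a text one phrase accounts
# for. High percentages suggest stuffed, unreadable copy; the metric
# itself is illustrative, not a Google ranking signal.
import re

def phrase_density(text, phrase):
    """Returns (occurrences, percent of total words the phrase covers)."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits, round(100 * hits * n / max(len(words), 1), 1)

stuffed = ("cheap widgets cheap widgets buy cheap widgets "
           "best cheap widgets cheap widgets online")
hits, pct = phrase_density(stuffed, "cheap widgets")
print(hits, pct)  # the phrase dominates this sample almost entirely
```

Copy where a single phrase covers most of the words, as in the sample above, is exactly the "mildly annoying to complete gibberish" range the rater guidelines describe.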
It is not a tactic you want to employ in search of long-term rankings.
Just because someone else is successfully doing it do not automatically think you will get away with it.
Don’t do it – there are better ways of ranking in Google without resorting to it.
I used to rank high for ‘keyword stuffing‘ years ago, albeit with a keyword stuffed page that I created to illustrate you could, at one time, blatantly ignore Google’s rules and, in fact, use them as a blueprint to spam Google.
I don’t keyword stuff in 2018 – Google is a little less forgiving than it used to be.
Google has a lot of manual raters who rate the quality of your pages, and keyword stuffed main content features prominently in the 2017 Quality Raters Guidelines:
7.4.2 “Keyword Stuffed” Main Content
QUOTE: ‘Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.’ Search Quality Raters Guidelines March 14, 2017
Good Content Will Still Need ‘Optimised’
The issue is, original “compelling content” – so easy to create isn’t it(!) – on a site with no links and no audience and no online business authority is as useful as boring, useless content – to Google – and will be treated as such by Google – except for long tail terms (if even).
It usually won’t be found by many people and won’t be READ and won’t be ACTED upon – not without a few good links pointing to the site – NOT if there is any competition for the term.
Generalisations make for excellent link bait and while good, rich content is very important, sayings like ‘content is everything’ are not telling you the whole story.
The fact is – every single site is different, sits in a niche with a different level of competition for every keyword or traffic stream, and needs a strategy to tackle this.
There’s no one size fits all magic button to press to get traffic to a site. Some folk have a lot of domain authority to work with, some know the right people, or have access to an audience already – indeed, all they might need is a copywriter – or indeed, some inspiration for a blog post.
They, however, are in the minority of sites.
Most of the clients I work with have nothing to start with and are in a relatively ‘boring’ niche few reputable blogs write about.
In one respect, Google doesn’t even CARE what content you have on your site (although it’s better these days at hiding this).
Humans do care, of course, so at some point, you will need to produce that content on your pages.
You Can ALWAYS Optimise Content To Perform Better In Google
An SEO can always get more out of content in organic search than any copywriter, but there’s not much more powerful than a copywriter who can lightly optimise a page around a topic, or an expert in a topic that knows how to – continually, over time – optimise a page for high rankings in Google.
If I wanted to rank for “How To Write For Google“? – for instance – in the old days you used to put the key phrase in the normal elements like the Page Title Element and ALT text and then keyword stuffed your text to make sure you repeated “How To Write For Google” enough times in a block of low-quality text.
Using variants and synonyms of this phrase helped to add to the ‘uniqueness’ of the page, of course.
Throwing in any old text would beef the word count up.
Now, in 2018, if I want to rank high in Google for that kind of term – I would still rely on old SEO best practices like a very focused page title – but now the text should explore a topic in a much more informative way.
Writing for Google and meeting the query intent means an SEO copywriter would need to make sure page text included ENTITIES AND CONCEPTS related to the MAIN TOPIC of the page you are writing about and the key phrase you are talking about.
If I wanted a page to rank for this term, I would probably need to explore concepts like Google Hummingbird, Query Substitution, Query Reformation and Semantic Search i.e. I need to explore a topic or concept more fully – and, as time goes on, more succinctly – than competing pages.
If you want to rank for a SPECIFIC search term – you can still do it using the same old, well-practiced keyword targeting practices. The main page content itself just needs to be high-quality enough to satisfy Google’s quality algorithms in the first place.
This is still a land grab.
Should I Rewrite Product Descriptions To Make The Text Unique?
Whatever you do, beware ‘spinning the text’ – Google might have an algorithm or two focused on that sort of thing:
How Does Google Rate ‘Copied’ Main Content?
This is where you are swimming upstream in 2018. Copied content is not a long-term strategy if you aim to create a unique page that is better than your competitors’ pages.
In the latest Google Search Quality Evaluator Guidelines that were published on 14 March 2017, Google states:
7.4.5 Copied Main Content
Every page needs MC. One way to create MC with no time, effort, or expertise is to copy it from another source. Important: We do not consider legitimately licensed or syndicated content to be “copied” (see here for more on web syndication). Examples of syndicated content in the U.S. include news articles by AP or Reuters.
The word “copied” refers to the practice of “scraping” content, or copying content from other nonaffiliated websites without adding any original content or value to users (see here for more information on copied or scraped content).
If all or most of the MC on the page is copied, think about the purpose of the page. Why does the page exist? What value does the page have for users? Why should users look at the page with copied content instead of the original source? Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
7.4.6 More About Copied Content
All of the following are considered copied content:
● Content copied exactly from an identifiable source. Sometimes an entire page is copied, and sometimes just parts of the page are copied. Sometimes multiple pages are copied and then pasted together into a single page. Text that has been copied exactly is usually the easiest type of copied content to identify.
● Content which is copied, but changed slightly from the original. This type of copying makes it difficult to find the exact matching original source. Sometimes just a few words are changed, or whole sentences are changed, or a “find and replace” modification is made, where one word is replaced with another throughout the text. These types of changes are deliberately done to make it difficult to find the original source of the content. We call this kind of content “copied with minimal alteration.”
● Content copied from a changing source, such as a search results page or news feed. You often will not be able to find an exact matching original source if it is a copy of “dynamic” content (content which changes frequently). However, we will still consider this to be copied content. Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
Tip: Beware Keyword Stuffing E-commerce Website Category Pages To Rank For Various Other Keywords in Google
Google’s John Mueller just helped someone out in this week’s Google Webmaster Hangout, and his answer was very interesting:
QUOTE: “The site was ranking the first page for the keyword (widget) and (widgets) in Australia since two weeks we moved all the way down to page five. Technical changes haven’t been made to the site the only modification was we added more category landing text to rank for various other (keywords)“
QUOTE: “the modification that you mentioned (above) that you put more category landing text on the page that might also be something that’s playing a role there. What I see a lot with e-commerce sites is that they take a category page that’s actually pretty good and they stuff a whole bunch of text on the bottom and that’s essentially just kind of roughly related to that content which is essentially like bigger than the Wikipedia page on that topic and from our point of view when we look at things like that our algorithms kind of quickly kind of back off and say whoa it looks like someone is just trying to use keyword stuffing to include a bunch of kind of unrelated content into the same page and then our algorithms might be a bit more critical and kind of like be cautious with regards to the content that we find on this page so that’s one thing to kind of watch out for.
I think it’s good to / help provide more context to things that you have on your website but kind of be reasonable and think about what users would actually use and focus on that kind of content so for example if if the bottom of these pages is just a collection of keywords and a collection of sentences where those keywords are artificially used then probably users aren’t going to scroll to the bottom and read all of that tiny text and actually use that content in a useful way and then probably search engines are also going to back off and say well this page is is doing some crazy stuff here we don’t really know how much we can trust the content on the page.”
If you are keyword stuffing e-commerce category pages, watch out. Google tells us these things for a reason. Adding optimised text to e-commerce category pages ‘just for the sake of it’ is probably going to work against you (and might be working against you today).
Keyword stuffing has been against the rules for a long time.
John previously stated back during 2016:
QUOTE: “if we see that things like keyword stuffing are happening on a page, then we’ll try to ignore that, and just focus on the rest of the page”.
Google has algorithms AND human reviewers looking out for keyword stuffing when the maths misses it:
7.4.2 “Keyword Stuffed” Main Content
QUOTE: ‘Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.’ Search Quality Raters Guidelines March 14, 2017
While there is obviously a balance to be had in this area, Google classes keyword stuffing as adding ‘irrelevant keywords‘ to your site. There are warnings also about this age-old SEO technique in the general webmaster guidelines:
General Guidelines: Irrelevant Keywords
QUOTE: “Keyword stuffing” refers to the practice of loading a webpage with keywords or numbers in an attempt to manipulate a site’s ranking in Google search results. Often these keywords appear in a list or group, or out of context (not as natural prose). Filling pages with keywords or numbers results in a negative user experience, and can harm your site’s ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.
Examples of keyword stuffing include:
- Lists of phone numbers without substantial added value
- Blocks of text listing cities and states a webpage is trying to rank for
- Repeating the same words or phrases so often that it sounds unnatural, for example: We sell custom cigar humidors. Our custom cigar humidors are handmade. If you’re thinking of buying a custom cigar humidor, please contact our custom cigar humidor specialists at firstname.lastname@example.org.
Can I Just Write Naturally and Rank High in Google?
Yes, you must write naturally (and succinctly) in 2018, but if you have no idea of the keywords you are targeting, and no expertise in the topic, you will be left behind those who can draw on that experience.
You can just ‘write naturally’ and still rank, albeit for fewer keywords than you would have if you optimised the page.
There are too many competing pages targeting the top spots not to optimise your content.
Naturally, how much text you need to write, how much you need to work into it, and where you ultimately rank, is going to depend on the domain reputation of the site you are publishing the article on.
Do You Need Lots of Text To Rank Pages In Google?
User search intent is a way marketers describe what a user wants to accomplish when they perform a Google search.
SEOs have understood user search intent to fall broadly into the following categories and there is an excellent post on Moz about this.
- Transactional – The user wants to do something like buy, signup, register to complete a task they have in mind.
- Informational – The user wishes to learn something
- Navigational – The user knows where they are going
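As a minimal illustration of those three buckets, here is a naive rule-based classifier. The keyword lists are illustrative guesses of my own, not anything Google has published, and real intent classification is far more sophisticated:

```python
# Naive sketch of the three search intent categories described above.
# The keyword sets are hypothetical examples, not from Google.
TRANSACTIONAL_WORDS = {"buy", "price", "order", "signup", "register", "cheap"}
NAVIGATIONAL_WORDS = {"login", "facebook", "youtube", "hobo"}  # brand/site names

def classify_intent(query: str) -> str:
    """Bucket a query into transactional, navigational or informational."""
    words = set(query.lower().split())
    if words & TRANSACTIONAL_WORDS:
        return "transactional"   # the user wants to complete a task
    if words & NAVIGATIONAL_WORDS:
        return "navigational"    # the user knows where they are going
    return "informational"       # default: the user wants to learn something

print(classify_intent("buy cigar humidor"))        # transactional
print(classify_intent("how to write a sentence"))  # informational
```

Even this toy version shows why a page aimed at one intent rarely satisfies another: the signals in the query itself point at very different user goals.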
The Google human quality rater guidelines modify these to simpler constructs:
SO – how do you optimise for all this?
You could rely on old-school SEO techniques, but Google doesn’t like thin pages in 2018, and you need higher quality unnatural links to power low-quality sites these days. That is all a risky investment.
Google has successfully made that way forward a minefield for smaller businesses in 2018.
A safer route, with a guaranteed ROI for a real business that can’t risk spamming Google, is to focus on satisfying the user satisfaction signals Google might be rating favourably.
The ways a user searches are numerous and ambiguous, and this is an advantage for the page that balances this out better than competing pages in the SERPs.
High-quality copywriting is not an easy ‘ask’ for every business, but it is a tremendous leveler.
Anyone can teach what they know and put it on a website if the will is there.
Some understand the ranking benefits of in-depth, curated content, for instance, that helps a user learn something. In-depth pages or long-form content is a magnet for long-tail key phrases.
High-quality text content of any nature is going to do well, in time, and copywriters should rejoice.
Copywriting has never been more important.
Offering high-quality content is a great place to start on your site.
It’s easy for Google to analyse and rate, and it is also a sufficient barrier to entry for most competitors (at least, it was in the last few years).
Google is looking for high-quality content:
“High quality pages and websites need enough expertise to be authoritative and trustworthy on their topic.”
..or if you want it another way, Google’s algorithms target low-quality content.
But what if you can’t write to satisfy these KNOW satisfaction metrics?
Luckily – you do not need lots of text to rank in Google.
When a user is actively seeking your page out and selects your page in the SERP, they are probably training Google AI to understand this is a page on a site that satisfies the user intent. This user behaviour is where traditional media and social media promotion is going to be valuable if you can get people to search your site out. This is one reason you should have a short, memorable domain if you can get one.
So, users should be using Google to seek your site out.
‘Do’ Beats ‘Know.’
If you can’t display E.A.T. in your writing, you can still rank if you satisfy users who do search that query.
Last year I observed Google rank a page with 50 words of text on it instead of a page with 5000 words and lots of unique images that target the same term on the same domain.
While there might be something at fault with the ‘optimised’ 5000-word page I have overlooked, the main difference between the two pages was time spent on the page and task completion ‘rate’.
I’ve witnessed Google flip pages on the same domain for many reasons, but it did get me thinking that perhaps Google thinks users are more satisfied with the DO page (an online tool), with its better task completion metrics, than the KNOW page (a traditional informational page).
In the end, I don’t need to know why Google is flipping the page, just that it is.
So that means that you don’t always need ‘text-heavy content’ to rank for a term.
You never have of course.
I only offer one example I’ve witnessed Google picking the DO page over the KNOW page, and it surprised me when it did.
It has evidently surprised others too.
There is a post on Searchmetrics that touches on pages with only a little text-content ranking high in Google:
QUOTE: “From a classical SEO perspective, these rankings can hardly be explained. There is only one possible explanation: user intent. If someone is searching for “how to write a sentence” and finds a game such as this, then the user intention is fulfilled. Also the type of content (interactive game) has a well above average time-on-site.” SearchMetrics 2016
That’s exactly what I think I have observed, too, though I wouldn’t ever say there is only ‘one possible explanation‘ to anything to do with Google.
For instance – perhaps other pages on the site help the page with no content rank, but when it comes to users being satisfied, Google shows the page with better usage statistics instead, because it thinks it is a win for everyone involved.
This is speculation, of course, and I have witnessed Google flipping pages in SERPs if they have a problem with one of them, for instance, for years.
This was news in January 2016, but I saw it myself some time before that. It isn’t entirely new in the wild, but it might be noticeable in more niches in 2018.
How Much Text Do You Need To Rank?
None, evidently, if you can satisfy the query in an unusual manner without the text.
Are Poor Spelling and Bad Grammar Google Ranking Factors?
Is Grammar A Ranking Factor?
NO – this is evidently NOT a ranking signal. I’ve been blogging for nine years and most complaints I’ve had in that time have been about my poor grammar and spelling in my posts.
My spelling and grammar may be atrocious but these shortcomings haven’t stopped me ranking lots of pages over the years.
Google historically has looked for ‘exact match’ instances of keyword phrases on documents, and SEOs have, historically, been able to optimise successfully for these keyword phrases – whether they are grammatically correct or not.
So how could bad grammar carry negative weight in Google’s algorithms?
That being said, I do have Grammarly, a spelling and grammar checking plugin installed on my browser to help me catch the obvious mistakes.
Advice From Google
John Mueller from Google said in a recent hangout that grammar was ‘not really’ a positive ranking factor, though it was ‘possible‘, if very ‘niche‘. Bear in mind – most of Google’s algorithms (we think) demote or de-rank content once it is analysed – they do not necessarily promote it, not unless users prefer it.
Another video I found, in which a Google spokesman discusses poor grammar as a ranking factor or page quality signal, is from a few years ago.
In this video, we are told, by Google, that grammar is NOT a ranking factor.
Not, at least, one of the 200+ quality signals Google uses to rank pages.
And that rings true, I think.
Google’s Matt Cutts did say though:
QUOTE: “It would be fair to use it as a signal…The more reputable pages do tend to have better grammar and better spelling. ” Matt Cutts, Google
Google Panda & Content Quality
Google is on record as saying (metaphorically speaking) their algorithms are looking for signals of low quality when it comes to rating pages on Content Quality.
Some possible examples could include:
QUOTE: “1. Does this article have spelling, stylistic, or factual errors?”
QUOTE: “2. Was the article edited well, or does it appear sloppy or hastily produced?”
QUOTE: “3. Are the pages produced with great care and attention to detail vs. less attention to detail?”
QUOTE: “4. Would you expect to see this article in a printed magazine, encyclopedia or book?”
Altogether – Google is rating content on overall user experience as it defines and rates it, and bad grammar and spelling equal a poor user experience.
At least on some occasions.
Google aims to ensure that organic search engine marketing is a significant investment in time and budget for businesses. Critics will say this is to make Adwords a more attractive proposition.
Google aims to reward quality signals that:
- take time to build and
- the vast majority of sites will not, or cannot meet without a lot of investment.
NO website in a competitive market gets valuable traffic from Google in 2018 without a lot of work. Technical work and content curation.
It’s an interesting aside.
Fixing the grammar and spelling on a page can be a time-consuming process.
It’s clearly a machine-readable and detectable – although potentially noisy – signal, and Google IS banging on about Main Content (MC) quality and user experience.
Grammar as a ranking factor could be one for the future – but at the moment, I doubt grammar is taken much into account (on an algorithmic level, at least, although users might not like your grammar, and that could have a second-order impact if it causes high abandonment rates, for instance).
Is Spelling A Google Ranking Factor?
Poor spelling has always had the potential to be a NEGATIVE ranking factor in Google, if the word that is misspelt on the page is unique on the page and of critical importance to the search query.
Although – back in the day – if you wanted to rank for misspellings, you optimised for them – so poor spelling could be a POSITIVE ranking factor, looking back not that long ago.
Now, that kind of optimisation effort is fruitless, with changes to how Google presents these results in 2018.
Google will favour “Showing results for” results over presenting SERPs based on a common spelling error.
Testing to see if ‘bad spelling’ is a ranking factor is still easy on a granular level, bad grammar is not so easy to test.
I think Google has better signals to play with than ranking pages on spelling and grammar. It’s not likely to penalise you for the honest mistakes most pages exhibit, especially if you have met more important quality signals – like useful main content.
And I’ve seen clear evidence of pages ranking very well with both bad spelling and bad grammar. My own!
I still have Grammarly installed, though.
How Do You Improve Web Content in 2018?
This is no longer about repeating keywords. ANYTHING you do to improve the page is going to be a potential SEO benefit. That could be:
- Fixing poor grammar and spelling mistakes
- Adding synonyms to text
- Reducing keyword stuffing
- Reducing the ratio of duplicated text on your page to unique text
- Removing old outdated links or out-of-date content
- Re-Wording sentences to take out sales or marketing fluff and focusing more on the USER INTENT (e.g. give them the facts first including pros and cons – for instance – through reviews) and purpose of the page.
- Merging many old pages into one, fresh page
- Conciseness, while still maximising relevance and keyword coverage
- Keyword phrase prominence throughout the copy (you can have too much, or too little, and it is going to take testing to find out what the optimal presentation will be)
- Topic modeling
A great writer can get away with fluff but the rest of us probably should focus on being concise.
Low-quality fluff is easily discounted by Google these days – and can leave a toxic footprint on a website.
Google Featured Snippets: Optimising For The New Opportunity
QUOTE: “When a user asks a question in Google Search, we might show a search result in a special featured snippet block at the top of the search results page. This featured snippet block includes a summary of the answer, extracted from a webpage, plus a link to the page, the page title and URL” Google 2018
Any content strategy in 2018 should naturally be focused on creating high-quality content, and should also revolve around Google FEATURED SNIPPETS, which trigger when Google wants them to – and intermittently – depending on the nature of the query.
Regarding the above image, where a page on Hobo is promoted to number 1 – I used traditional competitor keyword research and old-school keyword analysis and keyword phrase selection, albeit focused on the opportunity in long-form content, to accomplish that, proving that you can still use this keyword research experience to rank a page in 2018.
Despite all the obfuscation, time delay, keyword rewriting, manual rating and selection bias Google goes through to match pages to keyword queries, you still need to optimise a page to rank in a niche, and if you do it sensibly, you unlock a wealth of long-tail traffic over time (a lot of which is as useless as it always was, but which RankBrain might clean up given time).
- Google is only going to produce more of these direct answers or answer boxes in future (they have been moving in this direction since 2005).
- Focusing on triggering these will focus your content creators on creating exactly the type of pages Google wants to rank. “HOW TO” guides and “WHAT IS” guides are the IDEAL and VERY BEST types of content for this exercise.
- Google is REALLY rewarding these articles in 2018 – and the search engine is VERY probably going to keep doing so for the future.
- Google Knowledge Graph offers another exciting opportunity – and indicates the next stage in organic search.
- Google is producing these ANSWER BOXES that can promote a page from anywhere on the front page of Google to number 1.
- All in-depth content strategy on your site should be focused on this new aspect of Google Optimisation. The bonus is you physically create content that Google is ranking very well in 2018 even without taking knowledge boxes into consideration.
- Basically – you are feeding Google EASY ANSWERS to scrape from your page. This all ties together very nicely with organic link building. The MORE ANSWER BOXES you UNLOCK – the more chance you have of ranking number one FOR MORE AND MORE TERMS – and as a result – more and more people see your utilitarian content and as a result – you get social shares and links if people care at all about it.
- An Enhanced Snippet (or Google Answer Box, as they were first called by SEOs) can be shared: sometimes you are featured, and sometimes it is a competitor URL. All you can do in this case is continue to improve the page until you squeeze your competitor out.
We already know that Google likes ‘tips’ and “how to” and expanded FAQ but this Knowledge Graph ANSWER BOX system provides a real opportunity and is CERTAINLY what any content strategy should be focused around to maximise exposure of your business in organic searches.
Unfortunately, this is a double-edged sword if you take a long-term view. Google is, after all, looking for easy answers so, eventually, it might not need to send visitors to your page.
To be fair, these Google Enhanced Snippets, at the moment, appear complete with a reference link to your page and can positively impact traffic to the page. SO – for the moment – it’s an opportunity to take advantage of.
How Long Does It Take To Write Seo-Friendly Copy?
Embarrassingly long, in some cases. I am not a professionally trained copywriter, though; I know how to rank pages, and the type of content it takes. The truth is, in some verticals, you are writing, from scratch, for ONE PAGE (as it needs to be unique), the same amount of text that would have optimised an entire 10,000-page website just a few years ago. Why? Because your competition will do it, and Google will send them traffic.
I do not see many people pointing that out.
Ranking in Google organic listings is an investment in search engine optimisation AND, at least, one other multi-discipline (of which, there are many); creative copywriting and teaching, or design or engineering, for instance. Copywriting, if you are competing in an informational SERP; Teaching, if you are trying to meet Google’s E.A.T. requirements; design and engineering if you are focusing on satisfying specific user intent (e.g. offering tools, instead of content, for users).
These costs are, in 2018, relatively prohibitive for smaller businesses (who can always choose Google Adwords), but Google is evidently giving any individual, rather than just the business, the opportunity to rank in certain niches – any individual who has the drive, knowledge and time to invest in some machine-detectable, time-spent endeavour that satisfies users better than competing pages.
Businesses can still, of course, compete.
Note the importance of expertise in a chosen subject matter:
6.3 Lacking Expertise, Authoritativeness, or Trustworthiness (E-A-T)
QUOTE: “Some topics demand expertise for the content to be considered trustworthy. YMYL topics such as medical advice, legal advice, financial advice, etc. should come from authoritative sources in those fields. Even everyday topics, such as recipes and housecleaning, should come from those with experience and everyday expertise in order for the page to be trustworthy. You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.”
Content like this will not come cheap in the future.
The first thing to do is always look to your site for content that is under-performing. Low-quality pages should be reworked, redirected or removed. This is normally the first thing we aim to do in any project we are employed in.
Often, content can be repackaged into a more compelling, user-focused topic based article that explores a topic and frequently asked questions about it. This topic page can be optimised with synonyms and related keywords, entities and concepts.
Ranking in Google is not about repeating keywords anymore – in fact – all efforts down that road will only lead to failure.
Ranking high in Google is about exploring a ‘CONCEPT’ on a web page in a way that HELPS INFORM users while still meeting Google’s keyword-based and entity based relevancy signals.
Specific Advice From Google On Low-Quality Content On Your Site
And remember the following, specific advice from Google on removing low-quality content on a domain:
QUOTE: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low-quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low-quality pages to a different domain could eventually help the rankings of your higher-quality content.” Google
They are not lying. Here is another example of taking multiple pages and making one better high-quality page in place of them:
Optimising For Topics And Concepts
Old SEO was, to a large extent, about repeating text. New SEO is about user satisfaction.
Google’s punishment algorithms designed to target SEO are all over that practice these days. And over a lot more, to boot, in 2018.
- Google’s looking for original text on a subject matter that explores the concept that the page is about, rather than meets keyword relevance standards of yesteryear.
- If your page rings these Google bells in your favour, Google knows your page is a good resource on anything to do with a particular concept – and will send visitors to it after invisibly rewriting the actual search query that the user made. Google is obfuscating the entire user intent journey.
For us at the receiving end, it all boils down to writing content that meets a specific user intent and does it better than competing pages.
We are not trying to beat Google or RankBrain, just the competition.
Pages looking to, genuinely, help people are a good user experience. At the page level, satisfying informational search intent is still going to be about keyword analysis at some level.
SEO is about understanding topics and concepts as search engines try to.
A well-optimised topic/concept oriented page that meets high relevance signals cannot really fail to pick up search traffic and, if it’s useful to people, pick up UX signals that will improve rankings in Google (I include links in that).
What Topic Are You Building Machine Identifiable Expertise in?
People don’t know what to blog about, so often, a blog becomes a jumbled mess of un-directed, thin content. You prevent this from happening going forward by knowing what topic you are building out on your website. If everything you publish is related, you are probably building authority in that niche.
If all you are building is content that is immediately out of date and found on other more authoritative websites, you are toast. Keep that stuff for social media, or minimise its footprint on your blog.
EXAMPLE: You could write 30 unrelated blog posts. Once they were all published, you could combine them into a list that was relevant to a topic or entity. What if these 30 seemingly unconnected posts, once combined, gave you the 30 things Tom Cruise has spent his money on? All of a sudden, this content becomes relevant in an entirely different way than it was when it was packaged separately, in an unconnected and uninteresting way.
That is a wild example. I am just trying to illustrate that you don’t need to build in-depth content in one go unless you have a team of ghostwriters. Success is often built upon a lot of small edits to pages over time.
If you haven’t been building a topical identity on your site, you should start now.
Topic & Concept Optimisation
This is how I categorise things (and have done for many years).
- Is your website about “topic”?
- Does your business operate within this “topic”?
- Are particular pages on your site about ‘sub-topics’ of the main ‘topic’?
- Do the pages explain the ‘sub-topic’ in detail, including ‘concepts’ within the main sub-topic area?
- Does your content explain concepts in a way that demonstrates experience or expertise?
- Does the page satisfy the user intent better than competing pages?
- Do multiple pages demonstrate expertise or experience in in-depth ways?
- Does this content have independent links from other ‘topic’ related authorities and entities?
That kind of ‘old-SEO’ still works fine.
Google is measuring how satisfied users are with the pages on your website, in comparison with competing pages in the SERP.
Google might still be interested in the reputation of individual authors.
Google Authorship (a push by Google to identify and highlight expert authors using special markup code in your HTML) used to be useful in a number of ways, not limited to achieving increased visibility in SERPs for your writing.
Google Authorship pictures were completely removed from the search results in the UK in June 2014 – but I still have my redundant Google Authorship markup in place. SEOs expect authorship analysis in some shape or form to be impactful in Google rankings (it is in Google In-depth Articles, we have been told).
I imagine Google has moved beyond this now in web search, but it is probably still interested in identifying expert authors. We know it uses, at least, manual raters to evaluate E.A.T. (Expertise, Authoritativeness and Trustworthiness).
There are many signals that no doubt help Google work this out in 2018, perhaps through an author’s semantic association with certain entities and topics.
Optimising For User Intent Satisfaction
Brand Searches Imply Trust & Quality
A patent granted to Google tells us explicitly what features it was looking for in a site that might seem to indicate that the site was a quality site.
It tells us:
The score is determined from quantities indicating user actions of seeking out and preferring particular sites and the resources found in particular sites. A site quality score for a particular site can be determined by computing a ratio of a numerator that represents user interest in the site as reflected in user queries directed to the site (e.g. ‘Hobo’) and a denominator that represents user interest in the resources found in the site as responses to queries of all kinds (e.g. ‘SEO tutorial’). The site quality score for a site can be used as a signal to rank resources, or to rank search results that identify resources, that are found in one site relative to resources found in another site.
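As a rough illustration of the ratio the patent describes, here is a minimal sketch. The function name and the numbers are hypothetical; the patent does not reveal how Google actually counts or weights these quantities:

```python
def site_quality_score(branded_query_count: int,
                       resource_interest_count: int) -> float:
    """Illustrative version of the patent's ratio: user interest in the
    site itself (queries seeking the site out, e.g. 'Hobo') divided by
    user interest in the site's resources across queries of all kinds
    (e.g. 'SEO tutorial'). A hypothetical simplification, not Google's
    actual implementation."""
    if resource_interest_count == 0:
        return 0.0
    return branded_query_count / resource_interest_count

# e.g. 120 searches seeking the brand out vs 4,000 instances of interest
# in the site's pages from generic queries:
print(site_quality_score(120, 4000))
```

The intuition matches the article's point: the more users actively seek your site out by name, relative to how often they merely stumble on its pages, the stronger this quality signal would be.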
A forward-thinking strategy must take this into consideration. This understanding of how Google is moving to rank pages in future is not something that is going away anytime soon.
Note, also, that even if the above patent is in operation, or something like it, SERP modification will still likely be influenced by geographic location, to give you only one example.
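The ratio the patent describes can be sketched as a toy calculation. Everything below is an illustrative assumption on my part – the function name, the raw query counts, even the idea that simple counts are used at all – the patent does not disclose how Google actually computes this.

```python
# Toy sketch of the site quality ratio described in the patent.
# All names and numbers here are illustrative assumptions, not
# Google's actual signals or implementation.

def site_quality_score(branded_query_count: int, generic_query_count: int) -> float:
    """Numerator: user interest in the site itself (e.g. searches for 'Hobo').
    Denominator: user interest in the site's resources via queries of all
    kinds (e.g. 'SEO tutorial' searches that surfaced the site's pages)."""
    if generic_query_count == 0:
        return 0.0  # no generic demand observed; nothing to score against
    return branded_query_count / generic_query_count

# A site sought out by name 120 times against 3,000 generic queries that
# returned its pages scores far higher than one sought out only 5 times.
print(site_quality_score(120, 3000))  # 0.04
print(site_quality_score(5, 3000))
```

The takeaway the sketch illustrates: two sites can rank for the same generic queries, but the one users actively seek out by name gets the higher score.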
- GOAL – Meet “user actions of seeking out and preferring particular sites” – Have a brandable website, or section of the website, that would suit this type of search.
- GOAL – Meet “resources found in particular sites” – Generate content that meets this goal. In-depth content that will meet keyword requirements. Tools that help perform a user requirement, keeping in mind this will be judged “relative to resources found in another site” e.g. your competition and their competing pages.
- GOAL – Create secondary content that transforms gracefully to other media, with an intent that informs the user about your service or product. Videos, animations, infographics and other relevant media.
- GOAL – Improve brand signal from a technical SEO point of view. Optimisation and integration tasks with Google properties, for example (e.g. Google Local Business).
- GOAL – Improve user experience. Optimise the search experience.
- GOAL – Conversion Optimisation. Critical to try and convert more sales from the current traffic.
Optimising For The Long Click
When it comes to rating user satisfaction, there are a few theories doing the rounds at the moment that I think are sensible. Google could be tracking user satisfaction by proxy. When a user uses Google to search for something, user behaviour from that point on can be a proxy for the relevance and relative quality of the actual SERP.
This was recently leaked from Google itself:
“So when search was invented, like when Google was invented many years ago, they wrote heuristics that had figure out what the relationship between a search and the best page for that search was. And those heuristics worked pretty well and continue to work pretty well.”
“But Google is now integrating machine learning into that process. So then training models on when someone clicks on a page and stays on that page, when they go back or when they are trying to figure out exactly on that relationship. So search is getting better and better and better because of advances in machine learning.” Head Google Brain Toronto/Canada via SERoundtable
What is a Long Click?
A user clicks a result and spends time on it, sometimes terminating the search.
What is a Short Click?
A user clicks a result and bounces back to the SERP, pogo-sticking between other results until a long click is observed. Google has this information if it wants to use it as a proxy for query satisfaction.
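The long-click/short-click distinction can be illustrated with a toy classifier. The 30-second threshold, the field names and the data structure are purely my assumptions for demonstration; Google has never published what dwell-time cut-offs, if any, it uses.

```python
from dataclasses import dataclass

# Toy illustration of long clicks vs short clicks. The threshold and
# field names are assumptions for demonstration only; Google has not
# published how (or whether) it measures dwell time in ranking.

@dataclass
class Click:
    url: str
    dwell_seconds: float     # time on the page after leaving the SERP
    returned_to_serp: bool   # did the user pogo-stick back to the results?

def classify(click: Click, threshold: float = 30.0) -> str:
    """A short click bounces back to the SERP quickly; a long click
    stays on the result, sometimes terminating the search."""
    if click.returned_to_serp and click.dwell_seconds < threshold:
        return "short click"
    return "long click"

session = [
    Click("example.com/result-1", 4.0, True),     # bounced within seconds
    Click("example.com/result-2", 210.0, False),  # stayed; search ended here
]
print([classify(c) for c in session])  # ['short click', 'long click']
```

The point of the sketch: a result that ends the pogo-sticking looks, by proxy, like the result that satisfied the query.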
For more on this, I recommend this article on the time to long click.
Optimise Supplementary Content on the Page
Once you have the content, you need to think about supplementary content and secondary links that help users on their journey of discovery.
That content CAN be links to your own content on other pages, but if you are really helping a user understand a topic, you should be LINKING OUT to other helpful resources e.g. other websites.
A website that does not link out to ANY other website could accurately be interpreted as being, at the least, self-serving. I can’t think of a website that is the true end-point of the web.
- TASK – On informational pages, LINK OUT to related pages on other sites AND on other pages on your own website where RELEVANT
- TASK – For e-commerce pages, ADD RELATED PRODUCTS.
- TASK – Create In-depth Content Pieces
- TASK – Keep Content Up to Date, Minimise Ads, Maximise Conversion, Monitor For broken, or redirected links
- TASK – Assign in-depth content to an author with some online authority, or someone with displayable expertise on the subject
- TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all this together into a single topic-centred page that helps a user understand something related to what you sell.
Does Google Prefer Fresh Content?
We know that Google likes ‘fresh content’ if the query deserves fresh results.
Updating content on your site may telegraph to Google signals that the site is being maintained, and have second-order benefits.
Telling users when website copy was last edited is probably a ‘good user experience’, for instance.
Click Metrics and Task Completion Rate are LIKELY Ranking Modifiers
Are people actively seeking your website out, and when they do land there, are they satisfied?
e.g. using the Hobo site as an example: is anyone looking for “Hobo SEO tutorial”, and when they find it, are they completing the task? Are they satisfied? Do they hang around? Do they share it? Is the page fit for purpose? Or do they immediately go back to Google and click on another result?
If there is a high satisfaction rate when people search for ‘hobo SEO tutorial’ compared to averages for ‘SEO tutorial’, then perhaps Google would be confident of actually showing that page in the mix for the actual term ‘SEO tutorial’ e.g. you would not need links. If there is a low satisfaction rate for such a keyword, then the page can be deemed ‘low-quality’ when compared to better-performing pages on competing sites.
When you satisfy a query in terms of quality and click-through satisfaction – or make Google look good – you can actually ‘win’ branded recognition for said valuable term.
We had the following brand highlight in the UK (for a few months, after our own content strategy hit the front page of Reddit last year).
I used to think PageRank was like having a ticket to the party. It still might be, on some level, but at some level keywords on the page play a similar role. Keywords used to be everything even a few years ago – but that has been at least obfuscated.
We tested this recently and were quite surprised by the difference e.g. you can’t simply add new words to a high-authority page and expect it to rank just because you have ‘domain authority’ – not if there are a lot of higher-quality pages targeting the term.
Site quality scores are one thing, but these site scores are tied to a KEYWORD query at EVERY TURN (or they are supposed to be). If you mention a SPECIFIC KEYWORD in a multi-keyword search, the HIGHEST-QUALITY page for that SPECIFIC KEYWORD will often be returned, even with an irrelevant word thrown in (but not two or more, apparently, as I noted in recent tests). Google is built on previous incarnations of how it mapped the original web e.g. for the long tail, you still NEED keywords on a page. Ergo, the more relevant UNIQUE KEYWORDS (not key phrases) on a page, the more long-tail searches it WILL appear for.
e.g. you can have a very high-quality site, but a site with no proper keyword usage throughout won’t rank for any long-tail searches – and long-tail searches are how you are found in Google if you have a new site.
- TASK – [BRAND] + [PRIMARY KEYWORD] GOALS
- PROJECT GOAL – Get customers to search for your ‘brand’ and the ‘primary keyword’.
- PROJECT TASK – Build an internal information-based page optimised for the ‘primary keyword’, with as many unique words on it as possible. Cite references. Have an ‘Author’ with a link to a bio that illustrates E.A.T. (Expertise, Authority and Trust). Have ‘Last Updated’ on the page (specific to that page). Keep it updated.
- PROJECT GOAL – Keep pages targeting the primary keyword to a minimum on the site. Only high-quality pages should target the primary keyword.
- PROJECT GOAL – Get legitimate reviews. Google Local Business reviews are a safe bet. Do not ASK for positive reviews on any third-party review site… in any obvious manner.
- PROJECT TASK – Determine primary keywords to target with resources on site
Low-Quality Content On Part of A Web Site Can Affect Rankings For The Same Website On More Important Keyword Rankings
Moz has a good video on the Google organic quality score theory. You should watch it. It goes into a lot of stuff I (and others) have been blogging for the last few years, and some of it is relevant to the audits I produce, an example of which you can see here (Alert! 2mb in size!).
One thing that could have been explained better in the video was that Moz has topical authority worldwide for ‘Google SEO’ terms, hence why they can rank so easily for ‘organic quality score’.
But the explanation of the quality score is a good introduction for beginners.
I am in the camp that believes this organic quality score has been in place for a long time, and that more and more sites are feeling the results of it.
This is also quite relevant to a question answered last week in the Google Webmaster Hangout which was:
“QUESTION – Is it possible that if the algorithm doesn’t particularly like our blog articles as much that it could affect our ranking and quality score on the core Content?”
resulting in an answer:
“ANSWER: JOHN MUELLER (GOOGLE): Theoretically, that’s possible. I mean it’s kind of like we look at your web site overall. And if there’s this big chunk of content here or this big chunk kind of important wise of your content, there that looks really iffy, then that kind of reflects across the overall picture of your website. But I don’t know in your case, if it’s really a situation that your blog is really terrible.”
Google has introduced (at least) a ‘perceived’ risk to publishing lots of lower-quality pages on your site, in an effort to curb production of old-style SEO-friendly content based on manipulating early search engine algorithms.
We are dealing with algorithms designed to target old-style SEO – which focused on the truism that DOMAIN ‘REPUTATION’ plus LOTS of PAGES equals LOTS of keywords equals LOTS of Google traffic.
A big site can’t just get away with publishing LOTS of lower-quality content in the cavalier way it used to – not without the ‘fear’ of primary content being impacted and organic search traffic to important pages on the site being throttled.
Google is very probably using user metrics in some way to determine the ‘quality’ of your site.
QUESTION – “I mean, would you recommend going back through articles that we posted and if there’s ones that we don’t necessarily think are great articles, that we just take them away and delete them?”
The reply was:
JOHN MUELLER: I think that’s always an option. Yeah. That’s something that – I’ve seen sites do that across the board, not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.
Deleting content is not always the optimal way to handle MANY types of low-quality content – far from it, in fact. Nuking it is the last option, unless the pages really are ‘dead’ content.
Any clean-up should go hand in hand with giving Google something it is going to value on your site e.g. NEW high-quality content:
The final piece of advice is interesting, too.
It gives us an insight into how Google might actually deal with your site:
JOHN MUELLER: “Then maybe that’s something where you can collect some metrics and say, well, everything that’s below this threshold, we’ll make a decision whether or not to significantly improve it or just get rid of it.”
You can probably rely on Google to ‘collect some metrics and say, well, everything that’s below this threshold, we’ll…’ (insert punishment spread out over time).
Google probably has a quality score of some sort, and your site probably has a rating, whatever that rating is based on (and if you get any real traffic from Google, often a manual rating too).
If you have a big site, certain parts of your site will be rated more useful than others to Google.
Improving the quality of your content certainly works to improve traffic, as does intelligently managing your content across the site. Positive results from this process are NOT going to happen overnight. I’ve blogged about this sort of thing for many years, now.
Google is getting better at rating sites that meet their guidelines for ‘quality’ and ‘user satisfaction’ – I am putting such things in quotes here to highlight the slightly Orwellian doublespeak we have to work with.
Google is policing their SERPs.
Put simply, Google’s views on ‘site quality’ and ‘user satisfaction’ do NOT automatically correlate to you getting more traffic.
This endeavor is supposed to be a benchmark – a baseline to start from (when it comes to keywords with financial value).
Everybody, in time, is supposed to hit this baseline to expect to have a chance to rank – and for the short to medium term, this is where the opportunity can be found for those who take it.
If you don’t do it, someone else will, and Google will rank them, in time, above you.
Google has many human quality raters rating your offering, as well as algorithms targeting old style SEO techniques and engineers specifically looking for sites that do not meet technical guidelines.
Does Google Promote A Site Or Demote Others In Rankings?
In the video above you hear from at least one spam fighter that would confirm that at least some people are employed at Google to demote sites that fail to meet policy:
QUOTE: “I didn’t SEO at all, when I was at Google. I wasn’t trying to make a site much better but i was trying to find sites that were not ‘implementing Google policies'(?*) and not giving the best user experience.” (I can’t quite make out what he says there)
Link algorithms seem particularly aggressive, too, and more ‘delayed’ now than ever (for normal businesses built on old-school links), so businesses are getting it from multiple angles as Google rates the quality of its primary index (which may well sit on top of the old stuff and be heavily influenced by ‘quality’ signals on your site).
The ‘Quality Metric’
QUOTE: “Another problem we were having was an issue with quality and this was particularly bad (we think of it as around 2008 2009 to 2011) we were getting lots of complaints about low-quality content and they were right. We were seeing the same low-quality thing but our relevance metrics kept going up and that’s because the low-quality pages can be very relevant this is basically the definition of a content farm in our vision of the world so we thought we were doing great our numbers were saying we were doing great and we were delivering a terrible user experience and turned out we weren’t measuring what we needed to so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality it’s not the same as relevance …. and it enabled us to develop quality related signals separate from relevance signals and really improve them independently so when the metrics missed something what ranking engineers need to do is fix the rating guidelines… or develop new metrics.” (SMX West 2016 – How Google Works: A Google Ranking Engineer’s Story, via Hobo Google Ranking Signals 2017)
Google is on record as saying they had a quality problem that they started fixing with Panda.
That means that many of the sites that were getting a lot of traffic from Google in 2011 were not going to be rated ‘high quality’ as the new quality rating was designed to demote these sites.
In the video quoted above, a Google spokesperson even describes what they call a ‘quality metric’ – quality signals separated from relevance signals.
These sites used common SEO practices to rank, and Google made those practices toxic.
The fun started when these algorithms rolled out more significantly to the real business world, and it continues today with the ever-present (often erroneously confirmed) Google ‘quality’ updates we now constantly hear about in the media.
One thing is for sure, you need to get low-quality content OFF your site and get some UNIQUE content on it to maximise the benefit a few months down the line from *any* other marketing activity (like link building).
From my testing, the performance of pages degrades over time, but can come back with regular, sensible updates and refreshes of the content.
Retargeting keywords can also have a positive impact over time. One marked observation among the top-performing pages on my test site is that pages that are updated more often (and improved substantially) tend to get more traffic than pages that are not.
A sensible strategy in 2018 is to:
- Get rid of all substandard pages on the site
- Meet Google’s technical recommendations
- Fix the high-priority issues from the rater guidelines that attract low to medium quality ratings
- Go in-depth on your topic in certain areas
- Consolidate content and link equity where prudent with redirects (not canonicals – use these only as a patch for a few months)
- Start producing some sort of actual unique content
- Market that new content more appropriately going forward than in the past
Most of this stuff I cover in my SEO audits.
What Does Google Classify As Doorway Pages?
Google classes many types of pages as doorway pages. Doorway pages can be thought of as navigation structure and content on a site that is designed more for Google to navigate and index than for a user to navigate to from within your own site. That’s my take on it, in simple language.
Over the years, I’ve played about a LOT with this kind of multiple location-based ranking and getting websites to rank with location pages. That is – how to optimise location based pages on a site not to look like doorway pages. If you need specific guidance, see my search engine optimisation consultancy service.
It’s actually a very interesting aspect of modern SEO, and one that is constantly shifting. In the recent past, location-based SERPs were often lower quality, and so Google historically sometimes ranked location-based doorway pages in such circumstances, giving you as an SEO a chance to rank pages.
So there is sometimes a grey area in certain instances – and still is – for real businesses who THINK they SHOULD rank for specific locations where they are not geographically based.
Ultimately, the quality of your ‘doorway location pages’ should be high, with ‘value-add’ in 2018, and that is often difficult to achieve for the long term with pages designed around just a basic list of regions. You should have a good reason to rank for these location keyword terms.
What Google Says About Doorway Pages
Google said a few years ago:
QUOTE: “For example, searchers might get a list of results that all go to the same site. So if a user clicks on one result, doesn’t like it, and then tries the next result in the search results page and is taken to that same site that they didn’t like, that’s a really frustrating experience.”
A question about using content spread across multiple pages and targeting different geographic locations on the same site was asked in a recent Hangout with Google’s John Mueller.
QUOTE: “We are a health services comparison website…… so you can imagine that for the majority of those pages the content that will be presented in terms of the clinics that will be listed looking fairly similar right and the same I think holds true if you look at it from the location …… we’re conscious that this causes some kind of content duplication so the question is is this type … to worry about? “
Bearing in mind that (while it is not the optimal use of pages) Google does not ‘penalise’ a website for duplicating content across internal pages in a non-malicious way, John’s clarification of location-based pages on a site targeting different regions is worth noting:
QUOTE: “For the mostpart it should be fine I think the the tricky part that you need to be careful about is more around doorway pages in the sense that if all of these pages end up with the same business then that can look a lot like a doorway page but like just focusing on the content duplication part that’s something that for the most part is fine what will happen there is will index all of these pages separately because from from a kind of holistic point of view these pages are unique they have unique content on them they might have like chunks of text on them which are duplicated but on their own these pages are unique so we’ll index them separately and in the search results when someone is searching for something generic and we don’t know which of these pages are the best ones we’ll pick one of these pages and show that to the user and filter out the other variations of that that page so for example if someone in Ireland is just looking for dental bridges and you have a bunch of different pages for different kind of clinics that offer the service and probably will pick one of those pages and show those in the search results and filter out the other ones.
But essentially the idea there is that this is a good representative of the the content from your website and that’s all that we would show to users on the other hand if someone is specifically looking for let’s say dental bridges in Dublin then we’d be able to show the appropriate clinic that you have on your website that matches that a little bit better so we’d know dental bridges is something that you have a lot on your website and Dublin is something that’s unique to this specific page so we’d be able to pull that out and to show that to the user like that so from a pure content duplication point of view that’s not really something I totally worry about.
I think it makes sense to have unique content as much as possible on these pages but it’s not not going to like sync the whole website if you don’t do that we don’t penalize a website for having this kind of deep duplicate content and kind of going back to the first thing though with regards to doorway pages that is something I definitely look into to make sure that you’re not running into that so in particular if this is like all going to the same clinic and you’re creating all of these different landing pages that are essentially just funneling everyone to the same clinic then that could be seen as a doorway page or a set of doorway pages on our side and it could happen that the web spam team looks at that and says this is this is not okay you’re just trying to rank for all of these different variations of the keywords and the pages themselves are essentially all the same and they might go there and say we need to take a manual action and remove all these pages from search so that’s kind of one thing to watch out for in the sense that if they are all going to the same clinic then probably it makes sense to create some kind of a summary page instead whereas if these are going to two different businesses then of course that’s kind of a different situation it’s not it’s not a doorway page situation.”
The takeaway here is that if you have LOTS of location pages serving ONE SINGLE business in one location, then those are very probably classed as some sort of doorway pages, and probably old-school SEO techniques for these type of pages will see them classed as lower-quality – or even – spammy pages.
CAVEAT – Google has harped on about Doorway pages for a long time. It’s common to hear something about doorway pages. But people still make them, because, either:
- their business model depends on it for lead generation
- the alternative is either a lot of work or
- they are not creative enough or
- they are not experienced enough to avoid the pitfalls of having lower-quality doorway pages on a site or
- or they are experienced enough to understand what impact they might be having on a site quality score, and do it anyway
Google also has a doorway page algorithm. They said (back in 2015):
QUOTE: “Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.” Google
If you have location pages that serve multiple locations or businesses, then those are not doorway pages and should be improved uniquely to rank better, according to John’s advice.
Are You Making Doorway Pages?
Search Engine Land offered us this from Google:
QUOTE: “How do you know if your web pages are classified as a “doorway page?” Google said ask yourself these questions:
- Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site’s user experience?
- Are the pages intended to rank on generic terms yet the content presented on the page is very specific?
- Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
- Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
- Do these pages exist as an “island?” Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?”
Barry Schwartz at Seroundtable picked up:
QUOTE: “Well, a doorway page would be if you have a large collection of pages where you’re just like tweaking the keywords on those pages for that.
I think if you focus on like a clear purpose for the page that’s outside of just I want to rank for this specific variation of the keyword then that’s that’s usually something that leads to a reasonable result.
Whereas if you’re just taking a list of keywords and saying I need to make pages for each of these keywords and each of the permutations that might be for like two or three of those keywords then that’s just creating pages for the sake of keywords which is essentially what we look at as a doorway.”
Note I underlined the following statement:
QUOTE: “focus on like a clear purpose for the page that’s outside of just I want to rank for this specific variation of the keyword.”
That is because sometimes – often, in fact – there is an alternative to doorway pages for location pages that achieves essentially the same thing.
There is confusion, with good reason. People want to rank for stuff with their website, and Google doesn’t really want people to rank for stuff that is done en-masse, especially focused on producing A LOT of pages from a list of keyword variations. People create doorway pages without even properly understanding what they represent to Google.
WIKIPEDIA says of doorway pages:
QUOTE: “Doorway pages are web pages that are created for spamdexing. This is for spamming the index of a search engine by inserting results for particular phrases with the purpose of sending visitors to a different page.”
“Spamdexing, which is a word derived from “spam” and “indexing,” refers to the practice of search engine spamming. It is a form of SEO spamming.”
“Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.
Here are some examples of doorways:
- Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
- Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
- Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy”
“Doorways are sites or pages created to rank highly for specific search queries”
Well, every optimised page on a website falls under that first statement, although the bullet points clarify some of the page types Google is actually talking about e.g. location pages.
It is not just location pages that are classed as doorway pages:
QUOTE: “For Google, that’s probably overdoing it and ends up in a situation you basically create a doorway site …. with pages of low value…. that target one specific query.” John Mueller 2018
One could surmise that the more doorway-type pages on your site, the more annoyed Google will be, if the resulting pages are indeed built like this. If Google is annoyed, then any quality score(s) it has for your site (rankings for your main keywords, for instance, or the number of indexed location pages) may very well register it negatively, IF your pages are lower-quality doorways e.g. users don’t like them, or they are made using old SEO techniques (more and more labelled as spam in 2018).
If you are making keyword-rich location pages for a single business website, there’s a risk these pages will be classed as doorway pages in 2018. If you definitely have VERY low-quality doorway pages on your site, you should remove them or rethink your strategy for this aspect of ranking in Google for the long term. Location-based pages are suitable for some kinds of websites, and not others.
Conversion Rate Optimisation in All Areas
However you are trying to satisfy users, many think this is about terminating searches on (or via) your site, or satisfying the long click.
However you do that, do it in an ethical manner (e.g. without breaking the back button on browsers); the main aim is to satisfy that user somehow.
You used to rank by being a virtual PageRank black hole. Now, you need to think about being a User black hole.
You want a user to click your result in Google, and not need to go back to Google to do the same search that ends with the user pogo-sticking to another result, apparently unsatisfied with your page.
The aim is to convert users into subscribers, returning visitors, sharing partners, paying customers or even just help them along in their way to learn something.
The success I have had in ranking pages and getting more traffic has largely revolved around optimising the technical framework of a site, crawl and indexing efficiency, removal of outdated content, content re-shaping, constant improvement of text content to meet its purpose better, internal links to relevant content, and conversion optimisation, i.e. getting users to ‘stick around’ – or at least visit where I recommend they visit.
Mostly, I’ve focused on satisfying user intent, because Google isn’t going back on that.
You don’t need only to stick to one topic area on a website. That is a myth.
If you create high-quality pieces of informative content on your website page-to-page, you will rank.
The problem is that not many people are polymaths, and this will be reflected in blog posts that end up too thin to satisfy users (and, in time, Google), or e-commerce sites that sell everything and have specialty and experience in little of it.
The only focus with any certainty in 2018 is whatever you do, stay high-quality with content, and avoid creating doorway pages.
For some sites, that will mean reducing pages on many topics to a few that can be focused on so that you can start to build authority in that subject area.
Your website is an entity. You are an entity. Explore concepts. Don’t repeat stuff. Be succinct.
You are what keywords are on your pages.
You rank as a result of others rating your writing.
Avoid toxic visitors. A page must meet its purpose well, without manipulation. Do people stay and interact with your page or do they go back to Google and click on other results? A page should be explicit in its purpose and focus on the user.
The number 1 ‘user experience’ signal you can manipulate with low risk is improving content until it is more useful or better presented than is found on competing pages for variously related keyword phrases.