If you have an optimised platform on which to publish it, high-quality content is the number 1 user experience area to focus on across websites to earn traffic from Google in 2016.
Content quality is also the number one area to focus on if you are to avoid Google Panda demotion. Panda, we have been told, is part of Google’s core site quality rating algorithms in 2016.
Whatever that means, of course.
This article aims to cover the most significant challenges of writing ‘seo-friendly’ text and web page copy in 2016.
I discuss what’s in play, regarding what types of pages Google is ranking, and the types of activity around this content that helps it rank in the first place.
Focus On The User
It is time to focus on the user when it comes to content marketing, and the bottom line is you need to publish unique content free from any low-quality signals.
Those in every organisation with responsibility for adding content to a website should understand these fundamental aspects of satisfying web content, because the door is firmly closing on unsatisfying web content.
Low-quality content can severely impact the success of SEO in 2016.
SEO copywriting is a bit of a dirty word – but text on a page still requires optimising, using familiar methods, albeit published in a markedly different way than we used to get away with.
Optimise For User Intent & Satisfaction
When it comes to writing SEO-friendly text for Google, we must optimise for user intent, not simply what a user typed into Google.
Google will send people looking for information on a topic to the highest quality, relevant pages it has in its database, often BEFORE it relies on how Google ‘used’ to work e.g. finding near or exact-match instances of a keyword phrase on any one page.
Google is constantly evolving to better understand the context and intent of user behaviour, and it doesn’t mind rewriting the query to serve users high-quality pages that comprehensively deliver on user satisfaction e.g. pages that explore topics and concepts in a unique and satisfying way.
Things, Not Strings
Google is better at working out what a page is about, and what it should be about to satisfy the intent of a searcher, and it isn’t relying only on keyword phrases on a page to do that anymore.
Google has a Knowledge Graph populated with NAMED ENTITIES and in certain circumstances, Google relies on such information to create SERPs (Search Engine Results Pages).
Google has plenty of options when rewriting the query in a contextual way, based on what you searched for previously, who you are, how you searched and where you are at the time of the search.
Google Is Not Going To Rank Low-Quality Pages When It Has Better Options
I was working this out long before I understood it well enough to write anything about it.
One approach that worked for me was taking a standard page that did not rank for years and turning it into a topic-oriented resource page designed around a user’s intent.
Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.
Google Is Evolving and Content Marketing With It
Google does not work only the way it used to work, and as a result, this impacts a lot of websites built a certain way to rank high in Google – and Google is a lot less forgiving these days.
Is the user going to be more satisfied with an exact match query on a low-quality website, OR a high-quality page closely related to the search query used, published by an entity Google trusts?
Google is deciding more and more to go with the latter.
Optimisation, in 2016, must not get in the way of the text or user experience.
Do not optimise for irrelevant keyword searches.
Do not keyword stuff.
The Importance of Unique Content
From a quality page point of view, duplicate content is a low-quality indicator.
Boilerplate text is another low-quality indicator.
If your website is tarnished with these practices – it is going to be classed ‘low-quality’ by some part of the Google algorithm:
- If all you have on your page are indicators of low-quality – you have a low-quality page in 2016 – full stop.
- If your entire website is made up of pages like that, you have a low-quality website.
- If you have manipulative backlinks, then that’s a recipe for disaster.
Long Tail Traffic Is Hiding In Long Form Content
Google didn’t kill the long tail.
In part, it shifted a lot of long tail visitors to pages that it thought might satisfy their query, RATHER than just relying on particular exact and near-match keywords repeated on a particular page.
At the same time, Google was hitting old school SEO tactics and particularly thin or overlapping pages. So – an obvious strategy and one I took was to identify the thin content on a site and merge it into long form content and then rework that to bring it all up-to-date.
Long-form content is a magnet for long-tail searches. The more searchers and visitors you attract, the more you can ‘satisfy’ and the better chance you have of ranking higher in the long run.
Do you NEED long-form pages to rank?
No – but it can be very useful as a base to start a content marketing strategy if you are looking to pick up links and social media shares.
Google Algorithms Target Low-Quality Content
Optimising (without improving) low-quality content springs traps set by ever-improving core quality algorithms.
What this means is that ‘optimising’ low-quality pages is very much swimming upstream in 2016.
Optimising low-quality pages without value-add is self-defeating, now that the algorithms – and manual quality rating efforts – have got that stuff nailed down.
If you optimise low-quality pages using old school SEO techniques, you will be hit with a low-quality algorithm (like the Quality Update or Google Panda).
You must avoid boilerplate text, spun text or duplicate content when creating pages – or you are Panda Bamboo – as Google hints at in the 2015 Quality Rater’s Guide.
“6.1 Low Quality Main Content

One of the most important considerations in PQ rating is the quality of the MC. The quality of the MC is determined by how much time, effort, expertise, and talent/skill have gone into the creation of the page. Consider this example: Most students have to write papers for high school or college. Many students take shortcuts to save time and effort by doing one or more of the following:
- Buying papers online or getting someone else to write for them
- Making things up.
- Writing quickly with no drafts or editing.
- Filling the report with large pictures or other distracting content.
- Copying the entire report from an encyclopedia, or paraphrasing content by changing words or sentence structure here and there.
- Using commonly known facts, for example, “Argentina is a country. People live in Argentina. Argentina has borders. Some people like Argentina.”
- Using a lot of words to communicate only basic ideas or facts, for example, “Pandas eat bamboo. Pandas eat a lot of bamboo. It’s the best food for a Panda bear.”
Unfortunately, the content of some webpages is similarly created. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well. Important: Low quality MC is a sufficient reason to give a page a Low quality rating.”
Google rewards uniqueness or punishes the lack of it.
The number 1 way to do ‘SEO copywriting‘ in 2016 will be to edit the actual page copy to continually improve upon its accuracy, uniqueness, relevance, succinctness and usefulness.
Low-Quality Content Is Not Meant To Rank High in Google
A Google spokesman said not that long ago that Google Panda was about preventing types of sites that shouldn’t rank for particular keywords from ranking for them.
When Google demotes your page for duplicate content practices, and there’s nothing left in the way of unique content to rank you for – your web pages will mostly be ignored by Google.
The way I look at it – once Google strips away all the stuff it ignores (duplicate text) – what’s left? In effect, that’s what you can expect Google to reward you for. If what is left is boilerplate synonymised text content – that’s now being classed as web spam – or ‘spinning’.
NOTE – The ratio of duplicate content on any page is going to hurt you if you have more duplicate text than original. A simple check of the pages, page to page, on the site is all that’s needed to ensure each page is DIFFERENT (regarding text) page-to-page.
If you have large sections of duplicate text page-to-page – that is a problem that should be targeted and removed.
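That page-to-page check can be scripted. Below is a minimal sketch of my own (not any official tool) that uses Python’s standard difflib to estimate how much text two pages share and flag suspect pairs – the 50% threshold is an illustrative assumption, not a Google figure:

```python
from difflib import SequenceMatcher

def duplicate_ratio(page_a: str, page_b: str) -> float:
    """Return the fraction of page_a's text that also appears,
    in order, in page_b (0.0 = fully unique, 1.0 = identical)."""
    matcher = SequenceMatcher(None, page_a, page_b)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / max(len(page_a), 1)

def flag_duplicates(pages: dict[str, str],
                    threshold: float = 0.5) -> list[tuple[str, str, float]]:
    """Compare every pair of pages and flag those sharing more
    text than the threshold allows."""
    flagged = []
    urls = sorted(pages)
    for i, a in enumerate(urls):
        for b in urls[i + 1:]:
            ratio = duplicate_ratio(pages[a], pages[b])
            if ratio >= threshold:
                flagged.append((a, b, round(ratio, 2)))
    return flagged
```

Feed it a dict of URL to main content text (extracted however you crawl) and review any flagged pairs for boilerplate to remove or merge.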
It is important to note:
- The main text content on the page must be unique to avoid Google’s page quality algorithms.
- Verbose text must NOT be created or spun automatically.
- Text should NOT be optimised to a template, as this just creates a footprint across many pages that can be interpreted as redundant or manipulative boilerplate text.
- Text should be HIGHLY descriptive, unique and concise.
- If you have a lot of pages to address, the main priority is to create a UNIQUE couple of paragraphs of text, at least, for the MC (Main Content). Pages do not need thousands of words to rank. They just need to MEET A SPECIFIC USER INTENT and not TRIP ‘LOW-QUALITY’ FILTERS. A page with just a few sentences of unique text (150-300 words) still meets this requirement – for now.
- When it comes to outcompeting competitor pages, you are going to have to look at what the top competing page is doing when it comes to main content text. Chances are – they have some unique text on the page. If they rank with duplicate text, either their SUPPLEMENTARY CONTENT is better, or the competitor domain has more RANKING ABILITY because of either GOOD BACKLINKS or BETTER USER EXPERIENCE.
- Updating content across a site should be a priority as Google rewards fresher content for certain searches.
Are Keywords – and Keyword Research – Dead?
No. Not while we use keywords to communicate with visitors. At this end of the user journey, we still need to build keyword-rich pages around topics and concepts. Instead of repeating keywords to a ‘formula’ like we used to, we now need to find keywords that make a page relevant for as many searches around a concept as possible.
You still need keywords to build a concept with, but it is now more important to include terms with relationships to the main topic. In short – when someone lands on your page about a topic – are they satisfied with the scope and breadth of the information presented? If users are satisfied with a page compared with how competing pages perform, then Google will probably be satisfied too.
Keyword research is still going to be important in optimising high-quality pages or planning a content marketing strategy.
Editorial & Informative In-depth Content
From a strategic point of view, if you can explore a topic or concept in an in-depth way you must do it before your competition. Especially if this is one of the only areas you can compete with them on.
Here are some things to remember about creating topic oriented in-depth content:
- In-depth content needs to be kept updated – every six months, at least. If you can update it more often than that, you should.
- In-depth content can reach tens of thousands of words, but the aim should always be to make the page as concise as possible, over time.
- In-depth content can be ‘optimised’ in much the same way as content has always been optimised
- In-depth content can give you authority in your topical niche
- Pages must MEET THEIR PURPOSE WITHOUT DISTRACTING ADS OR CALLS TO ACTION. If you are competing with an information page – put the information FRONT AND CENTRE. Yes – this impacts negatively on conversions in the short term. BUT – these are the pages Google will rank high. That is – pages that first and foremost help users complete WHY they are on the page (what you want them to do once you get them there needs to be a secondary consideration when it comes to Google organic traffic).
- You need to balance conversions with user satisfaction unless you don’t want to rank high in Google in 2016.
Here is the bad news about ‘quality content’. Even it, in-depth articles included, decays over time.
This is, however, natural.
You cannot expect content to stay relevant and perform at the same levels six months or a year down the line.
Even high-quality content can lose positions to better sites and better, more up-to-date content.
In any competitive niche, you are going to be up against this kind of content warfare – and this is only going to get more competitive.
In 2016 – you must be investing in your website content. You must create and keep updated unique, informative, trustworthy and authoritative editorial content.
Once you have keywords in place, you must focus on improving user experience and conversion optimisation.
How Much Text Do I Need To Write For Google?
How much text do you put on a page to rank for a certain keyword?
Well, as in so much of SEO theory and strategy, there is no optimal amount of text per page, and it is going to differ based on the topic, the content type, and the SERP you are competing in.
Instead of thinking about the quantity of the Main Content (MC) text, you should think more about the quality of the content on the page. Optimise this with searcher intent in mind.
There is no minimum amount of words or text to rank in Google. I have seen pages with 50 words out-rank pages with 100, 250, 500 or 1000 words. Then again I have seen pages with no text rank on nothing but inbound links or other ‘strategy’. Google is a lot better at hiding away those pages in 2016, though.
At the moment, I prefer long-form pages with a lot of text, still focused on a few related keywords and key phrases per page. They are useful for long-tail key phrases and make it easier to explore related terms.
Every site is different. Some pages, for example, can get away with 50 words because of a good link profile and the domain it is hosted on. For me, the important thing is to make a page relevant to a user’s search query.
I don’t care how many words I achieve this with and often I need to experiment on a site I am unfamiliar with. After a while, you get an idea how much text you need to use to get a page on a certain domain into Google.
One thing to note – the more unique, keyword-rich and relevant text you add to the page, the more that page will be rewarded with visitors from Google.
There is no optimal number of words on a page for placement in Google.
Every website – every page – is different from what I can see. Don’t worry too much about word count if your content is original and informative. Google will probably reward you on some level – at some point – if there is lots of unique text on all your pages.
John Mueller of Google said there is no minimum word count when it comes to gauging content quality.
“There’s no minimum length, and there’s no minimum number of articles a day that you have to post, nor even a minimum number of pages on a website. In most cases, quality is better than quantity. Our algorithms explicitly try to find and recommend websites that provide content that’s of high quality, unique, and compelling to users. Don’t fill your site with low-quality content, instead work on making sure that your site is the absolute best of its kind.” – JOHN MUELLER
However, the quality rater’s guide does state:
“6.2 Unsatisfying Amount of Main Content

Some Low quality pages are unsatisfying because they have a small amount of MC for the purpose of the page. For example, imagine an encyclopedia article with just a few paragraphs on a very broad topic such as World War II. Important: An unsatisfying amount of MC is a sufficient reason to give a page a Low quality rating.”
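If you want to audit a site for pages with an unsatisfying amount of main content, a rough word count check is easy to script. This is a naive sketch using only the Python standard library – the 150-word minimum is an illustrative default of mine, not a Google figure:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, ignoring script/style blocks."""

    def __init__(self):
        super().__init__()
        self.words = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words.extend(data.split())

def word_count(html: str) -> int:
    """Count visible words in an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return len(parser.words)

def flag_thin_pages(pages: dict[str, str], minimum: int = 150) -> list[str]:
    """Return URLs whose visible word count falls below the minimum."""
    return [url for url, html in pages.items() if word_count(html) < minimum]
```

Treat the output as a to-do list for merging or expanding, not as a rule that every page needs padding – remember, quality beats quantity.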
How Many Words Can I Optimise A Page For?
This is going to depend on too many things to give you a general number.
You should probably not think like that too much, though. A page should rank for the Head Terms (which are in the Title Tag), and the page copy itself should be optimised for as many other closely related keyword phrases as possible (all part of the natural long tail of the Head Term).
How Often Should Important Keywords Appear On Your Page?
You only need to mention related terms, and sometimes even Head Terms, ONCE in page content to have an impact on relevance and rank for lots of different long-tail key phrases. I still repeat phrases a few times on any page.
A perfect keyword density is a myth, but keyword stuffing is not.
Keyword Stuffing is clearly against Google guidelines – so I avoid it in 2016.
Instead of thinking how many times to repeat keyword phrases, I’ve had success thinking the opposite over the last few years. That is, how few times can I repeat a primary key phrase on a page to rank it in Google.
It’s not how often to repeat keyword phrases; it is where they are placed throughout the document, and how prominent they are.
What Is Keyword Stuffing?
Keyword stuffing is simply the process of repeating the same keyword or key phrases over and over on a page. It’s counterproductive. It’s a signpost of a very low-quality spam site and is something Google clearly recommends you avoid.
“the practice of loading a webpage with keywords in an attempt to manipulate a site’s ranking in Google’s search results”.
Keyword-stuffed text often makes your copy unreadable and so makes for a bad user experience. It often gets a page booted out of Google, but it depends on the intent and the trust and authority of a site. It’s sloppy SEO in 2016.
It is obviously not a tactic you want to employ in search of long term rankings.
Just because someone else is successfully doing it does not mean you will get away with it.
Don’t do it – there are better ways of ranking in Google without resorting to it.
I used to rank high for ‘keyword stuffing‘ years ago, albeit with a keyword stuffed page that I created to illustrate you could, at one time, blatantly ignore Google’s rules and, in fact, use them as a blueprint to spam Google.
I don’t do that anymore – Google is a little less forgiving than it used to be.
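A ‘perfect’ keyword density is a myth, but a crude density check can still flag copy that repeats a phrase suspiciously often. Here is a rough Python sketch of my own to illustrate the idea – the 5% limit is an arbitrary illustration, not a Google threshold:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Rough percentage of the copy taken up by a keyword phrase:
    (occurrences * words in phrase) / total words * 100."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window over the copy and count exact phrase matches.
    hits = sum(words[i:i + n] == phrase_words
               for i in range(len(words) - n + 1))
    return round(100.0 * hits * n / len(words), 1)

def looks_stuffed(text: str, phrase: str, limit: float = 5.0) -> bool:
    """Flag copy where a single phrase dominates - a crude stuffing check.
    The 5% default is an illustrative assumption, not an official figure."""
    return keyword_density(text, phrase) > limit
```

Run it over your main content for the head term and its close variants; anything that trips the flag is worth a manual read-through, not an automatic rewrite.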
Good Content Still Needs Optimised
The issue is, original “compelling content” – so easy to create isn’t it(!) – on a site with no links, no audience and no online business authority is as useful, to Google, as boring, useless content – and will be treated as such – except for long tail terms (if even).
It usually won’t be FOUND by many people and won’t be READ and won’t be ACTED upon – not without a few good links pointing to the site – NOT if there is any competition for the term.
Generalisations make for excellent link bait and while good, rich content is very important, sayings like ‘content is everything’ are not telling you the whole story.
The fact is – every single site is different, sits in a niche with a different level of competition for every keyword or traffic stream, and needs a strategy to tackle this.
There’s no one size fits all magic button to press to get traffic to a site. Some folk have a lot of domain authority, some know the right people, or have access to an audience already – indeed, all they might need is a copywriter – or indeed, some inspiration for a blog post.
They, however, are in the minority of sites.
Most of the clients I work with have nothing to start with and are in a relatively ‘boring’ niche few reputable blogs write about.
In one respect, Google doesn’t even CARE what content you have on your site (although it’s better these days at hiding this).
Humans do care, of course, so at some point, you will need to produce that content on your pages.
You Can ALWAYS Optimise Content To Perform Better In Google
An SEO can always get more out of content than any copywriter, but there’s not much more powerful than a copywriter who can lightly optimise a page around a topic.
If I wanted to rank for “How To Write For Google“? – for instance – in the old days you used to put the key phrase in the normal elements like the Page Title Element and ALT text and then keyword stuffed your text to make sure you repeated “How To Write For Google” enough times in a block of low-quality text.
Using variants and synonyms of this phrase helped to add to the ‘uniqueness’ of the page, of course.
Throwing in any old text would beef the word count up.
Now, in 2016, if I want to rank high in Google for that kind of term – I would still rely on old SEO best practices like a very focused page title – but now the text should explore a topic in a much more informative way.
Writing for Google and meeting the query intent means an SEO copywriter would need to make sure page text included ENTITIES AND CONCEPTS related to the MAIN TOPIC of the page you are writing about and the key phrase you are talking about.
If I wanted a page to rank for this term, I would probably need to explore concepts like Google Hummingbird, Query Substitution, Query Reformation and Semantic Search i.e. I need to explore a topic or concept more fully – and as time goes on, more succinctly – than competing pages.
If you want to rank for a SPECIFIC search term – you can still do it using the same old, well-practised keyword targeting practices. The main page content itself just needs to be high-quality enough to satisfy Google’s quality algorithms in the first place.
This is still a land grab.
Can I Just Write Naturally and Rank High in Google?
Yes, you must write naturally (and succinctly) in 2016, but if you have no idea of the keywords you are targeting, and no expertise in the topic, you will be left behind those who can access this expertise.
You can just ‘write naturally’ and still rank, albeit for fewer keywords than you would have if you optimised the page.
There are too many competing pages targeting the top spots not to optimise your content.
Naturally, how much text you need to write, how much you need to work into it, and where you ultimately rank, is going to depend on the domain reputation of the site you are publishing the article on.
Do You Need Lots of Text To Rank Pages In Google?
User search intent is a way marketers describe what a user wants to accomplish when they perform a Google search.
SEOs have understood user search intent to fall broadly into the following categories and there is an excellent post on Moz about this.
- Transactional – The user wants to do something like buy, signup, register to complete a task they have in mind.
- Informational – The user wishes to learn something
- Navigational – The user knows where they are going
The Google human quality rater guidelines modify these to simpler constructs:

- Do – the user wants to accomplish a task
- Know – the user wants to learn something
- Go – the user wants to reach a particular site
SO – how do you optimise for all this?
You could rely on old-school SEO techniques, but Google doesn’t like thin pages in 2016, and you need higher-quality unnatural links to power low-quality sites these days. That is a risky investment.
Google has successfully made that way forward a mine-field for smaller businesses in 2016.
A safer route, with a more dependable ROI, for a real business that can’t risk spamming Google, is to focus on the user satisfaction signals Google might be rating favourably.
The ways users search are numerous and ambiguous, and this is an advantage for the page that balances this out better than competing pages in SERPs.
High-quality copywriting is not an easy ‘ask’ for every business, but it is a tremendous leveller.
Anyone can teach what they know and put it on a website if the will is there.
Some understand the ranking benefits of in-depth, curated content, for instance, that helps a user learn something. In-depth pages, or long form content is a magnet for longtail keyphrases.
High-quality text content of any nature is going to do well, in time, and copywriters should rejoice.
Copywriting has never been more important.
Offering high-quality content is a great place to start on your site.
It’s easy for Google to analyse and rate, and it is also a sufficient barrier to entry for most competitors (at least, it was in the last few years).
Google is looking for high-quality content:
“High quality pages and websites need enough expertise to be authoritative and trustworthy on their topic.”
..or if you want it another way, Google’s algorithms target low-quality content.
But what if you can’t write to satisfy these KNOW satisfaction metrics?
Luckily – you do not need lots of text to rank in Google.
When a user is actively seeking your page out and selects your page in the SERP, they are probably training Google AI to understand this is a page on a site that satisfies the user intent. This user behaviour is where traditional media and social media promotion is going to be valuable if you can get people to search your site out. This is one reason you should have a short, memorable domain if you can get one.
So, users should be using Google to seek your site out.
‘Do’ Beats ‘Know.’
If you can’t display E.A.T. in your writing, you can still rank if you satisfy users who do search that query.
Last year I observed Google rank a page with 50 words of text on it over a page with 5000 words and lots of unique images targeting the same term on the same domain.
While there might be something at fault with the ‘optimised’ 5000-word page that I have overlooked, the main difference between the two pages was time spent on the page and task completion ‘rate’.
I’ve witnessed Google flip pages on the same domain for many reasons, but it did get me thinking that perhaps Google thinks users are more satisfied with the DO page (an online tool), with its better task completion metrics, than the KNOW page (a traditional informational page).
In the end, I don’t need to know why Google is flipping the page, just that it is.
So that means that you don’t always need ‘text heavy content’ to rank for a term.
You never have of course.
I only offer one example I’ve witnessed Google picking the DO page over the KNOW page, and it surprised me when it did.
It has evidently surprised others too.
There is a recent post on Searchmetrics that touches on pages with only a little text-content ranking high in Google:
From a classical SEO perspective, these rankings can hardly be explained. There is only one possible explanation: user intent. If someone is searching for “how to write a sentence” and finds a game such as this, then the user intention is fulfilled. Also the type of content (interactive game) has a well above average time-on-site.
That’s exactly what I think I have observed, too, though I wouldn’t ever say there is only ‘one possible explanation‘ to anything to do with Google.
For instance – perhaps other pages on the site help the page with no content rank, but when it comes to users being satisfied, Google shows the page with better usage statistics instead, because it thinks it is a win for everyone involved.
This is speculation, of course, and I have witnessed Google flipping pages in SERPs when it has a problem with one of them, for instance, for years.
This is news in January 2016, but I saw it some time ago. It isn’t entirely ‘new’ to the wild, but it might be more noticeable in more niches in 2016.
How Much Text Do You Need To Rank?
None, evidently, if you can satisfy the query in an unusual manner without the text.
Are Poor Spelling and Bad Grammar Google Ranking Factors?
Is Grammar A Ranking Factor?
I’ve been blogging for nine years and most complaints I’ve had in that time have been about my poor grammar and spelling in my posts.
My spelling and grammar may be atrocious but these shortcomings haven’t stopped me ranking lots of pages over the years.
Google historically has looked for ‘exact match’ instances of keyword phrases on documents, and SEOs have, historically, been able to optimise successfully for these keyword phrases – whether they are grammatically correct or not.
So how could bad grammar carry negative weight in Google’s algorithms?
Advice From Google
The only video I could find of a Google spokesman talking about inadequate grammar as a ranking factor or page quality signal was from a few years ago.
In this video, we are told, by Google, that grammar is NOT a ranking factor.
Not, at least, one of the 200+ quality signals Google uses to rank pages.
And that rings true, I think.
Google’s Matt Cutts did say though:
“It would be fair to use it as a signal…The more reputable pages do tend to have better grammar and better spelling. “
Google Panda & Content Quality
Google is on record as saying (metaphorically speaking) their algorithms are looking for signals of low quality when it comes to rating pages on Content Quality.
Some possible examples could include:
“1. Does this article have spelling, stylistic, or factual errors?”
“2. Was the article edited well, or does it appear sloppy or hastily produced?”
“3. Are the pages produced with great care and attention to detail vs. less attention to detail?”
“4. Would you expect to see this article in a printed magazine, encyclopedia or book?”
Altogether – Google is rating content on overall user experience as it defines and rates it, and bad grammar and spelling equal a poor user experience.
At least on some occasions.
Google aims to ensure organic search engine marketing is a significant investment in time and budget for businesses. Critics will say this is to make AdWords a more attractive proposition.
Google aims to reward quality signals that:
- take time to build and
- the vast majority of sites will not, or cannot meet without a lot of investment.
NO website in a competitive market gets valuable traffic from Google in 2016 without a lot of work. Technical work and content curation.
It’s an interesting aside.
Fixing the grammar and spelling on a page can be a time-consuming process.
It’s clearly a machine-readable and detectable – although potentially noisy – signal.
And Google IS banging on about Primary Content Quality and User Experience.
Who knows? This ranking factor could be one for the future.
Is Spelling A Google Ranking Factor?
Poor spelling has always had the potential to be a NEGATIVE ranking factor in Google. IF the word that is incorrect on the page is unique on the page and of critical importance to the search query.
Although – back in the day – if you wanted to rank for misspellings, you optimised for them – so poor spelling would have been a POSITIVE ranking factor not that long ago.
Now, that kind of optimisation effort is fruitless, with changes to how Google presents these results in 2016.
Google will favour “Showing results for” results over presenting SERPs based on a common spelling error.
Testing to see if bad spelling is a ranking factor is still easy on a granular level; bad grammar is not so easy to test.
I think Google has clearer signals to play with than spelling and grammar. It’s not likely to penalise you for the honest mistakes most pages exhibit, especially if you have met more important quality signals.
And I’ve seen clear evidence of pages ranking very well with both bad spelling AND bad grammar. My own!
How Do You Improve Web Content in 2016?
This is no longer about repeating keywords. ANYTHING you do to improve the page is going to be a potential SEO benefit. That could be:
- Fixing poor grammar and spelling mistakes
- Adding synonyms to text
- Reducing keyword stuffing
- Reducing the ratio of duplicated text on your page to unique text
- Removing old outdated links or out-of-date content
- Re-Wording sentences to take out sales or marketing fluff and focusing more on the USER INTENT (e.g. give them the facts first including pros and cons – for instance – through reviews) and purpose of the page.
- Merging many old pages into one, fresh page
- Conciseness, while still maximising relevance and keyword coverage
- Keyword phrase prominence throughout the copy (you can have too much, or too little, and it will take testing to find out what the optimal presentation is)
- Topic modelling
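One of the bullets above – the ratio of duplicated text to unique text – can be roughly estimated in code. The sketch below is my own illustrative heuristic (3-word shingles shared with other pages on the site), not any algorithm Google has published:

```python
# Rough sketch: estimate how much of a page's text is boilerplate
# duplicated on other pages of the same site.
# Illustrative heuristic only - NOT any algorithm Google has published.

def tokenise(text):
    return text.lower().split()

def duplicate_ratio(page_text, other_pages):
    """Fraction of this page's 3-word shingles that also appear
    verbatim in other pages on the site (0.0 = all unique)."""
    def shingles(words, n=3):
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    page_shingles = shingles(tokenise(page_text))
    if not page_shingles:
        return 0.0

    other_shingles = set()
    for other in other_pages:
        other_shingles |= shingles(tokenise(other))

    return len(page_shingles & other_shingles) / len(page_shingles)

# Invented example text, for illustration only.
page = "our widgets are the best widgets call us today for widgets"
others = ["call us today for widgets and more", "about our company history"]
print(f"{duplicate_ratio(page, others):.2f}")  # higher = more boilerplate
```

A crude measure like this is only useful for spotting pages where templated text swamps the unique copy; it says nothing about quality in itself.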
A great writer can get away with fluff but the rest of us probably should focus on being concise.
Low-quality fluff is easily discounted by Google these days – and can leave a toxic footprint across a website.
Optimising For The New Opportunity
Any content strategy in 2016 should naturally be focused on creating high-quality content, and should also revolve around triggering Google ENHANCED SNIPPETS – which appear when Google wants them to, and intermittently, depending on the nature of the query.
Regarding the above image, where a page on Hobo is promoted to number 1 – I used traditional competitor keyword research and old-school keyword analysis and keyword phrase selection, albeit focused on the opportunity in long-form content, to accomplish that – proof that this keyword research experience still helps rank a page in 2016.
Despite all the obfuscation, time delay, keyword rewriting, manual rating and selection bias Google goes through to match pages to keyword queries, you still need to optimise a page to rank in a niche, and if you do it sensibly, you unlock a wealth of long-tail traffic over time (a lot of which is as useless as it always was, but which RankBrain might clean up, given time).
- Google is only going to produce more of these direct answers or answer boxes in future (they have been moving in this direction since 2005).
- Focusing on triggering these will focus your content creators on creating exactly the type of pages Google wants to rank. “HOW TO” guides and “WHAT IS” guides are IDEAL and the VERY BEST type of content for this exercise.
- In 2016 Google is REALLY rewarding these articles – and it is VERY probably going to keep doing so for the future.
- Google Knowledge Graph offers another exciting opportunity – and indicates the next stage in organic search.
- Google is producing these ANSWER BOXES that can promote a page from anywhere on the front page of Google to number 1.
- All in-depth content strategy on your site should be focused around this new aspect of Google Optimisation. The bonus is you physically create content that Google is ranking very well in 2016 even without taking knowledge boxes into consideration.
- Basically – you are feeding Google EASY ANSWERS to scrape from your page. This all ties together very nicely with organic link building. The MORE ANSWER BOXES you UNLOCK – the more chance you have of ranking number one FOR MORE AND MORE TERMS – and as a result – more and more people see your utilitarian content and as a result – you get social shares and links if people care at all about it.
- You can share an Enhanced Snippet (or Google Answer Box, as SEOs first called them) with competing sites. Sometimes you are featured and sometimes it is a competitor URL. All you can do in this case is continue to improve the page until you squeeze your competitor out.
We already know that Google likes ‘tips’ and “how to” and expanded FAQ but this Knowledge Graph ANSWER BOX system provides a real opportunity and is CERTAINLY what any content strategy should be focused around to maximise exposure of your business in organic searches.
Unfortunately, this is a double-edged sword if you take a long-term view. Google is, after all, looking for easy answers, so, eventually, it might not need to send visitors to your page.
To be fair, these Google Enhanced Snippets, at the moment, appear complete with a reference link to your page and can positively impact traffic to the page. SO – for the moment – it’s an opportunity to take advantage of.
How Long Does It Take To Write Seo-Friendly Copy?
Embarrassingly long, in some cases. I am not a professionally trained copywriter, though; I know how to rank pages, and the type of content it takes. The truth is, in some verticals, you are writing from scratch, for ONE PAGE (as it needs to be unique), the same amount of text that would have optimised an entire 10,000-page website just a few years ago. Why? Because your competition will do it, and Google will send them traffic.
I do not see many people pointing that out.
Ranking in Google organic listings is an investment in search engine optimisation AND, at least, one other discipline (of which there are many): creative copywriting and teaching, or design, or engineering, for instance. Copywriting, if you are competing in an informational SERP; teaching, if you are trying to meet Google’s E.A.T. requirements; design and engineering, if you are focusing on satisfying a specific user intent (e.g. offering tools, instead of content, for users).
These costs in 2016 are relatively prohibitive for smaller businesses (who can always choose Google Adwords), but Google is evidently giving the opportunity to rank in certain niches to any individual – rather than just the business – who has the drive, knowledge and time to invest in some machine-detectable, time-spent endeavour that satisfies users better than competing pages.
Businesses can still, of course, compete.
Note the importance of expertise in a chosen subject matter:
6.3 Lacking Expertise, Authoritativeness, or Trustworthiness (E-A-T)
Some topics demand expertise for the content to be considered trustworthy. YMYL topics such as medical advice, legal advice, financial advice, etc. should come from authoritative sources in those fields. Even everyday topics, such as recipes and housecleaning, should come from those with experience and everyday expertise in order for the page to be trustworthy. You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.
Content like this will not come cheap in the future.
The first thing to do is always look to your site for content that is under-performing. Low-quality pages should be reworked, redirected or removed. This is normally the first thing we aim to do in any project we are employed in.
Often, content can be repackaged into a more compelling, user-focused topic based article that explores a topic and frequently asked questions about it. This topic page can be optimised with synonyms and related keywords, entities and concepts.
Ranking in Google is not about repeating keywords anymore – in fact – all efforts down that road will only lead to failure.
Ranking high in Google is about exploring a ‘CONCEPT’ on a web page in a way that HELPS INFORM users while still meeting Google’s keyword based and entity based relevancy signals.
Quote from Google: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low-quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low-quality pages to a different domain could eventually help the rankings of your higher-quality content.” – GOOGLE
They are not lying. Here is another example of taking multiple pages and making one better page:
Optimising For Topics And Concepts
Old SEO was, to a large extent, about repeating text. New SEO is about user satisfaction.
Google’s punishment algorithms designed to target SEO are all over that practice these days. And over a lot more, to boot, in 2016.
- Google’s looking for original text on a subject matter that explores the concept that the page is about, rather than meets keyword relevance standards of yesteryear.
- If your page rings these Google bells in your favour, Google knows your page is a good resource on anything to do with a particular concept – and will send visitors to it after invisibly rewriting the actual search query that the user made. Google is obfuscating the entire user intent journey.
For us at the receiving end, it all boils down to writing content that meets a specific user intent and doing it better than competing pages.
We are not trying to beat Google or RankBrain, just the competition.
Pages looking to, genuinely, help people are a good user experience. At page level, satisfying informational search intent is still going to be about keyword analysis at some level.
SEO is about understanding topics and concepts as search engines try to.
A well-optimised topic/concept oriented page that meets high relevance signals cannot really fail to pick up search traffic and, if it’s useful to people, pick up UX signals that will improve rankings in Google (I include links in that).
What Topic Are You Building Machine Identifiable Expertise in?
People don’t know what to blog about, so often, a blog becomes a jumbled mess of un-directed, thin content. You prevent this from happening going forward by knowing what topic you are building out on your website. If everything you publish is related, you are probably building authority in that niche.
If all you are building is content that is immediately out of date and found on other more authoritative websites, you are toast. Keep that stuff for social media, or minimise its footprint on your blog.
EXAMPLE: You could write 30 unrelated blog posts. Once they were all published, you could combine them all into a list that was relevant to a topic or entity. What if these 30 seemingly unlinked posts, once combined, gave you the 30 things Tom Cruise has spent his money on? All of a sudden, this content becomes relevant in an entirely different way than it was when it was packaged separately, in an unconnected and uninteresting way.
That is a wild example. I am just trying to illustrate that you don’t build in-depth content in one go unless you have a team of ghostwriters. Success is often built upon a lot of small edits to pages over time.
If you haven’t been building topical identity on your site, you should start now.
Topic & Concept Optimisation
This is how I categorise things (and have done for many years).
- Is your website about “topic”?
- Does your business operate within this “topic”?
- Are particular pages on your site about ‘sub-topics’ of the main ‘topic’?
- Do the pages explain the ‘sub-topic’ in detail, including ‘concepts’ within the main sub-topic area?
- Does your content explain concepts in a way that demonstrates experience or expertise?
- Does the page satisfy the user intent better than competing pages?
- Do multiple pages demonstrate expertise or experience in in-depth ways?
- Does this content have independent links from other ‘topic’ related authorities and entities?
That kind of ‘old-SEO’ still works fine.
Google is measuring how satisfied users are with the pages on your website, in comparison with competing pages in the SERP.
Google might still be interested in the reputation of individual authors.
Google Authorship (a push by Google to identify and highlight expert authors using special markup code in your HTML) used to be useful in a number of ways, not limited to achieving increased visibility in SERPs for your writing.
Google Authorship pictures were completely removed from the search results in the UK in June 2014 – but I still have my redundant Google Authorship markup in place. SEOs expect authorship analysis, in some shape or form, to be impactful in Google rankings (it is in Google In-depth Articles, we have been told).
I imagine Google has moved beyond this now in web search, though it is probably still interested in identifying expert authors. We know it uses, at least, manual raters to evaluate E.A.T. (Expertise, Authority and Trust).
There are many signals that no doubt help Google work this out in 2016, perhaps through an author’s semantic association with certain entities and topics.
Optimising For User Intent Satisfaction
Brand Searches Imply Trust & Quality
A patent granted to Google tells us explicitly what features it was looking for in a site that might seem to indicate that the site was a quality site.
It tells us:
The score is determined from quantities indicating user actions of seeking out and preferring particular sites and the resources found in particular sites. A site quality score for a particular site can be determined by computing a ratio of a numerator that represents user interest in the site as reflected in user queries directed to the site (e.g. ‘Hobo’) and a denominator that represents user interest in the resources found in the site as responses to queries of all kinds (e.g. ‘SEO tutorial’). The site quality score for a site can be used as a signal to rank resources, or to rank search results that identify resources, that are found in one site relative to resources found in another site.
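Read literally, the patent describes a simple ratio. Here is a minimal sketch of it – the function name and the query counts are my own invention, purely to illustrate the numerator and denominator the patent describes:

```python
# Illustrative sketch of the site quality ratio described in the patent:
# user interest in the site itself (e.g. branded queries like 'Hobo')
# divided by user interest in the site's resources via queries of all
# kinds (e.g. 'SEO tutorial'). Names and numbers are invented examples.

def site_quality_score(branded_query_count, all_query_interest_count):
    """Ratio of queries seeking out the site to all queries for which
    the site's resources were of interest."""
    if all_query_interest_count == 0:
        return 0.0
    return branded_query_count / all_query_interest_count

# Hypothetical figures: 1,200 searches for the brand itself, against
# 40,000 queries of all kinds that surfaced the site's pages.
score = site_quality_score(1200, 40000)
print(f"{score:.3f}")
```

The takeaway is directional, not numerical: the more people who actively seek your site out by name, relative to the general traffic your pages attract, the higher such a score would be.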
A forward-thinking strategy must take this into consideration. This understanding of how Google is moving to rank pages is not something that is going away anytime soon.
Note, also, that even if the above patent is in operation, or something like it, SERP modification will still likely be influenced by geographic location, to give you only one example.
- GOAL – Meet “user actions of seeking out and preferring particular sites” – Have a brandable website, or a section of the website, that would suit this type of search.
- GOAL – Meet “resources found in particular sites” – Generate content that meets this goal. In-depth content that will meet keyword requirements. Tools that help perform a user requirement, keeping in mind this will be judged “relative to resources found in another site” e.g. your competition and their competing pages.
- GOAL – Create secondary content that transforms gracefully to other media with an intent that informs the user to your service or product. Videos, Animations, Infographics. Other relevant media.
- GOAL – Improve brand signal from a technical SEO point of view. Optimisation and integration tasks with Google properties, for example (e.g. Google Local Business).
- GOAL – Improve user experience. Optimise the search experience.
- GOAL – Conversion Optimisation. Critical to try and convert more sales from the current traffic.
Optimising For The Long Click
When it comes to rating user satisfaction, there are a few theories doing the rounds at the moment that I think are sensible. Google could be tracking user satisfaction by proxy. When a user uses Google to search for something, user behaviour from that point on can be a proxy of the relevance and relative quality of the actual SERP.
What is a Long Click?
A user clicks a result and spends time on it, sometimes terminating the search.
What is a Short Click?
A user clicks a result and bounces back to the SERP, pogo-sticking between other results until a long click is observed. Google has this information if it wants to use it as a proxy for query satisfaction.
For more on this, I recommend this article on the time to long click.
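The long-click idea can be sketched as a simple proxy computation over click logs. The 30-second threshold and the data shape below are my assumptions for illustration – not anything Google has disclosed:

```python
# Crude sketch: classify clicks from SERP sessions as 'long' or 'short'
# by dwell time, then compute a long-click rate per result URL.
# The 30-second threshold is an arbitrary assumption for illustration.

LONG_CLICK_SECONDS = 30

def long_click_rate(clicks):
    """clicks: list of (url, dwell_seconds) pairs.
    Returns {url: fraction of clicks that were 'long'}."""
    totals, longs = {}, {}
    for url, dwell in clicks:
        totals[url] = totals.get(url, 0) + 1
        if dwell >= LONG_CLICK_SECONDS:
            longs[url] = longs.get(url, 0) + 1
    return {url: longs.get(url, 0) / n for url, n in totals.items()}

# A pogo-sticking session: two quick bounces, then a terminating
# long click on the third result.
session = [("a.com", 4), ("b.com", 7), ("c.com", 180)]
print(long_click_rate(session))
```

In the example session, only the third result would register as satisfying the query; the first two look like short clicks the user bounced straight back from.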
Optimise Supplementary Content on the Page
Once you have the content, you need to think about supplementary content and secondary links that help users on their journey of discovery.
That content CAN be links to your own content on other pages, but if you are really helping a user understand a topic – you should be LINKING OUT to other helpful resources e.g. other websites.
A website that does not link out to ANY other website could accurately be interpreted as, at the least, self-serving. I can’t think of a website that is the true end-point of the web.
- TASK – On informational pages, LINK OUT to related pages on other sites AND on other pages on your own website where RELEVANT
- TASK – For e-commerce pages, ADD RELATED PRODUCTS.
- TASK – Create In-depth Content Pieces
- TASK – Keep Content Up to Date, Minimise Ads, Maximise Conversion, Monitor For broken, or redirected links
- TASK – Assign in-depth content to an author with some online authority, or someone with displayable expertise on the subject
- TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all this together into a single topic-centred page helping a user to understand something related to what you sell.
Does Google Prefer Fresh Content?
We know that Google likes ‘fresh content’ if the query deserves fresh results.
Updating content on your site may telegraph signals to Google that the site is being maintained, and have second-order benefits.
Telling users when website copy was last edited is a probable ‘good user experience’, for instance.
Click Metrics and Task Completion Rate are LIKELY Ranking Modifiers
Are people actively seeking your website out, and when they do land there, are they satisfied?
e.g. using the Hobo site as an example: is anyone looking for “Hobo SEO tutorial”, and when they find it, are they completing the task? Are they satisfied? Do they hang around? Do they share it? Is the page fit for purpose? Or do they immediately go back to Google and click on another result?
If there is a high satisfaction rate when people search for ‘hobo SEO tutorial’ compared to averages for ‘SEO tutorial’, then perhaps Google would be confident of actually showing that page in the mix for the actual term ‘SEO tutorial’ e.g. you would not need links. If there is a low satisfaction rate for such a keyword, then the page can be deemed ‘low-quality’ when compared to better-performing pages on competing sites.
When you satisfy a query in terms of quality and click-through satisfaction – or make Google look good – you can actually ‘win’ branded recognition for said valuable term.
We had the following brand highlight in the UK (for a few months, after our own content strategy hit the front page of Reddit last year).
I used to think PageRank was like having a ticket to the party. It still might be, on some level, but keywords on the page play a similar role. Keywords were everything even a few years ago – but that has been at least obfuscated.
We tested this recently and were quite surprised by the difference e.g. you can’t simply add new words to a high-authority page and expect it to rank just because you have ‘domain authority’ – not if there are a lot of higher-quality pages targeting the term.
Site quality scores are one thing, but these site scores are tied to a KEYWORD query at EVERY TURN (or they are supposed to be). If you mention a SPECIFIC KEYWORD in a multi-keyword search, the HIGHEST QUALITY page for that SPECIFIC KEYWORD will often be returned, even with an irrelevant word thrown in (but not two or more, apparently, as I noted in recent tests). Google is built on previous incarnations of how Google mapped the original web e.g. for long tail, you still NEED keywords on a page; ergo, the more relevant UNIQUE KEYWORDS (not key phrases) on a page, the more long-tail searches it WILL appear for.
e.g. you can have a very high-quality site, but a site without proper keyword usage throughout won’t rank for any long-tail searches – and long-tail searches are how you are found in Google if you have a new site.
- TASK – [BRAND] + [PRIMARY KEYWORD] GOALS
- PROJECT GOAL – Get customers to search for your ‘brand’ and the ‘primary keyword’.
- PROJECT TASK – Build an internal information based page optimised for the ‘primary keyword’, with as many unique words on it as possible. Cite references. Have ‘Author’ with link to Bio that illustrates E.A.T. (Expertise, Authority and trust). Have ‘Last Updated’ on page (specific to page). Keep updated.
- PROJECT GOAL – Keep pages targeting the primary keyword to a minimum on the site. Only high-quality pages should target the primary keyword.
- PROJECT GOAL – Get legitimate reviews. Google Local Business reviews are a safe bet. Do not ASK for positive reviews on any third-party review site… in any obvious manner.
- PROJECT TASK – Determine primary keywords to target with resources on site
Conversion Rate Optimisation in All Areas
However you are trying to satisfy users, many think this is about terminating searches via your site or on your site – satisfying the long click.
However you do that, do it in an ethical manner (e.g. without breaking the back button on browsers); the main aim is to satisfy the user somehow.
You used to rank by being a virtual PageRank black hole. Now, you need to think about being a User black hole.
You want a user to click your result in Google and not need to go back to Google to do the same search – pogo-sticking to another result, apparently unsatisfied with your page.
The aim is to convert users into subscribers, returning visitors, sharing partners, paying customers or even just help them along in their way to learn something.
The success I have had in ranking pages and getting more traffic has largely revolved around optimising the technical framework of a site, crawl and indexing efficiency, removal of outdated content, content re-shaping, constant improvement of text content to meet its purpose better, internal links to relevant content, and conversion optimisation – getting users to ‘stick around’, or at least visit where I recommend they visit.
Mostly – I’ve focused on satisfying user intent because Google isn’t going backwards with that.
You don’t need only to stick to one topic area on a website. That is a myth.
If you create high-quality pieces of informative content on your website page-to-page, you will rank.
The problem is – not many people are polymaths – and this will be reflected in blog posts that end up too thin to satisfy users (and, in time, Google), or in e-commerce sites that sell everything and have speciality and experience in little of it.
The only focus with any certainty in 2016 is whatever you do, stay high-quality with content.
For some sites, that will mean reducing pages on many topics to a few that can be focused on so that you can start to build authority in that subject area.
Your website is an entity. You are an entity. Explore concepts. Don’t repeat stuff. Be succinct.
You are what keywords are on your pages.
You rank as a result of others rating your writing.
Avoid toxic visitors. A page must meet its purpose well, without manipulation. Do people stay and interact with your page or do they go back to Google and click on other results? A page should be explicit in its purpose and focus on the user.
The number 1 ‘user experience’ signal you can manipulate with low risk is improving content until it is more useful or better presented than is found on competing pages for various related keyword phrases.