Disclosure: “This article is personal opinion of research based on my experience of almost 20 years. There is no third party advertising on this page or monetised links of any sort. External links to third party sites are moderated by me. Disclaimer.” Shaun Anderson, Hobo
QUOTE: “Creating compelling and useful content will likely influence your website more than any of the other factors.” Google, 2017
There is no one particular way to create web pages that successfully rank in Google but you must ensure:
QUOTE: “that your content kind of stands on its own” John Mueller, Google 2015
If you have an optimised platform on which to publish it, high-quality content is the number 1 user experience area to focus on across websites to earn traffic from Google in 2018.
If you have been impacted by Google’s content quality algorithms, your focus should be on ‘improving content’ on your site rather than deleting content:
If you need help with optimising your website content, I can offer this as a service. If you want to learn how to achieve it yourself, read on.
Google Has Evolved and Content Marketing With It
QUOTE: “Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly.” Google
Google no longer works the way it once did, and as a result, this impacts a lot of websites built a certain way to rank high in Google. Google is a lot less forgiving these days.
Is the user going to be more satisfied with an exact match query on a low-quality website, OR a high-quality page closely related to the search query used, published by an entity Google trusts and rates highly?
Google is deciding more and more to go with the latter.
Optimisation, in 2018, must not get in the way of the main content of a page or negatively impact user experience.
When You Add Pages To A Website, “you spread the value across the whole set, and all of them get a little bit less.”
This statement from Google is incredibly important:
QUOTE: “Usually you can’t just arbitrarily expand a site and expect the old pages to rank the same, and the new pages to rank as well. All else being the same, if you add pages, you spread the value across the whole set, and all of them get a little bit less. Of course, if you’re building up a business, then more of your pages can attract more attention, and you can sell more things, so usually “all else” doesn’t remain the same. At any rate, I’d build this more around your business needs — if you think you need pages to sell your products/services better, make them.” John Mueller, Google 2018
When you expand the content on a website, you apparently dilute the ‘value’ you already have, “all else being the same“. This is a useful statement to have on record. I have long thought this was the direction we were heading in.
This certainly seems to be an ‘answer‘ to ‘domain authority‘ abuse and ‘doorway page‘ abuse. It is also going to make webmasters think twice about the type of “SEO friendly content” they publish.
If your pages are low-quality, and you add more pages of similar quality to your site, then the overall quality score Google assigns to your website is lowered still further. That score is probably based on content relevance and E-A-T (high-quality backlinks and expertise), and it is generally a measure of how well you deliver for Google visitors.
We already know that low-quality content on one part of a website can impact rankings for other keywords on other (even high-quality) pages on the same website. I go into this later.
When making ‘SEO-friendly’ content you need to ask yourself:
- Is this new ‘SEO-friendly’ content going to pick up links? Do it.
- Is this new content going to be useful for your current website visitors? Do it, carefully.
- Is this new content autogenerated across many pages? Think twice. Expect value to be diluted in some way.
- Does the site already have lots of low-quality content? Deal with that, first, by improving it. Get rid of stale and irrelevant content.
- Do I have the keywords I want to rank for on the site? Get those onto pages you already have, without keyword stuffing, and prepare to build a high-quality site PLUS high-quality content (as these are two different factors) to target specific keyword phrases.
I’ve advised clients for a long time that it is well worth investing in higher-quality, long-form, in-depth content for a few pages on a site. It is a great way to earn new links AND to add keywords to your site in a way that is useful to Google’s algorithms and human visitors, without the risk of adding low-quality pages to your site.
This concept is a bit like a leaky or reversed version of PageRank applied ON-SITE. In the original patent, I believe pages did not ‘lose’ PR (in a general sense); they ‘donated’ PR to other pages in the ‘set’. More pages used to result in more PR. If we think of ‘value’ as the new PR score, then the more pages you add to your website (all else remaining the same), the lower the quality score Google assigns to your site becomes.
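To make the dilution idea concrete, here is a toy model only (my own illustration, not Google’s actual algorithm): imagine a fixed pool of site ‘value’ spread evenly across every page, as in Mueller’s “all else being the same” scenario:

```python
# Toy model of on-site "value" dilution: a fixed pool of site value is
# spread evenly across every page. This illustrates the "all else being
# the same" scenario described above; it is NOT a real Google formula.

def per_page_value(site_value: float, page_count: int) -> float:
    """Value each page receives when the pool is split evenly."""
    return site_value / page_count

site_value = 100.0
print(per_page_value(site_value, 10))   # 10 pages share the pool
print(per_page_value(site_value, 100))  # 100 pages each get less
```

Doubling the page count halves the per-page share, which is the intuition behind “all of them get a little bit less” — unless, as Mueller notes, the new pages also grow the pool.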
You also need to ask yourself WHY Google would rank 10,000 or 100,000 of your pages for free, just because you tweaked a few keywords? Herein lies the answer to many ranking and indexing problems in 2018.
TAKEAWAY: Don’t add lots of pages to your site where the single purpose is to meet known Google keyword relevance signals. You may get de-valued for it. Do add content to your site that has a clear purpose for your users.
Successful Content Marketing Generates More Visitors From Google
Content marketing, for me, is THE most legitimate and best ‘link building’ strategy in 2018 you as a business can employ to build links to your site. I use content marketing for this very site and you can see in the graph above (from Majestic) the results from such content marketing activity.
The (new version of the) article you are reading right now is an example of ‘content marketing‘ and it has over the years earned me lots of backlinks. I’ve focused solely on using content marketing strategies since 2012 to drive 100% of the link building activity to this site, and I enjoy it.
It is simple, too, but more importantly, risk-free. When I want to build links I simply write an article on this blog.
I use the following strategy and simply ‘rinse and repeat’:
- Pick a topic relevant to the subject matter of my blog (which is Google SEO)
- Offer my advice on the topic based on my 20 years experience
- Include original research on the topic
- Curate the best up-to-date advice out there on the topic from other experts (citing my sources as I go)
- Publish my article on my website and to my newsletter (which 40,000 people are actively subscribed to via Feedburner)
- Automatically syndicate my blog post to Twitter, Facebook and Linkedin to boost social activity
And that’s basically it.
What you will find, if you do it right, and you are passionate about your subject matter and clear about the purpose of your blog, is that simply creating content on your website will generate natural backlinks to your site IF the content resonates with the target audience.
This strategy suits my aims and has worked for me. I am always learning – always consuming content. I do, however, aim to spend 10-20% of my time in ‘production’ mode, rather than ‘consumption’ mode.
Content marketing coupled with outreach (the act of actively promoting your content to those who might link to it) is a very powerful link building tactic indeed although I have managed to largely get by without that element on this blog. Check out my article blogging for SEO benefits.
I like making stuff, and when you make stuff people will link to it, and over time, that practice will inform Google of your level of E-A-T.
Is Your Content ‘Useful For Google Users‘?
QUOTE: “Our Webmaster Guidelines advise you to create websites with original content that adds value for users.” Google, 2018
Content quality is one area to focus on if you are to avoid demotion in Google.
QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2016
This article aims to cover the most significant challenges of writing ‘SEO-friendly’ text and web page copy for Google in 2018. High-quality content is one aspect of a high-quality page on a high-quality site.
SEO is NO LONGER about adding keywords to pages with 300 words of text. In fact, that practice can be toxic to a site.
Your content needs to be useful to Google users.
If you run an affiliate website or have content that appears on other sites, this is even more important.
QUOTE: “This is particularly important for sites that participate in affiliate programs. Typically, affiliate websites feature product descriptions that appear on sites across that affiliate network. As a result, sites featuring mostly content from affiliate networks can suffer in Google’s search rankings, because they do not have enough added value content that differentiates them from other sites on the web.” Google, 2018
QUOTE: “Google believes that pure, or “thin,” affiliate websites do not provide additional value for web users, especially (but not only) if they are part of a program that distributes its content across a network of affiliates. These sites often appear to be cookie-cutter sites or templates with the same or similar content replicated within the same site, or across multiple domains or languages. Because a search results page could return several of these sites, all with the same content, thin affiliates create a frustrating user experience.” Google, 2018
An example of a ‘thin-affiliate‘ site is a site where “product descriptions and reviews are copied directly from the original merchant without any original content or added value” and “where the majority of the site is made for affiliation and contains a limited amount of original content or added value for users”.
Google says that “Good affiliates add value, for example by offering original product reviews, ratings, navigation of products or categories, and product comparisons“.
Google offers us the following advice when dealing with sites with low-value content:
QUOTE: “Affiliate program content should form only a minor part of the content of your site if the content adds no additional features.”
QUOTE: “Ask yourself why a user would want to visit your site first rather than visiting the original merchant directly. Make sure your site adds substantial value beyond simply republishing content available from the original merchant.”
QUOTE: “The more targeted the affiliate program is to your site’s content, the more value it will add and the more likely you will be to rank better in Google search results.”
QUOTE: “Keep your content updated and relevant. Fresh, on-topic information increases the likelihood that your content will be crawled by Googlebot and clicked on by users.”
Is The Main Content (MC) On Your Page ‘High-Quality’?
QUOTE: “(Main CONTENT) is (or should be!) the reason the page exists.” Google Search Quality Evaluator Guidelines 2017
What Is Google Focused On?
Google is concerned with the PURPOSE of a page, the MAIN CONTENT (MC) of a page, the SUPPLEMENTARY CONTENT of a page and HOW THAT PAGE IS monetised, and if that monetisation impacts the user experience of consuming the MAIN CONTENT.
Webmasters need to be careful when optimising a website for CONVERSION first if that gets in the way of a user’s consumption of the main content on the page.
Google also has a “Page Layout Algorithm” that demotes pages with a lot of advertising “above the fold” or that forces users to scroll past advertisements to get to the Main Content of the page.
High-quality supplementary content should “(contribute) to a satisfying user experience on the page and website.” and it should NOT interfere or distract from the MC.
Google says, “(Main CONTENT) is (or should be!) the reason the page exists.” so this is probably the most important part of the page, to Google.
Is The Main Content (MC) On Your Page ‘Unique Content’, ‘Duplicated Content’, ‘Copied Content’, ‘Spun Content’ or ‘Automated Content’?
QUOTE: “Duplicated content is often not manipulative and is commonplace on many websites and often free from malicious intent. Copied content can often be penalised algorithmically or manually. Duplicate content is not penalised, but this is often not an optimal set-up for pages, either. Be VERY careful ‘spinning’ ‘copied’ text to make it unique!” Hobo, 2018
If “copied content” is a negative ranking signal, then “unique content” is going to perform better in Google in 2018. If duplicate content is filtered, then synonymised or spun content is going to be penalised.
Unique content on a site is not filtered out of SERPs like duplicate content or penalised like copied content is.
Copied content can be a low-quality indicator. Boilerplate (especially spun) text can be another even worse low-quality indicator.
Google clearly says that the practice of making your text more ‘unique’, using low-quality techniques like adding synonyms and related words is:
QUOTE: “probably more counter productive than actually helping your website” John Mueller, Google
If you are creating doorway pages using autogenerated content you are going to struggle to get them indexed fully.
If your website is tarnished with these practices – it is going to be classed ‘low-quality’ by some part of the Google algorithm:
- If all you have on your page are indicators of low-quality – you have a low-quality page in 2018 – full stop.
- If your entire website is made up of pages like that, you have a low-quality website.
- If you have manipulative backlinks, then that’s a recipe for disaster.
How To Find Important Keywords To Include On Your Web Page
If users are satisfied with a page and the page outperforms competing pages, then Google will probably be satisfied. You need to make a page relevant to a keyword phrase in the first place and to do that you will need to know which keyword phrase is most valuable to you on a page-to-page basis. You still need to do keyword research to identify what to write about and what type of content to create.
Keyword research is going to be important in optimising high-quality pages or planning a content marketing strategy.
How Often Should Important Keywords Appear On Your Page?
You only need to mention related terms, and sometimes even Head Terms, ONCE in page content to have an impact on relevance and rank for lots of different long-tail key phrases. I still repeat phrases a few times on any page.
A perfect keyword density is a myth, but a demotion for stuffing keywords irresponsibly into text is not.
Keyword Stuffing is clearly against Google guidelines – so I avoid it in 2018.
Instead of thinking how many times to repeat keyword phrases, I’ve had success thinking the opposite over the last few years. That is, how few times can I repeat a primary key phrase on a page to rank it in Google.
It’s not how often to repeat keyword phrases; it is where they are placed throughout the document, and how prominent they are.
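As a rough sanity check on placement and repetition, you can script a quick look at how often a phrase appears and how early it first occurs. The function, metric names, and the ‘prominence’ scoring below are my own illustration, not anything Google publishes:

```python
# Quick keyword placement check: count, density and a simple "prominence"
# score (earlier first occurrence = higher) for a phrase in page copy.
# The metrics here are illustrative only, not Google's.

def keyword_stats(text: str, phrase: str) -> dict:
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    total = len(words)
    count = 0
    first_pos = None
    # Slide a window the length of the phrase across the word list
    for i in range(total - len(phrase_words) + 1):
        if words[i:i + len(phrase_words)] == phrase_words:
            count += 1
            if first_pos is None:
                first_pos = i
    density = count / total if total else 0.0
    # 1.0 means the phrase opens the copy; 0.0 means it never appears
    prominence = 1.0 - (first_pos / total) if first_pos is not None else 0.0
    return {"count": count, "density": density, "prominence": prominence}

copy = "SEO tutorial for beginners. This SEO tutorial covers content quality."
print(keyword_stats(copy, "seo tutorial"))
```

A script like this only flags obvious stuffing or a buried phrase; it cannot tell you whether the copy satisfies the searcher.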
How Many Words Can You Optimise Your Page For?
This is going to depend on too many things to give you a general number.
You should probably not think like that, too much, though. A page should rank for the Head Terms (which are in the Title Tag) and the page copy itself should be optimised for as many other closely related keyword phrases as possible (all part of the natural long-tail of the Head Term).
Can You Just Write Naturally and Rank High in Google?
Yes, you must write naturally (and succinctly) in 2018, but if you have no idea of the keywords you are targeting, and no expertise in the topic, you will be left behind by those who do.
You can just ‘write naturally’ and still rank, albeit for fewer keywords than you would have if you optimised the page.
There are too many competing pages targeting the top spots not to optimise your content.
Naturally, how much text you need to write, how much you need to work into it, and where you ultimately rank, is going to depend on the domain reputation of the site you are publishing the article on.
Do You Need Lots of Text To Rank Pages In Google?
How Much Text Do You Need To Write For Google?
How much text do you put on a page to rank for a certain keyword?
Well, as in so much of SEO theory and strategy, there is no optimal amount of text per page, and it is going to differ, based on the topic, and content type, and SERP you are competing in.
Instead of thinking about the quantity of the Main Content (MC) text, you should think more about the quality of the content on the page. Optimise this with searcher intent in mind.
There is no minimum amount of words or text to rank in Google. I have seen pages with 50 words out-rank pages with 100, 250, 500 or 1000 words. Then again I have seen pages with no text rank on nothing but inbound links or other ‘strategy’. Google is a lot better at hiding away those pages in 2018, though.
At the moment, I prefer long-form pages and a lot of text, still focused on a few related keywords and keyphrases to a page. Useful for long tail key phrases and easier to explore related terms.
Every site is different. Some pages, for example, can get away with 50 words because of a good link profile and the domain it is hosted on. For me, the important thing is to make a page relevant to a user’s search query.
I don’t care how many words I achieve this with and often I need to experiment with a site I am unfamiliar with. After a while, you get an idea how much text you need to use to get a page on a certain domain into Google.
One thing to note: the more text you add to the page, as long as it is unique, keyword-rich and relevant to the topic, the more visitors that page is likely to earn from Google.
There is no optimal number of words on a page for placement in Google.
Every website – every page – is different from what I can see. Don’t worry too much about word count if your content is original and informative. Google will probably reward you on some level – at some point – if there is lots of unique text on all your pages.
Google said there is no minimum word count when it comes to gauging content quality.
QUOTE: “There’s no minimum length, and there’s no minimum number of articles a day that you have to post, nor even a minimum number of pages on a website. In most cases, quality is better than quantity. Our algorithms explicitly try to find and recommend websites that provide content that’s of high quality, unique, and compelling to users. Don’t fill your site with low-quality content, instead work on making sure that your site is the absolute best of its kind.” John Mueller Google, 2014
However, the quality rater’s guide does state:
6.2 Unsatisfying Amount of Main Content
Some Low quality pages are unsatisfying because they have a small amount of MC for the purpose of the page. For example, imagine an encyclopedia article with just a few paragraphs on a very broad topic such as World War II. Important: An unsatisfying amount of MC is a sufficient reason to give a page a Low quality rating.
Does Your Page Content Satisfy User Search Intentions?
User search intent is a way marketers describe what a user wants to accomplish when they perform a Google search.
SEOs have understood user search intent to fall broadly into the following categories and there is an excellent post on Moz about this.
- Transactional – The user wants to do something like buy, signup, register to complete a task they have in mind.
- Informational – The user wishes to learn something
- Navigational – The user knows where they are going
The Google human quality rater guidelines modify these into simpler constructs: ‘Know’ (and ‘Know Simple’), ‘Do’, ‘Website’ and ‘Visit-in-person’ queries.
SO – how do you optimise for all this?
You could rely on old-school SEO techniques, but Google doesn’t like thin pages in 2018, and you need higher quality unnatural links to power low-quality sites these days. That is all a risky investment.
Google has successfully made that way forward a minefield for smaller businesses in 2018.
A safer route with a guaranteed ROI, for a real business that can’t risk spamming Google, is to focus on the user satisfaction signals Google might be rating favourably.
You do this by focusing on meeting exactly the intent of an individual keyword query.
The how, why and where of a user’s searches are going to be numerous and ambiguous, and this is an advantage for the page that balances them out better than competing pages in the SERPs.
High-quality copywriting is not an easy ‘ask’ for every business, but it is a tremendous leveller.
Anyone can teach what they know and put it on a website if the will is there.
Some understand the ranking benefits of in-depth, curated content, for instance, that helps a user learn something. In-depth pages or long-form content is a magnet for long-tail key phrases.
High-quality text content of any nature is going to do well, in time, and copywriters should rejoice.
Copywriting has never been more important.
Offering high-quality content is a great place to start on your site.
It’s easy for Google to analyse and rate, and it is also a sufficient barrier to entry for most competitors (at least, it was in the last few years).
Google is looking for high-quality content:
“High quality pages and websites need enough expertise to be authoritative and trustworthy on their topic.”
..or if you want it another way, Google’s algorithms target low-quality content.
But what if you can’t write to satisfy these KNOW satisfaction metrics?
Luckily – you do not need lots of text to rank in Google.
When a user is actively seeking your page out and selects your page in the SERP, they are probably training Google AI to understand this is a page on a site that satisfies the user intent. This user behaviour is where traditional media and social media promotion is going to be valuable if you can get people to search your site out. This is one reason you should have a short, memorable domain if you can get one.
So, users should be using Google to seek your site out.
‘Do’ Beats ‘Know.’
If you can’t display E.A.T. in your writing, you can still rank if you satisfy users who do search that query.
Last year I observed Google rank a page with 50 words of text on it instead of a page with 5000 words and lots of unique images that target the same term on the same domain.
While there might be something at fault with the ‘optimised’ 5000-word page I have overlooked, the main difference between the two pages was time spent on the page and task completion ‘rate’.
I’ve witnessed Google flip pages on the same domain for many reasons, but it did get me thinking that perhaps Google thinks users are more satisfied with the DO page (an online tool), with its better task completion metrics, than the KNOW page (a traditional informational page).
In the end, I don’t need to know why Google is flipping the page, just that it is.
So that means that you don’t always need ‘text-heavy content’ to rank for a term.
You never have of course.
I only offer one example I’ve witnessed Google picking the DO page over the KNOW page, and it surprised me when it did.
It has evidently surprised others too.
There is a post on Searchmetrics that touches on pages with only a little text-content ranking high in Google:
QUOTE: “From a classical SEO perspective, these rankings can hardly be explained. There is only one possible explanation: user intent. If someone is searching for “how to write a sentence” and finds a game such as this, then the user intention is fulfilled. Also the type of content (interactive game) has a well above average time-on-site.” SearchMetrics 2016
That’s exactly what I think I have observed, too, though I wouldn’t ever say there is only ‘one possible explanation‘ to anything to do with Google.
For instance – perhaps other pages on the site help the page with no content rank, but when it comes to users being satisfied, Google shows the page with better usage statistics instead, because it thinks it is a win for everyone involved.
This is speculation, of course, and I have witnessed Google flipping pages in SERPs if they have a problem with one of them, for instance, for years.
This was news in January 2016, but I had seen it myself some time before that. It isn’t entirely ‘new’ in the wild, but it might be more noticeable in more niches in 2018.
How Much Text Do You Need To Rank?
None, evidently, if you can satisfy the query in an unusual manner without the text.
How To Optimise Content to Meet User Intent & Satisfaction
QUOTE: “Basically you want to create high-quality sites following our webmaster guidelines, and focus on the user, try to answer the user, try to satisfy the user, and all eyes will follow.” Gary Illyes, Google 2016
When it comes to writing SEO-friendly text for Google, we must optimise for user intent, not simply what a user typed into Google.
Google will send people looking for information on a topic to the highest quality, relevant pages it knows about, often BEFORE it relies on how Google ‘used‘ to work e.g. relying on finding near or exact match instances of a keyword phrase on any one page, regardless of the actual ‘quality’ of that page.
Google is constantly evolving to better understand the context and intent of user behaviour, and it doesn’t mind rewriting the query used to serve high-quality pages to users that more comprehensively deliver on user satisfaction e.g. explore topics and concepts in a unique and satisfying way.
Focus on ‘Things’, Not ‘Strings’
QUOTE: “we’ve been working on an intelligent model—in geek-speak, a “graph”—that understands real-world entities and their relationships to one another: things, not strings” Google 2012
Google is better at working out what a page is about, and what it should be about to satisfy the intent of a searcher, and it isn’t relying only on keyword phrases on a page to do that.
Google has a Knowledge Graph populated with NAMED ENTITIES and in certain circumstances, Google relies on such information to create SERPs.
Google has plenty of options when rewriting the query in a contextual way, based on its own synonym knowledge, what you and other people searched for previously, who you are, how you searched and where you are at the time of the search.
If you focus on ‘strings’ you can end up in keyword stuffing territory.
What Is EAT? (“Expertise, Authoritativeness, Trustworthiness)
Google aims to rank pages where the author has some demonstrable expertise or experience in the subject matter they are writing about. These ‘quality ratings’ (performed by human evaluators) are based on E-A-T (also written E.A.T. or EAT), which is simply the ‘Expertise, Authoritativeness, Trustworthiness‘ of the ‘Main Content of a page’.
QUOTE: “Expertise, Authoritativeness, Trustworthiness: This is an important quality characteristic. …. Important: Lacking appropriate EAT is sufficient reason to give a page a Low quality rating.” Google Search Quality Evaluator Guidelines 2017
QUOTE: “The amount of expertise, authoritativeness, and trustworthiness (EAT) that a webpage/website has is very important. MC quality and amount, website information, and website reputation all inform the EAT of a website. Think about the topic of the page. What kind of expertise is required for the page to achieve its purpose well? The standard for expertise depends on the topic of the page.” Google Search Quality Evaluator Guidelines 2017
Who links to you can inform the E-A-T of your website.
QUOTE: “I asked Gary (Illyes from Google) about E-A-T. He said it’s largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that’s good. He recommended reading the sections in the QRG on E-A-T as it outlines things well.” Marie Haynes, Pubcon 2018
Google is still a ‘link-based’ search engine ‘under-the-hood’ but it takes so much more to stick a website at the top of search engine results pages (SERPs) in 2018 than it used to.
What Topic Are You Building Machine Identifiable Expertise in?
People don’t know what to blog about, so often, a blog becomes a jumbled mess of un-directed, thin content. You prevent this from happening going forward by knowing what topic you are building out on your website. If everything you publish is related, you are probably building authority in that niche.
If all you are building is content that is immediately out of date and found on other more authoritative websites, you are toast. Keep that stuff on social media, or minimise its footprint on your blog.
EXAMPLE: You could write 30 unrelated blog posts. Once they were all published, you could combine them all into a list that was relevant to a topic or entity. What if these 30 seemingly unlinked posts, once combined, gave you the 30 things Tom Cruise has spent his money on? All of a sudden, this content becomes relevant in an entirely different way than it was when it was packaged separately in an unconnected and uninteresting way.
That is a wild example. I am just trying to illustrate you don’t build in-depth content in one go unless you have a team of ghostwriters. Success is often built upon a lot of small edits to pages over time.
If you haven’t been building a topical identity on your site, you should start now.
Topic & Concept Optimisation
This is how I categorise things (and have done for many years).
- Is your website content about a “topic” related to your core business?
- Are particular pages on your site about ‘sub-topics’ of the main ‘topic’?
- Do the pages explain the ‘sub-topic’ in detail, including ‘concepts’ within the main sub-topic area?
- Does your content explain concepts in a way that demonstrates experience or expertise?
- Does the page satisfy the user intent better than competing pages?
- Do multiple pages demonstrate expertise or experience in in-depth ways?
- Does this content have independent links from other ‘topic’ related authorities and entities?
That kind of ‘old-SEO’ still works fine.
Google is measuring how satisfied users are with the pages on your website, in comparison with competing pages in the SERP.
Are People Looking For Your Content On Google?
Brand Searches Imply Trust & Quality
A patent granted to Google tells us explicitly what features it was looking for in a site that might seem to indicate that the site was a quality site.
It tells us:
QUOTE: “The score is determined from quantities indicating user actions of seeking out and preferring particular sites and the resources found in particular sites. A site quality score for a particular site can be determined by computing a ratio of a numerator that represents user interest in the site as reflected in user queries directed to the site (e.g. ‘Hobo’) and a denominator that represents user interest in the resources found in the site as responses to queries of all kinds (e.g. ‘SEO tutorial’). The site quality score for a site can be used as a signal to rank resources, or to rank search results that identify resources, that are found in one site relative to resources found in another site.” Google Patent
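Reading the patent literally (and simplifying heavily — the function and variable names below are mine, and the counts are invented for illustration), the score is just a ratio of site-seeking, branded queries to all queries that surface the site’s resources:

```python
# Sketch of the patent's site quality ratio: user interest in the site
# itself (branded, site-seeking queries like "hobo") divided by user
# interest in its resources for queries of all kinds (like "seo
# tutorial"). A simplified reading of the patent text, not Google's
# actual implementation.

def site_quality_score(branded_query_count: int, all_query_count: int) -> float:
    """Ratio of branded (numerator) to all (denominator) query interest."""
    if all_query_count == 0:
        return 0.0
    return branded_query_count / all_query_count

# e.g. 400 searches seeking the site by name vs 10,000 queries of all
# kinds where the site's resources appeared as responses:
print(site_quality_score(400, 10_000))
```

Under this reading, growing the numerator — people actively searching your brand out — raises the score, which is why the goals below emphasise brand-building.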
A forward-thinking strategy must take this into consideration. This understanding of how Google is moving to rank pages in future is not something that is going away anytime soon.
Note, also, that even if the above patent is in operation, or something like it, SERP modification will still likely be influenced by geographic location, to give you only one example.
- GOAL – Meet “user actions of seeking out and preferring particular sites” Have a brandable website or section of the website that would suit this type of search.
- GOAL – Meet “resources found in particular sites” – Generate content that meets this goal. In-depth content that will meet keyword requirements. Tools that help perform a user requirement, keeping in mind this will be judged “relative to resources found on another site” e.g. your competition and their competing pages.
- GOAL – Create secondary content that transforms gracefully to other media with an intent that informs the user to your service or product. Videos, Animations, Infographics. Other relevant media.
- GOAL – Improve brand signal from a technical SEO point of view. Optimisation and integration tasks with Google properties, for example (e.g. Google Local Business).
- GOAL – Improve user experience. Optimise the search experience.
- GOAL – Conversion Optimisation. Critical to try and convert more sales from the current traffic.
Click Metrics and Task Completion Rate are LIKELY Ranking Modifiers
Are people actively seeking your website out, and when they do land there, are they satisfied?
e.g. using the Hobo site as an example: is anyone looking for “Hobo SEO tutorial”, and when they find it, are they completing the task? Are they satisfied, do they hang around, do they share it? Is the page fit for purpose? Or do they immediately go back to Google and click on another result?
If there is a high satisfaction rate when people search for ‘Hobo SEO tutorial’ compared to averages for ‘SEO tutorial’, then perhaps Google would be confident in actually showing that page in the mix for the term ‘SEO tutorial’ itself e.g. you would not need links. If there is a low satisfaction rate for such a keyword, then the page can be deemed ‘low-quality’ when compared to better-performing pages on competing sites.
When you satisfy a query in terms of quality and click-through satisfaction – or make Google look good – you can actually ‘win’ branded recognition for a valuable term.
We had the following brand highlight in the UK (for a few months, after our own content strategy hit the front page of Reddit last year).
I used to think PageRank was like having a ticket to the party. It still might be, on some level, but keywords on the page now play a similar role. Keywords used to be everything, even a few years ago – but that has been at least obfuscated.
We tested this recently and were quite surprised by the difference e.g. you can’t simply add new words to a high-authority page and expect it to rank just because you have ‘domain authority’ – not if there are a lot of higher-quality pages targeting the term.
Site quality scores are one thing, but these site scores are tied to a KEYWORD query at EVERY TURN (or they are supposed to be). If you mention a SPECIFIC KEYWORD in a multi-keyword search, the HIGHEST QUALITY page for that SPECIFIC KEYWORD will often be returned, even with an irrelevant word thrown in (but not two or more, apparently, as I noted in recent tests). Google is built on previous incarnations of how it mapped the original web e.g. for the long tail, you still NEED keywords on a page. Ergo – the more relevant UNIQUE KEYWORDS (not key phrases) on a page, the more long-tail searches it WILL appear for.
e.g. you can have a very high-quality site, but if there is no proper keyword usage throughout the site, it won’t rank for any long-tail searches – and long-tail searches are how you are found in Google if you have a new site.
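To make the unique-words point concrete, here is a rough editorial heuristic (my own sketch, and `unique_terms` is an invented helper – this is not how Google works) for counting the distinct terms a page contains:

```python
import re

def unique_terms(text):
    """Return the set of distinct lowercase words on a page - a crude proxy
    for how many long-tail keyword combinations the page could match."""
    return set(re.findall(r"[a-z']+", text.lower()))

thin_page = "seo tutorial seo tutorial seo tutorial"
deep_page = "seo tutorial covering keyword research, content quality, links and user experience"

# Repetition adds no new terms; in-depth copy widens long-tail coverage.
assert len(unique_terms(thin_page)) < len(unique_terms(deep_page))
```

The design point is simply that repeating a phrase adds nothing to a page's long-tail footprint, while genuinely in-depth copy naturally introduces the unique words that long-tail queries contain.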
Google might still be interested in the reputation of individual authors.
Google Authorship (a push by Google to identify and highlight expert authors using special markup code in your HTML) used to be useful in a number of ways, not limited to achieving increased visibility in SERPs for your writing.
Google Authorship pictures were completely removed from the search results in the UK in June 2014 – but I still have my redundant Google Authorship markup in place. SEOs expect authorship analysis, in some shape or form, to be impactful in Google ranking (it is in Google In-depth Articles, we have been told).
I imagine Google has moved beyond this in web search now, but it is probably still interested in identifying expert authors. We know it uses, at least, manual raters to evaluate E.A.T. (Expertise, Authority and Trust).
There are many signals that no doubt help Google work this out in 2018, perhaps through an author’s semantic association with certain entities and topics.
- TASK – [BRAND] + [PRIMARY KEYWORD] GOALS
- PROJECT GOAL – Get customers to search for your ‘brand’ and the ‘primary keyword’.
- PROJECT TASK – Build an internal, information-based page optimised for the ‘primary keyword’, with as many unique words on it as possible. Cite references. Have an ‘Author’ with a link to a bio that illustrates E.A.T. (Expertise, Authority and Trust). Have a ‘Last Updated’ date on the page (specific to that page). Keep it updated.
- PROJECT GOAL – Keep pages targeting the primary keyword to a minimum on the site. Only high-quality pages should target the primary keyword.
- PROJECT GOAL – Get legitimate reviews. Google Local Business reviews are a safe bet. Do not ASK for positive reviews on any third-party review site… in any obvious manner.
- PROJECT TASK – Determine primary keywords to target with resources on site
How Long Does It Take To Write Seo-Friendly Copy?
Embarrassingly long, in some cases. I am not a professionally trained copywriter, though. I know how to rank pages, and the type of content it takes. The truth is, in some verticals, you are writing from scratch, for ONE PAGE (as it needs to be unique), the same amount of text that would have optimised an entire 10,000-page website just a few years ago. Why? Because your competition will do it, and Google will send them traffic.
I do not see many people pointing that out.
Ranking in Google organic listings is an investment in search engine optimisation AND, at least, one other multi-discipline (of which, there are many); creative copywriting and teaching, or design or engineering, for instance. Copywriting, if you are competing in an informational SERP; Teaching, if you are trying to meet Google’s E.A.T. requirements; design and engineering if you are focusing on satisfying specific user intent (e.g. offering tools, instead of content, for users).
These costs are, in 2018, relatively prohibitive for smaller businesses (who can always choose Google Adwords), but Google is evidently giving any individual, rather than the business, the opportunity to rank in certain niches – any individual who has the drive, knowledge and time to invest in some machine-detectable, time-spent endeavour that satisfies users better than competing pages.
Businesses can still, of course, compete.
Note the importance of expertise in a chosen subject matter:
6.3 Lacking Expertise, Authoritativeness, or Trustworthiness (E-A-T)
QUOTE: “Some topics demand expertise for the content to be considered trustworthy. YMYL topics such as medical advice, legal advice, financial advice, etc. should come from authoritative sources in those fields. Even everyday topics, such as recipes and housecleaning, should come from those with experience and everyday expertise in order for the page to be trustworthy. You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.”
Content like this will not come cheap in the future.
Long Tail Traffic Is Hiding In Long Form Content
Google didn’t kill the long tail of traffic, though since 2010 they have been reducing the amount of such traffic they will send to certain sites.
In part, they shifted a lot of long tail visitors to pages that Google thought may satisfy their user query, RATHER than just rely on particular exact and near match keywords repeated on a particular page.
At the same time, Google was hitting old school SEO tactics and particularly thin or overlapping pages. So – an obvious strategy and one I took was to identify the thin content on a site and merge it into long-form content and then rework that to bring it all up-to-date.
Long-form content is a magnet for long tail searches and helps you rank for much more popular (head) keywords. The more searchers and visitors you attract, the more you can ‘satisfy’ and the better chance you can rank higher in the long run.
Do you NEED long-form pages to rank?
No – but it can be very useful as a base to start a content marketing strategy if you are looking to pick up links and social media shares.
And be careful. The longer a page is, the more you can dilute it for a specific keyword phrase, and it’s sometimes a challenge to keep it updated.
Google seems to penalise stale or unsatisfying content.
How To SEO In-depth Content
From a strategic point of view, if you can explore a topic or concept in an in-depth way you must do it before your competition. Especially if this is one of the only areas you can compete with them on.
Here are some things to remember about creating topic-oriented in-depth content:
- In-depth content needs to be kept updated. Every six months, at least. If you can update it a lot more often than that, it should be updated more often
- In-depth content can reach tens of thousands of words, but the aim should always be to make the page as concise as possible, over time
- In-depth content can be ‘optimised’ in much the same way as content has always been optimised
- In-depth content can give you authority in your topical niche
- Pages must MEET THEIR PURPOSE WITHOUT DISTRACTING ADS OR CALLS TO ACTION. If you are competing with an information page – put the information FRONT AND CENTRE. Yes – this impacts negatively on conversions in the short term. BUT – these are the pages Google will rank high. That is – pages that help users, first and foremost, complete WHY they are on the page (what you want them to do once you get them there needs to be a secondary consideration when it comes to Google organic traffic).
- You need to balance conversions with user satisfaction if you want to rank high in Google in 2018.
Optimising For Topics And Concepts
Old SEO was, to a large extent, about repeating text. New SEO is about user satisfaction.
Google’s punishment algorithms designed to target SEO are all over that practice these days. And over a lot more, besides, in 2018.
- Google’s looking for original text on a subject matter that explores the concept that the page is about, rather than meets keyword relevance standards of yesteryear.
- If your page rings these Google bells in your favour, Google knows your page is a good resource on anything to do with a particular concept – and will send visitors to it after invisibly rewriting the actual search query that the user made. Google is obfuscating the entire user intent journey.
For us at the receiving end, it all boils down to writing content that meets a specific user intent and does it better than competing pages.
We are not trying to beat Google or RankBrain, just the competition.
Pages looking to, genuinely, help people are a good user experience. At the page level, satisfying informational search intent is still going to be about keyword analysis at some level.
SEO is about understanding topics and concepts as search engines try to.
A well-optimised topic/concept oriented page that meets high relevance signals cannot really fail to pick up search traffic and, if it’s useful to people, pick up UX signals that will improve rankings in Google (I include links in that).
Stale Content Can Negatively Impact Google Rankings
Copywriters and researchers rejoice!
Here is the bad news about ‘quality content’. Even it, in-depth articles included, decays and becomes stale over time.
This is, however, natural.
You cannot expect content to stay relevant and perform at the same levels six months or a year down the line.
Even high-quality content can lose positions to better sites and better, more up-to-date content.
In any competitive niche, you are going to be up against this kind of content warfare – and this is only going to get more competitive.
You must, in 2018, be investing in professionally created website content. You must create and keep updated unique, informative, trustworthy and authoritative editorial content.
Once you have keywords in place, you must focus on improving user experience and conversion optimisation.
If you are developing ‘evergreen’ in-depth articles, they too will need to be periodically updated.
Does Google Prefer Fresh Content?
Most of the time, yes.
Google even has “initiatives” for models that determine if some keyword searches are better served with results based on Q.D.F (query deserves freshness for that keyword phrase).
QUOTE: Mr. Singhal introduced the freshness problem, explaining that simply changing formulas to display more new pages results in lower-quality searches much of the time. He then unveiled his team’s solution: a mathematical model that tries to determine when users want new information and when they don’t. (And yes, like all Google initiatives, it had a name: QDF, for “query deserves freshness.”) New York Times h/t Search Engine Land
We know that Google likes ‘fresh content’ if the query deserves fresh results. Not all businesses need ‘freshness’, we are told. We know also that Google changed their infrastructure to better provide users with “50 percent fresher results”
QUOTE: “So why did we build a new search indexing system? Content on the web is blossoming. It’s growing not just in size and numbers but with the advent of video, images, news and real-time updates, the average webpage is richer and more complex. In addition, people’s expectations for search are higher than they used to be. Searchers want to find the latest relevant content and publishers expect to be found the instant they publish.” Google, 2010
The real risk to your rankings could be content staleness:
“Stale content refers to documents that have not been updated for a period of time and, thus, contain stale data (documents that are “no longer updated, diminished in importance, superceded by another document“). The staleness of a document may be based on: document creation date, anchor growth, traffic, content change, forward/back link growth, etc. The Google patent explains how they can spot the stale content using 4 factors: Query-based factor; Link-based criteria; Traffic – based criteria; User-behavior-based criteria.” Search Engine Journal, 2008
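Purely as an illustrative sketch (the weights and thresholds here are invented, and `staleness_score` is a hypothetical helper, not anything from the patent), those four criteria might be combined like so:

```python
from datetime import date

def staleness_score(last_updated, new_inbound_links,
                    monthly_traffic_trend, avg_dwell_seconds, today):
    """Toy staleness heuristic loosely mirroring the patent summary's four
    criteria: content change (age), link growth, traffic trend and user
    behaviour. Higher score = more likely stale. Weights are invented."""
    age_years = (today - last_updated).days / 365.0
    score = age_years                                    # content-change criterion
    score += 1.0 if new_inbound_links == 0 else 0.0      # link-based criterion
    score += 1.0 if monthly_traffic_trend < 0 else 0.0   # traffic-based criterion
    score += 1.0 if avg_dwell_seconds < 10 else 0.0      # user-behaviour criterion
    return score

fresh = staleness_score(date(2018, 1, 1), 5, 0.2, 90.0, today=date(2018, 6, 1))
stale = staleness_score(date(2014, 1, 1), 0, -0.3, 5.0, today=date(2018, 6, 1))
assert stale > fresh
```

The practical point survives even if the real algorithm looks nothing like this: a page that is old, attracts no new links, loses traffic and sends visitors away quickly exhibits every staleness signal at once.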
The real risk of stale content is that out-dated content is a poor user experience for visitors. If you have stale content, a competitor with better-maintained, up-to-date content that pleases users will potentially get more traffic from Google.
This is also in the guidelines for affiliate websites:
QUOTE: “Keep your content updated and relevant. Fresh, on-topic information increases the likelihood that your content will be crawled by Googlebot and clicked on by users.” Webmaster Guidelines: Affiliate Programs, 2018
Updating content on your site may signal to Google that your site is being maintained, and have second-order benefits, too.
Telling users when website copy was last edited or last updated is a probable ‘good user experience’, for instance.
Users also prefer relevant content that is actually up-to-date.
Consider that Google has patents to automatically detect many aspects of poor user experience (as Google defines it).
The sensible thing would be to keep your content ‘fresh’ in some way, by either updating the content or improving the user experience.
Low-Quality Content On Part Of A Website Can Affect Rankings For More Important Keywords On The Same Website
Moz has a good video on the Google organic quality score theory. You should watch it. It goes into a lot of stuff I (and others) have been blogging for the last few years, and some of it is relevant to the audits I produce, an example of which you can see here (Alert! 2mb in size!).
One thing that could have been explained better in the video was that Moz has topical authority worldwide for ‘Google SEO’ terms, hence why they can rank so easily for ‘organic quality score’.
But the explanation of the quality score is a good introduction for beginners.
I am in the camp that believes this organic quality score has been in place for a long time, and that more and more sites are feeling the results of it.
This is also quite relevant to a question answered last week in the Google Webmaster Hangout which was:
“QUESTION – Is it possible that if the algorithm doesn’t particularly like our blog articles as much that it could affect our ranking and quality score on the core Content?”
resulting in an answer:
“ANSWER: JOHN MUELLER (GOOGLE): Theoretically, that’s possible. I mean it’s kind of like we look at your web site overall. And if there’s this big chunk of content here or this big chunk kind of important wise of your content, there that looks really iffy, then that kind of reflects across the overall picture of your website. But I don’t know in your case, if it’s really a situation that your blog is really terrible.”
Google has introduced (at least) a ‘perceived’ risk to publishing lots of lower-quality pages on your site, in an effort to curb production of old-style SEO-friendly content based on manipulating early search engine algorithms.
We are dealing with algorithms designed to target old style SEO – that focus on the truism that DOMAIN ‘REPUTATION’ plus LOTS of PAGES equals LOTS of Keywords equals LOTS of Google traffic.
A big site can’t just get away with publishing LOTS of lower quality content in the cavalier way they used to – not without the ‘fear’ of primary content being impacted and organic search traffic throttled negatively to important pages on the site.
Google is very probably using user metrics in some way to determine the ‘quality’ of your site.
QUESTION – “I mean, would you recommend going back through articles that we posted and if there’s ones that we don’t necessarily think are great articles, that we just take them away and delete them?”
The reply was:
JOHN MUELLER: I think that’s always an option. Yeah. That’s something that – I’ve seen sites do that across the board, not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.
Deleting content is not always the optimal way to handle MANY types of low-quality content – far from it, in fact. Nuking it is the last option unless the pages really are ‘dead‘ content.
Any clean-up should go hand in hand with giving Google something it is going to value on your site e.g. NEW high-quality content:
The final piece of advice is interesting, too.
It gives us an insight into how Google might actually deal with your site:
JOHN MUELLER: “Then maybe that’s something where you can collect some metrics and say, well, everything that’s below this threshold, we’ll make a decision whether or not to significantly improve it or just get rid of it.”
You can probably rely on Google to ‘collect some metrics and say, well, everything that’s below this threshold, we’ll…’ (insert punishment spread out over time).
Google probably has a quality score of some sort, and your site probably has a rating for whatever that score applies to (and if you get any real traffic from Google, often a manual rating).
If you have a big site, certain parts of your site will be rated more useful than others to Google.
Improving the quality of your content certainly works to improve traffic, as does intelligently managing your content across the site. Positive results from this process are NOT going to happen overnight. I’ve blogged about this sort of thing for many years, now.
Google is getting better at rating sites that meet its guidelines for ‘quality’ and ‘user satisfaction’ – I am putting such things in quotes here to highlight the slightly Orwellian doublespeak we have to work with.
What Are The High-Quality Characteristics of a Web Page?
The following are examples of what Google calls ‘high-quality characteristics’ of a page and should be remembered:
- “A satisfying or comprehensive amount of very high-quality” main content (MC)
- Copyright notifications up to date
- Functional page design
- Page author has Topical Authority
- High-Quality Main Content
- Positive Reputation or expertise of website or author (Google yourself)
- Very helpful SUPPLEMENTARY content “which improves the user experience”
- Google wants to reward ‘expertise’ and ‘everyday expertise’ or experience so you should make this clear on your page
- Accurate information
- Ads can be at the top of your page as long as they do not distract from the main content on the page
- Highly satisfying website contact information
- Customised and very helpful 404 error pages
- Evidence of expertise
- Attention to detail
If Google can detect investment in time and labour on your site – there are indications that they will reward you for this (or at least – you won’t be affected when others are, meaning you rise in Google SERPs when others fall).
What Characteristics Do The Highest Quality Pages Exhibit?
You obviously want the highest quality ‘score’ but looking at the guide that is a lot of work to achieve.
Google wants to rate you on the effort you put into your website, and how satisfying a visit is to your pages.
- QUOTE: “Very high or highest quality MC, with demonstrated expertise, talent, and/or skill.“
- QUOTE: “Very high level of expertise, authoritativeness, and trustworthiness (page and website) on the topic of the page.”
- QUOTE: “Very good reputation (website or author) on the topic of the page.”
At least for competitive niches where Google intends to police this quality recommendation, Google wants to reward high-quality pages, and “the Highest rating may be justified for pages with a satisfying or comprehensive amount of very high-quality” main content.
If your main content is very poor, with “grammar, spelling, capitalization, and punctuation errors“, or not helpful or trustworthy – ANYTHING that can be interpreted as a bad user experience – you can expect to get a low rating.
QUOTE: “We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low-quality (main content) do not achieve their purpose well.”
Note – not ALL thin pages are low-quality.
If you can satisfy the user with a page thin on content – you are ok (but probably susceptible to someone building a better page than yours in the near-future).
Google expects more from big brands than it does from your store (but that does not mean you shouldn’t be aiming to meet ALL of these high-quality guidelines above).
If you violate Google Webmaster recommendations for performance in their indexes of the web – you automatically get a low-quality rating.
Poor page design and poor main content and too many ads = you are toast.
If a rater is subject to a sneaky redirect – they are instructed to rate your site low.
Pages Can Be Rated ‘Medium Quality’
Quality raters will give content a Medium rating when the author or entity responsible for it is unknown.
If you have multiple editors contributing to your site, you had better have a HIGH EDITORIAL STANDARD.
One could take from all this that Google Quality raters are out to get you if you manage to get past the algorithms, but equally, Google quality raters could be friends you just haven’t met yet.
Somebody must be getting rated highly, right?
Impress a Google Quality rater and get a high rating.
If you are a spammer you’ll be pulling out the stops to fake this, naturally, but this is a chance for real businesses to put their best foot forward and HELP quality raters correctly judge the size and relative quality of your business and website.
Real reputation is hard to fake – so if you have it – make sure it’s on your website and is EASY to access from contact and about pages.
The quality raters handbook is a good training guide for looking for links to disavow, too.
It’s pretty clear.
Google organic listings are reserved for ‘remarkable’ and ‘reputable’ content, expertise and trusted businesses.
A high bar to meet – and one that is designed for you to never quite meet unless you are serious about competing, as there is so much work involved.
I think the inferred message is to call your Adwords rep if you are an unremarkable business.
Are Poor Spelling and Bad Grammar Google Ranking Factors?
Is Grammar A Ranking Factor?
NO – this is evidently NOT a ranking signal. I’ve been blogging for nine years and most complaints I’ve had in that time have been about my poor grammar and spelling in my posts.
My spelling and grammar may be atrocious but these shortcomings haven’t stopped me ranking lots of pages over the years.
Google has historically looked for ‘exact match’ instances of keyword phrases in documents, and SEOs have, historically, been able to optimise successfully for these keyword phrases – whether they are grammatically correct or not.
So how could bad grammar carry negative weight in Google’s algorithms?
That being said, I do have Grammarly, a spelling and grammar checking plugin installed on my browser to help me catch the obvious mistakes.
Advice From Google
John Mueller from Google said in a recent hangout that it was ‘not really’ but that it was ‘possible‘ but very ‘niche‘ if at all, that grammar was a positive ranking factor. Bear in mind – most of Google’s algorithms (we think) demote or de-rank content once it is analysed – not necessarily promote it – not unless users prefer it.
Another video I found, from a few years ago, features a Google spokesperson talking about inadequate grammar as a ranking factor or page quality signal.
In this video, we are told, by Google, that grammar is NOT a ranking factor.
Not, at least, one of the 200+ quality signals Google uses to rank pages.
And that rings true, I think.
Google’s Matt Cutts did say though:
QUOTE: “It would be fair to use it as a signal…The more reputable pages do tend to have better grammar and better spelling. ” Matt Cutts, Google
Google Panda & Content Quality
Google is on record as saying (metaphorically speaking) their algorithms are looking for signals of low quality when it comes to rating pages on Content Quality.
Some possible examples could include:
QUOTE: “1. Does this article have spelling, stylistic, or factual errors?”
QUOTE: “2. Was the article edited well, or does it appear sloppy or hastily produced?”
QUOTE: “3. Are the pages produced with great care and attention to detail vs. less attention to detail?”
QUOTE: “4. Would you expect to see this article in a printed magazine, encyclopedia or book?”
Altogether – Google is rating content on overall user experience as it defines and rates it, and bad grammar and spelling equal a poor user experience.
At least on some occasions.
Google aims to ensure organic search engine marketing is a significant investment in time and budget for businesses. Critics will say this is to make Adwords a more attractive proposition.
Google aims to reward quality signals that:
- take time to build, and
- that the vast majority of sites will not, or cannot, meet without a lot of investment.
NO website in a competitive market gets valuable traffic from Google in 2018 without a lot of work. Technical work and content curation.
It’s an interesting aside.
Fixing the grammar and spelling on a page can be a time-consuming process.
It’s clearly a machine-detectable – although potentially noisy – signal, and Google IS banging on about Primary MAIN Content Quality and User Experience.
Grammar as a ranking factor could be one for the future – but at the moment, I doubt grammar is taken much into account (on an algorithmic level, at least, although users might not like your grammar, and that could have a second-order impact if it causes high abandonment rates, for instance).
Is Spelling A Google Ranking Factor?
Poor spelling has always had the potential to be a NEGATIVE ranking factor in Google. IF the word that is incorrect on the page is unique on the page and of critical importance to the search query.
Although – back in the day – if you wanted to rank for misspellings, you optimised for them – so poor spelling could be a POSITIVE ranking factor, looking back not that long ago.
Now, that kind of optimisation effort is fruitless, with changes to how Google presents these results in 2018.
Google will favour “Showing results for” results over presenting SERPs based on a common spelling error.
Testing to see if ‘bad spelling’ is a ranking factor is still easy on a granular level; bad grammar is not so easy to test.
I think Google has better signals to play with than ranking pages on spelling and grammar. It’s not likely to penalise you for the honest mistakes most pages exhibit, especially if you have met more important quality signals – like useful main content.
And I’ve seen clear evidence of pages ranking very well with both bad spelling and bad grammar. My own!
I still have Grammarly installed, though.
What Is Keyword Stuffing?
Keyword stuffing is simply the process of repeating the same keyword or key phrases over and over on a page. It’s counterproductive. It is a signpost of a very low-quality spam site and is something Google clearly recommends you avoid.
QUOTE: “the practice of loading a webpage with keywords in an attempt to manipulate a site’s ranking in Google’s search results“. Google
Keyword stuffing text makes your copy often unreadable and so, a bad user experience. It often gets a page booted out of Google but it depends on the intent and the trust and authority of a site. It’s sloppy SEO in 2018.
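As a rough self-editing aid (an assumption-laden sketch with an invented `top_term_density` helper – Google’s actual detection is certainly far more sophisticated), you can measure whether any single term dominates your copy:

```python
from collections import Counter
import re

def top_term_density(text):
    """Return the most repeated word on a page and its share of all words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return ("", 0.0)
    term, count = Counter(words).most_common(1)[0]
    return term, count / len(words)

stuffed = "cheap shoes cheap shoes buy cheap shoes best cheap shoes cheap"
term, density = top_term_density(stuffed)
# 'cheap' makes up roughly 45% of this copy - an obvious red flag to a human editor.
assert term == "cheap" and density > 0.3
```

There is no magic density threshold, and this is not a ranking model – the point is only that copy where one word crowds out everything else reads as unnatural to users, which is exactly what the raters’ guidelines describe.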
It is not a tactic you want to employ in search of long-term rankings.
Just because someone else is successfully doing it, do not automatically assume you will get away with it.
Don’t do it – there are better ways of ranking in Google without resorting to it.
I used to rank high for ‘what is keyword stuffing‘ years ago, albeit with a keyword stuffed page that I created to illustrate you could, at one time, blatantly ignore Google’s rules and, in fact, use the rules as a blueprint to spam Google and get results.
That type of example page is just asking for trouble in 2018.
At the time of writing I rank for the same term using a more helpful article:
I don’t keyword stuff in 2018 – Google is a little less forgiving than it used to be.
Google has a lot of manual raters who rate the quality of your pages, and keyword stuffed main content features prominently in the 2017 Quality Raters Guidelines:
7.4.2 “Keyword Stuffed” Main Content
QUOTE: ‘Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.’ Search Quality Raters Guidelines March 14, 2017
Google is policing their SERPs.
Put simply Google’s views on ‘site quality’ and ‘user satisfaction’ do NOT automatically correlate to you getting more traffic.
This endeavour is supposed to be a benchmark – a baseline to start from (when it comes to keywords with financial value).
Everybody, in time, is supposed to hit this baseline to expect to have a chance to rank – and in the short to medium term, this is where the opportunity lies for those who take it.
If you don’t do it, someone else will, and Google will rank them, in time, above you.
Google has many human quality raters rating your offering, as well as algorithms targeting old style SEO techniques and engineers specifically looking for sites that do not meet technical guidelines.
Does Google Promote A Site Or Demote Others In Rankings?
In the video above you hear from at least one spam fighter who confirms that at least some people are employed at Google to demote sites that fail to meet policy:
QUOTE: “I didn’t SEO at all, when I was at Google. I wasn’t trying to make a site much better but i was trying to find sites that were not ‘implementing Google policies'(?*) and not giving the best user experience.” (I can’t quite make out what he says there)
Link algorithms seem particularly aggressive, too, and more ‘delayed’ than ever (for normal businesses built on old-school links), so businesses are getting it from multiple angles as Google rates the quality of its primary index (which may well sit on top of the old stuff and be heavily influenced by ‘quality’ signals on your site).
Google Introduced ‘an explicit quality metric which got directly at the issue of quality‘
QUOTE: “Another problem we were having was an issue with quality and this was particularly bad (we think of it as around 2008 2009 to 2011) we were getting lots of complaints about low-quality content and they were right. We were seeing the same low-quality thing but our relevance metrics kept going up and that’s because the low-quality pages can be very relevant this is basically the definition of a content form in our in our vision of the world so we thought we were doing great our numbers were saying we were doing great and we were delivering a terrible user experience and turned out we weren’t measuring what we needed to so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality it’s not the same as relevance …. and it enabled us to develop quality related signals separate from relevant signals and really improve them independently so when the metrics missed something what ranking engineers need to do is fix the rating guidelines… or develop new metrics. SMX West 2016 – How Google Works: A Google Ranking Engineer’s Story – Hobo Google Ranking Signals 2017)
Google is on record as saying they had a quality problem that they started fixing with Panda.
That means that many of the sites that were getting a lot of traffic from Google in 2011 were not going to be rated ‘high quality’ as the new quality rating was designed to demote these sites.
In the following video, a Google spokesperson even goes into what they call a ‘quality metric’ signals separated from relevance signals:
These sites used common SEO practices to rank, and Google made those practices toxic.
The fun started when these algorithms started to roll out more significantly to the real business world, and continue today with the ever-present (often erroneously confirmed) Google ‘quality’ updates we now constantly hear about in the media.
One thing is for sure, you need to get low-quality content OFF your site and get some UNIQUE content on it to maximise the benefit a few months down the line from *any* other marketing activity (like link building).
From my testing, the performance of pages degrades over time but can come back with regular updates and a sensible refresh of content.
Retargeting keywords can also have a positive impact over time. One marked observation among my top-performing pages on my test site is that pages that are updated more often (and improved substantially) tend to get more traffic than pages that are not.
A sensible strategy in 2018 is to:
- Get rid of all substandard pages on the site
- Meet Google’s technical recommendations
- Address the Quality Rater Guidelines’ high-priority issues for low-to-medium ratings
- Go in-depth on your topic in certain areas
- Consolidate content and link equity where prudent with redirects (not canonicals – use these only as a patch for a few months)
- Start producing some sort of actual unique content
- Market that new content more appropriately going forward than in the past
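The first step above (finding substandard pages to remove or improve) can be automated to a degree. Below is a minimal, hypothetical Python sketch that flags thin pages by word count. The `flag_thin_pages` helper and the 300-word threshold are my own assumptions for illustration – Google publishes no word-count rule, and short pages that meet a specific user intent can still rank:

```python
import re

# Hypothetical threshold for triage only - NOT a Google rule. Thin pages
# can rank if they meet a specific user intent; this just surfaces
# candidates for a human review during an audit.
THIN_PAGE_THRESHOLD = 300

def flag_thin_pages(pages):
    """pages: dict mapping URL -> extracted main-content text.

    Returns a list of (url, word_count) tuples for pages under the
    threshold, thinnest first, so they can be improved or consolidated.
    """
    flagged = []
    for url, text in pages.items():
        word_count = len(re.findall(r"\w+", text))
        if word_count < THIN_PAGE_THRESHOLD:
            flagged.append((url, word_count))
    return sorted(flagged, key=lambda pair: pair[1])

# Example with two stub pages, one obviously thin:
pages = {
    "/guide": "word " * 500,
    "/stub": "Just a few words here.",
}
print(flag_thin_pages(pages))  # only the stub page is flagged
```

The output is a shortlist for manual review, not a deletion list – remember the advice at the top of this article is to improve content rather than delete it.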
Most of this stuff I cover in my SEO audits.
High-quality content is only one aspect of a high-quality site.
Google Is Not Going To Rank Low-Quality Pages When It Has Better Options
QUOTE: “Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly.” Google
If you have exact match instances of key phrases on low-quality pages, those pages mostly won’t have all the compound ingredients it takes to rank high in Google in 2018.
I was working to this principle long before I understood it well enough to write anything about it.
Here is an example of taking a standard page that did not rank for years and then turning it into a topic-oriented resource page designed around a user’s intent and purpose:
Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.
Google Demotion Algorithms Target Low-Quality Content
Optimising (without improving) low-quality content springs traps set by ever-improving core quality algorithms.
What this means is that ‘optimising’ low-quality pages is very much swimming upstream in 2018.
Optimising low-quality pages without value-add is self-defeating, now that the algorithms – and manual quality rating efforts – have got that stuff nailed down.
If you optimise low-quality pages using old school SEO techniques, you will be hit with a low-quality algorithm (like the Quality Update or Google Panda).
You must avoid boilerplate text, spun text or duplicate content when creating pages – or you are Panda Bamboo – as Google hinted at in the 2015 Quality Rater’s Guide.
QUOTE: “6.1 Low Quality Main Content One of the most important considerations in PQ rating is the quality of the MC. The quality of the MC is determined by how much time, effort, expertise, and talent/skill have gone into the creation of the page. Consider this example: Most students have to write papers for high school or college. Many students take shortcuts to save time and effort by doing one or more of the following:
- Buying papers online or getting someone else to write for them
- Making things up.
- Writing quickly with no drafts or editing.
- Filling the report with large pictures or other distracting content.
- Copying the entire report from an encyclopedia, or paraphrasing content by changing words or sentence structure here and there.
- Using commonly known facts, for example, “Argentina is a country. People live in Argentina. Argentina has borders. Some people like Argentina.”
- Using a lot of words to communicate only basic ideas or facts, for example, “Pandas eat bamboo. Pandas eat a lot of bamboo. It’s the best food for a Panda bear.”
Unfortunately, the content of some webpages is similarly created. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well. Important: Low quality MC is a sufficient reason to give a page a Low quality rating.”
Google rewards uniqueness or punishes the lack of it.
The number 1 way to do ‘SEO copywriting’ in 2018 is to edit the actual page copy to continually add unique content and improve its accuracy, uniqueness, relevance, succinctness and usefulness.
Low-Quality Content Is Not Meant To Rank High in Google.
A Google spokesman said not that long ago that Google Panda was about preventing types of sites that shouldn’t rank for particular keywords from ranking for them.
QUOTE: “(Google Panda) measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages. So essentially, if you want a blunt answer, it will not devalue, it will actually demote. Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.” Gary Illyes – Search Engine Land
When Google demotes your page for duplicate content practices, and there’s nothing left in the way of unique content to rank you for, your web pages will mostly be ignored by Google.
The way I look at it – once Google strips away all the stuff it ignores (duplicate text) – what’s left? In effect, that’s what you can expect Google to reward you for. If what is left is boilerplate synonymised text content – that’s now being classed as web spam – or ‘spinning’.
NOTE – The ratio of duplicate content on any page is going to hurt you if you have more duplicate text than unique content. A simple check of the pages, page to page, on the site is all that’s needed to ensure each page is DIFFERENT (regarding text) page-to-page.
If you have large sections of duplicate text page-to-page – that is a problem that should be targeted and removed.
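That page-to-page check can be scripted. Here is a minimal sketch using Python’s standard-library `difflib` to compare every pair of pages and flag those whose text is suspiciously similar. The `find_duplicate_pairs` name and the 0.6 similarity threshold are assumptions for illustration only – tune the threshold against your own content:

```python
import difflib
from itertools import combinations

# Illustrative threshold - not a figure Google publishes. Pairs scoring
# above it share a large proportion of their text and warrant a manual
# look for boilerplate or near-duplicate copy.
SIMILARITY_THRESHOLD = 0.6

def find_duplicate_pairs(pages):
    """pages: dict mapping URL -> page text.

    Compares every pair of pages with difflib's SequenceMatcher and
    returns (url_a, url_b, ratio) tuples for pairs over the threshold.
    """
    flagged = []
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        ratio = difflib.SequenceMatcher(None, text_a, text_b).ratio()
        if ratio > SIMILARITY_THRESHOLD:
            flagged.append((url_a, url_b, round(ratio, 2)))
    return flagged
```

Note this is O(n²) in the number of pages, so on a large site you would run it per template or per section rather than across everything at once.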
It is important to note:
- The main text content on the page must be unique to avoid Google’s page quality algorithms.
- Verbose text must NOT be created or spun automatically
- Text should NOT be optimised to a template as this just creates a footprint across many pages that can be interpreted as redundant or manipulative boilerplate text.
- Text should be HIGHLY descriptive, unique and concise
- If you have a lot of pages to address, the main priority is to create a UNIQUE couple of paragraphs of text, at least, for the MC (Main Content). Pages do not need thousands of words to rank. They just need to MEET A SPECIFIC USER INTENT and not TRIP ‘LOW-QUALITY’ FILTERS. A page with just a few sentences of unique text still meets this requirement (150-300 words) – for now.
- When it comes to out-competing competitor pages, you are going to have to look at what the top competing page is doing when it comes to main content text. Chances are – they have some unique text on the page. If they rank with duplicate text, either their SUPPLEMENTARY CONTENT is better, or the competitor domain has more RANKING ABILITY because of either GOOD BACKLINKS or BETTER USER EXPERIENCE.
- Updating content on a site should be a priority as Google rewards fresher content for certain searches.
Google Rates ‘Copied’ Main Content ‘Lowest’
This is where you are swimming upstream in 2018. Copied content is not going to be a long-term strategy for creating a unique page better than your competitors’ pages.
In the latest Google Search Quality Evaluator Guidelines that were published on 14 March 2017, Google states:
7.4.5 Copied Main Content
Every page needs MC. One way to create MC with no time, effort, or expertise is to copy it from another source. Important: We do not consider legitimately licensed or syndicated content to be “copied” (see here for more on web syndication). Examples of syndicated content in the U.S. include news articles by AP or Reuters.
The word “copied” refers to the practice of “scraping” content, or copying content from other nonaffiliated websites without adding any original content or value to users (see here for more information on copied or scraped content).
If all or most of the MC on the page is copied, think about the purpose of the page. Why does the page exist? What value does the page have for users? Why should users look at the page with copied content instead of the original source? Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
7.4.6 More About Copied Content
All of the following are considered copied content:
● Content copied exactly from an identifiable source. Sometimes an entire page is copied, and sometimes just parts of the page are copied. Sometimes multiple pages are copied and then pasted together into a single page. Text that has been copied exactly is usually the easiest type of copied content to identify.
● Content which is copied, but changed slightly from the original. This type of copying makes it difficult to find the exact matching original source. Sometimes just a few words are changed, or whole sentences are changed, or a “find and replace” modification is made, where one word is replaced with another throughout the text. These types of changes are deliberately done to make it difficult to find the original source of the content. We call this kind of content “copied with minimal alteration.”
● Content copied from a changing source, such as a search results page or news feed. You often will not be able to find an exact matching original source if it is a copy of “dynamic” content (content which changes frequently). However, we will still consider this to be copied content. Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
How To Do “SEO Copywriting“
Good Content Will Still Need ‘Optimised’
The issue is that original “compelling content” – so easy to create, isn’t it(!) – on a site with no links, no audience and no online business authority is, to Google, as useful as boring, useless content – and will be treated as such by Google – except perhaps for long-tail terms (if even).
It usually won’t be found by many people and won’t be READ and won’t be ACTED upon – not without a few good links pointing to the site – NOT if there is any competition for the term.
Generalisations make for excellent link bait and while good, rich content is very important, sayings like ‘content is everything’ is not telling you the whole story.
The fact is – every single site is different, sits in a niche with a different level of competition for every keyword or traffic stream, and needs a strategy to tackle this.
There’s no one size fits all magic button to press to get traffic to a site. Some folk have a lot of domain authority to work with, some know the right people, or have access to an audience already – indeed, all they might need is a copywriter – or indeed, some inspiration for a blog post.
They, however, are in the minority of sites.
Most of the clients I work with have nothing to start with and are in a relatively ‘boring’ niche few reputable blogs write about.
In one respect, Google doesn’t even CARE what content you have on your site (although it’s better these days at hiding this).
Humans do care, of course, so at some point, you will need to produce that content on your pages.
You Can ALWAYS Optimise Content To Perform Better In Google
An SEO can always get more out of content in organic search than any copywriter, but there’s not much more powerful than a copywriter who can lightly optimise a page around a topic, or an expert in a topic that knows how to – continually, over time – optimise a page for high rankings in Google.
If I wanted to rank for “How To Write For Google“? – for instance – in the old days you used to put the key phrase in the normal elements like the Page Title Element and ALT text and then keyword stuffed your text to make sure you repeated “How To Write For Google” enough times in a block of low-quality text.
Using variants and synonyms of this phrase helped to add to the ‘uniqueness’ of the page, of course.
Throwing in any old text would beef the word count up.
Now, in 2018, if I want to rank high in Google for that kind of term – I would still rely on old SEO best practices like a very focused page title – but now the text should explore a topic in a much more informative way.
Writing for Google and meeting the query intent means an SEO copywriter would need to make sure page text included ENTITIES AND CONCEPTS related to the MAIN TOPIC of the page you are writing about and the key phrase you are talking about.
If I wanted a page to rank for this term, I would probably need to explore concepts like Google Hummingbird, Query Substitution, Query Reformation and Semantic Search, i.e. I need to explore the topic or concept more fully – and, as time goes on, more succinctly – than competing pages do.
If you want to rank for a SPECIFIC search term – you can still do it using the same old, well-practised keyword targeting practices. The main page content itself just needs to be high-quality enough to satisfy Google’s quality algorithms in the first place.
This is still a land grab.
Should You Rewrite Product Descriptions To Make The Text Unique?
Whatever you do, beware ‘spinning the text’ – Google might have an algorithm or two focused on that sort of thing:
Tip: Beware Keyword Stuffing E-commerce Website Category Pages To Rank For Various Other Keywords in Google
Google’s John Mueller just helped someone out in this week’s Google Webmaster Hangout, and his answer was very interesting:
QUOTE: “The site was ranking the first page for the keyword (widget) and(widgets) in Australia since two weeks we moved all the way down to page five. Technical changes haven’t been made to the site the only modification was we added more category landing text to rank for various other (keywords)“
QUOTE: “the modification that you mentioned (above) that you put more category landing text on the page that might also be something that’s playing a role there. What I see a lot with e-commerce sites is that they take a category page that’s actually pretty good and they stuff a whole bunch of text on the bottom and that’s essentially just kind of roughly related to that content which is essentially like bigger than the Wikipedia page on that topic and from our point of view when we look at things like that our algorithms kind of quickly kind of back off and say whoa it looks like someone is just trying to use keyword stuffing to include a bunch of kind of unrelated content into the same page and then our algorithms might be a bit more critical and kind of like be cautious with regards to the content that we find on this page so that’s one thing to kind of watch out for.
I think it’s good to / help provide more context to things that you have on your website but kind of be reasonable and think about what users would actually use and focus on that kind of content so for example if if the bottom of these pages is just a collection of keywords and a collection of sentences where those keywords are artificially used then probably users aren’t going to scroll to the bottom and read all of that tiny text and actually use that content in a useful way and then probably search engines are also going to back off and say well this page is is doing some crazy stuff here we don’t really know how much we can trust the content on the page.”
If you are keyword stuffing e-commerce category pages, watch out. Google tells us these things for a reason. Adding optimised text to e-commerce category pages ‘just for the sake of it’ is probably going to work against you (and might be working against you today).
Keyword stuffing has been against the rules for a long time.
John previously stated back during 2016:
QUOTE: “if we see that things like keyword stuffing are happening on a page, then we’ll try to ignore that, and just focus on the rest of the page”.
Google has algorithms AND human reviewers looking out for it when the maths miss it:
7.4.2 “Keyword Stuffed” Main Content
QUOTE: ‘Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.’ Search Quality Raters Guidelines March 14, 2017
While there is obviously a balance to be had in this area, Google classes keyword stuffing as adding ‘irrelevant keywords‘ to your site. There are warnings also about this age-old SEO technique in the general webmaster guidelines:
General Guidelines: Irrelevant Keywords
QUOTE: “Keyword stuffing” refers to the practice of loading a webpage with keywords or numbers in an attempt to manipulate a site’s ranking in Google search results. Often these keywords appear in a list or group, or out of context (not as natural prose). Filling pages with keywords or numbers results in a negative user experience, and can harm your site’s ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.
Examples of keyword stuffing include:
- Lists of phone numbers without substantial added value
- Blocks of text listing cities and states a webpage is trying to rank for
- Repeating the same words or phrases so often that it sounds unnatural, for example:We sell custom cigar humidors. Our custom cigar humidors are handmade. If you’re thinking of buying a custom cigar humidor, please contact our custom cigar humidor specialists at email@example.com.
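As a rough self-check for the kind of unnatural repetition in that last example, you can count how often short phrases repeat across a page. The sketch below is illustrative only – the `repeated_phrases` helper and the 3% density threshold are my assumptions; Google publishes no density figure, and natural repetition of a product name is not in itself a problem:

```python
import re
from collections import Counter

# Assumed threshold for demonstration - any phrase making up more than
# 3% of a page's 3-word phrases AND appearing more than once is flagged
# for a human read-through. This is a smell test, not a Google metric.
DENSITY_THRESHOLD = 0.03

def repeated_phrases(text, n=3):
    """Return {phrase: density} for n-word phrases repeated unnaturally often."""
    words = re.findall(r"[a-z']+", text.lower())
    ngrams = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    total = max(len(ngrams), 1)
    counts = Counter(ngrams)
    return {
        phrase: count / total
        for phrase, count in counts.items()
        if count > 1 and count / total > DENSITY_THRESHOLD
    }

# The stuffed widget copy trips the check; varied prose does not:
print(repeated_phrases("buy blue widgets buy blue widgets buy blue widgets today"))
```

Anything this flags still needs a human judgement call – the question to ask, as John Mueller puts it below, is whether a user scrolling the page would actually read and use that text.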
Are You Making Or Optimising Doorway Pages?
Google algorithms consistently target sites with doorway pages in quality algorithm updates. The definition of a “doorway page” can change over time.
For example in the images below (from 2011), all pages on the site seemed to be hit with a -50+ ranking penalty for every keyword phrase the website ranked for.
At first, Google rankings for commercial keyword phrases collapsed, which led to somewhat of a “traffic apocalypse“:
The webmaster then received an email from Google via Google Webmaster Tools (now called Google Search Console):
QUOTE: “Google Webmaster Tools notice of detected doorway pages on xxxxxxxx – Dear site owner or webmaster of xxxxxxxx, We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines. Specifically, your site may have what we consider to be doorway pages – groups of “cookie cutter” or low-quality pages. Such pages are often of low value to users and are often optimized for single words or phrases in order to channel users to a single location. We believe that doorway pages typically create a frustrating user experience, and we encourage you to correct or remove any pages that violate our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.” Google Search Quality Team
At the time, I didn’t immediately class the pages on the affected sites in question as doorway pages. It’s evident Google’s definition of a doorway page changes over time.
A lot of people do not realise they are building what Google classes as doorway pages, and it was indicative to me that what you intend to do with the traffic Google sends you may, in itself, be a ranking factor not too often talked about.
What Does Google Classify As Doorway Pages?
Google classes many types of pages as doorway pages. Doorway pages can be thought of as lots of pages on a website designed to rank for very specific keywords using minimal original text content; location pages, for example, often end up looking like doorway pages.
It’s actually a very interesting aspect of modern SEO and one that is constantly shifting.
In the recent past, location-based SERPs were often lower-quality, and so Google historically ranked location-based doorway pages in many instances.
There is some confusion for real businesses who THINK they SHOULD rank for specific locations where they are not geographically based and end up using doorway-type pages to rank for these locations.
What Google Says About Doorway Pages
Google said a few years ago:
QUOTE: “For example, searchers might get a list of results that all go to the same site. So if a user clicks on one result, doesn’t like it, and then tries the next result in the search results page and is taken to that same site that they didn’t like, that’s a really frustrating experience.” Google
A question about using content spread across multiple pages and targeting different geographic locations on the same site was asked in a recent Hangout with Google’s John Mueller:
QUOTE: “We are a health services comparison website…… so you can imagine that for the majority of those pages the content that will be presented in terms of the clinics that will be listed looking fairly similar right and the same I think holds true if you look at it from the location …… we’re conscious that this causes some kind of content duplication so the question is is this type … to worry about? “
Bearing in mind that (while it is not the optimal use of pages) Google does not ‘penalise’ a website for duplicating content across internal pages in a non-malicious way, John’s clarification of location-based pages on a site targeting different regions is worth noting:
QUOTE: “For the mostpart it should be fine I think the the tricky part that you need to be careful about is more around doorway pages in the sense that if all of these pages end up with the same business then that can look a lot like a doorway page but like just focusing on the content duplication part that’s something that for the most part is fine what will happen there is will index all of these pages separately because from from a kind of holistic point of view these pages are unique they have unique content on them they might have like chunks of text on them which are duplicated but on their own these pages are unique so we’ll index them separately and in the search results when someone is searching for something generic and we don’t know which of these pages are the best ones we’ll pick one of these pages and show that to the user and filter out the other variations of that that page so for example if someone in Ireland is just looking for dental bridges and you have a bunch of different pages for different kind of clinics that offer the service and probably will pick one of those pages and show those in the search results and filter out the other ones.
But essentially the idea there is that this is a good representative of the the content from your website and that’s all that we would show to users on the other hand if someone is specifically looking for let’s say dental bridges in Dublin then we’d be able to show the appropriate clinic that you have on your website that matches that a little bit better so we’d know dental bridges is something that you have a lot on your website and Dublin is something that’s unique to this specific page so we’d be able to pull that out and to show that to the user like that so from a pure content duplication point of view that’s not really something I totally worry about.
I think it makes sense to have unique content as much as possible on these pages but it’s not not going to like sync the whole website if you don’t do that we don’t penalize a website for having this kind of deep duplicate content and kind of going back to the first thing though with regards to doorway pages that is something I definitely look into to make sure that you’re not running into that so in particular if this is like all going to the same clinic and you’re creating all of these different landing pages that are essentially just funneling everyone to the same clinic then that could be seen as a doorway page or a set of doorway pages on our side and it could happen that the web spam team looks at that and says this is this is not okay you’re just trying to rank for all of these different variations of the keywords and the pages themselves are essentially all the same and they might go there and say we need to take a manual action and remove all these pages from search so that’s kind of one thing to watch out for in the sense that if they are all going to the same clinic then probably it makes sense to create some kind of a summary page instead whereas if these are going to two different businesses then of course that’s kind of a different situation it’s not it’s not a doorway page situation.”
The takeaway here is that if you have LOTS of location pages serving ONE SINGLE business in one location, then those are very probably classed as some sort of doorway pages, and probably old-school SEO techniques for these type of pages will see them classed as lower-quality – or even – spammy pages.
Google has long warned webmasters about using Doorway pages but many sites still employ them, because, either:
- their business model depends on it for lead generation
- the alternative is either a lot of work or
- they are not creative enough or
- they are not experienced enough to avoid the pitfalls of having lower-quality doorway pages on a site or
- they are experienced enough to understand what impact they might be having on a site quality score
Google has a doorway page algorithm which no doubt they constantly improve upon. Google warned:
QUOTE: “Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.” Google 2015
If you have location pages that serve multiple locations or businesses, then those are not doorway pages and should be improved uniquely to rank better, according to John’s advice.
Are You Making Doorway Pages?
Search Engine Land offered search engine optimisers this clarification from Google:
QUOTE: “How do you know if your web pages are classified as a “doorway page?” Google said to ask yourself these questions:
- Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site’s user experience?
- Are the pages intended to rank on generic terms yet the content presented on the page is very specific?
- Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
- Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
- Do these pages exist as an “island?” Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?”
Barry Schwartz at Seroundtable picked this up:
QUOTE: “Well, a doorway page would be if you have a large collection of pages where you’re just like tweaking the keywords on those pages for that.
I think if you focus on like a clear purpose for the page that’s outside of just I want to rank for this specific variation of the keyword then that’s that’s usually something that leads to a reasonable result.
Whereas if you’re just taking a list of keywords and saying I need to make pages for each of these keywords and each of the permutations that might be for like two or three of those keywords then that’s just creating pages for the sake of keywords which is essentially what we look at as a doorway.”
Note I underlined the following statement:
QUOTE: “focus on like a clear purpose for the page that’s outside of just I want to rank for this specific variation of the keyword.”
That is because often – in fact, very often – there is an alternative to doorway pages for location pages that achieves essentially the same thing for webmasters.
Naturally, business owners want to rank for lots of keywords in organic listings with their website. The challenge for webmasters and SEOs is that Google doesn’t want business owners to rank for lots of keywords using autogenerated content, especially when that produces A LOT of pages on a website using (for instance) a list of keyword variations page-to-page.
QUOTE: “7.4.3 Automatically Generated Main Content Entire websites may be created by designing a basic template from which hundreds or thousands of pages are created, sometimes using content from freely available sources (such as an RSS feed or API). These pages are created with no or very little time, effort, or expertise, and also have no editing or manual curation. Pages and websites made up of autogenerated content with no editing or manual curation, and no original content or value added for users, should be rated Lowest.” Google Search Quality Evaluator Guidelines 2017
The end result is that webmasters create doorway pages without even properly understanding what they represent to Google, and without realising Google will not index all these autogenerated pages.
WIKIPEDIA says of doorway pages:
QUOTE: “Doorway pages are web pages that are created for spamdexing. This is for spamming the index of a search engine by inserting results for particular phrases with the purpose of sending visitors to a different page.”
“Spamdexing, which is a word derived from “spam” and “indexing,” refers to the practice of search engine spamming. It is a form of SEO spamming.”
“Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.
Here are some examples of doorways:
- Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
- Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
- Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy”
“Doorways are sites or pages created to rank highly for specific search queries”
Take note: It is not just location pages that are classed as doorway pages:
QUOTE: “For Google, that’s probably overdoing it and ends up in a situation you basically create a doorway site …. with pages of low value…. that target one specific query.” John Mueller 2018
If your website is made up of lower-quality doorway-type pages using old SEO techniques (which are more and more often labelled as spam in 2018) then Google will not index all of the pages, and your website ‘quality score’ is probably going to be negatively impacted.
If you are making keyword-rich location pages for a single business website, there’s a risk these pages will be classed as doorway pages in 2018.
If you know you have VERY low-quality doorway pages on your site, you should remove them or rethink your SEO strategy if you want to rank high in Google for the long term.
Location-based pages are suitable for some kinds of websites, and not others.
What Makes A Page Spam?
Signals that can mark a page as spam include:
- Keyword stuffing – no percentage or keyword density given; this is up to the rater
- PPC ads that only serve to make money, not help users
- Copied/scraped content and PPC ads
- Feeds with PPC ads
- Doorway pages – multiple landing pages that all direct users to the same destination
- Templates and other computer-generated pages mass-produced, marked by copied content and/or slight keyword variations
- Copied message boards with no other page content
- Fake search pages with PPC ads
- Fake blogs with PPC ads, identified by copied/scraped or nonsensical spun content
- Thin affiliate sites that only exist to make money, identified by checkout on a different domain, image properties showing origination at another URL, lack of original content, different WhoIs registrants of the two domains in question
- Pure PPC pages with little to no content
- Parked domains
There’s more on this announcement at SEW.
If A Page Exists Only To Make Money, The Page Is Spam, to Google
QUOTE: “If A Page Exists Only To Make Money, The Page Is Spam” GOOGLE
That statement above, from the original quality rater guidelines, is a standout and should be a heads-up to any webmaster out there who thinks they are going to make a “fast buck” from Google organic listings in 2018.
It should, at least, make you think about the types of pages you are going to spend your valuable time making.
Without VALUE ADD for Google’s users – don’t expect to rank high for commercial keywords.
If you are making a page today with the sole purpose of making money from it – and especially with free traffic from Google – you obviously didn’t get the memo.
Consider this statement from a manual reviewer:
QUOTE: “…when they DO get to the top, they have to be reviewed with a human eye in order to make sure the site has quality.” potpiegirl
It’s worth remembering:
- If A Page Exists Only To Make Money, The Page Is Spam
- If A Site Exists Only To Make Money, The Site Is Spam
This is how what you make will be judged – whether it is fair or not.
IS IT ALL BAD NEWS?
Of course not. In some cases, it levels the playing field, especially if you are willing to:
- Differentiate yourself
- Be remarkable
- Be accessible
- Add unique content to your site
- Help users in an original way
Google doesn’t care about search engine optimizers or the vast majority of websites but the search engine giant DOES care about HELPING ITS OWN USERS.
So, if you are helping visitors that come from Google – and not by just directing them to another website – you are probably doing at least one thing right.
With this in mind – I am already building affiliate sites differently, for instance.
How To Improve Your Website Content
This is no longer about repeating keywords. ANYTHING you do to IMPROVE the page is going to be a potential SEO benefit. That could be:
- creating fresh content
- removing doorway-type pages
- cleaning up or removing thin-content on a site
- adding relevant keywords and key phrases to relevant pages
- constantly improving pages to keep them relevant
- fixing poor grammar and spelling mistakes
- adding synonyms and related key phrases to text
- reducing keyword stuffing
- reducing the ratio of duplicated text on your page to unique text
- removing old outdated links or out-of-date content
- rewording sentences to take out sales or marketing fluff and focusing more on the USER INTENT (e.g. give them the facts first including pros and cons – for instance – through reviews) and purpose of the page.
- merging many old stale pages into one, fresh page, which is updated periodically to keep it relevant
- being concise, while still maximising relevance and keyword coverage
- improving important keyword phrase prominence throughout your page copy (you can have too much, or too little, and it is going to take testing to find out what the optimal presentation will be)
- topic modelling
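One of the tasks above, reducing the ratio of duplicated text on a page to unique text, can be roughly estimated with a script before deciding which pages to rework. This is a minimal sketch, not any metric Google publishes; the naive full-stop sentence splitting and the sample pages are illustrative assumptions:

```python
# Estimate how much of a page's text is duplicated verbatim on other pages
# of the same site, sentence by sentence. A rough editorial aid only, not
# a replica of anything Google measures.

def duplicate_ratio(page_text: str, other_pages: list[str]) -> float:
    """Fraction of this page's sentences that appear verbatim elsewhere."""
    sentences = [s.strip() for s in page_text.split(".") if s.strip()]
    if not sentences:
        return 0.0
    seen_elsewhere = set()
    for other in other_pages:
        for s in sentences:
            if s in other:
                seen_elsewhere.add(s)
    return len(seen_elsewhere) / len(sentences)

page = "We sell widgets in Glasgow. Our widgets are hand made. Call us today."
others = [
    "We sell widgets in Edinburgh. Our widgets are hand made. Call us today.",
]
print(round(duplicate_ratio(page, others), 2))  # → 0.67
```

A page scoring high on a check like this is a candidate for the rewording, merging or removal tasks listed above.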
A great writer can get away with fluff but the rest of us probably should focus on being concise.
Low-quality fluff is easily discounted by Google these days – and can leave a toxic footprint on a website.
How To Get Featured Snippets on Google
QUOTE: “When a user asks a question in Google Search, we might show a search result in a special featured snippet block at the top of the search results page. This featured snippet block includes a summary of the answer, extracted from a webpage, plus a link to the page, the page title and URL” Google 2018
Any content strategy in 2018 should naturally be focused on creating high-quality content, and should also revolve around earning Google FEATURED SNIPPETS, which trigger when Google wants them to, and intermittently, depending on the nature of the query.
Regarding the above image, where a page on Hobo is promoted to number 1 – I used traditional competitor keyword research and old-school keyword analysis and keyword phrase selection, albeit focused on the opportunity in long-form content, to accomplish that, proving that you can still use this keyword research experience to rank a page in 2018.
Despite all the obfuscation, time delay, keyword rewriting, manual rating and selection bias Google goes through to match pages to keyword queries, you still need to optimise a page to rank in a niche, and if you do it sensibly, you unlock a wealth of long-tail traffic over time (a lot of which is as useless as it always was, but which RankBrain might clean up given time).
- Google is only going to produce more of these direct answers or answer boxes in future (they have been moving in this direction since 2005).
- Focusing on triggering these will focus your content creators on creating exactly the type of pages Google wants to rank. “HOW TO” guides and “WHAT IS” guides are IDEAL and the VERY BEST type of content for this exercise.
- Google is REALLY rewarding these articles in 2018 – and the search engine is VERY probably going to keep doing so for the future.
- Google Knowledge Graph offers another exciting opportunity – and indicates the next stage in organic search.
- Google is producing these ANSWER BOXES that can promote a page from anywhere on the front page of Google to number 1.
- All in-depth content strategy on your site should be focused on this new aspect of Google Optimisation. The bonus is you physically create content that Google is ranking very well in 2018 even without taking knowledge boxes into consideration.
- Basically – you are feeding Google EASY ANSWERS to scrape from your page. This all ties together very nicely with organic link building. The MORE ANSWER BOXES you UNLOCK – the more chance you have of ranking number one FOR MORE AND MORE TERMS – and as a result – more and more people see your utilitarian content and as a result – you get social shares and links if people care at all about it.
- You can share an Enhanced Snippet (or Google Answer Box, as they were first called by SEOs) with a competitor: sometimes you are featured and sometimes it is a competitor URL. All you can do in this case is to continue to improve the page until you squeeze your competitor out.
We already know that Google likes ‘tips’ and “how to” and expanded FAQ but this Knowledge Graph ANSWER BOX system provides a real opportunity and is CERTAINLY what any content strategy should be focused around to maximise exposure of your business in organic searches.
Unfortunately, this is a double-edged sword if you take a long-term view. Google is, after all, looking for easy answers so, eventually, it might not need to send visitors to your page.
To be fair, these Google Enhanced Snippets, at the moment, appear complete with a reference link to your page and can positively impact traffic to the page. SO – for the moment – it’s an opportunity to take advantage of.
How To Deal With Low-Quality Content On Your Site
Clean it up.
The first thing to do is always look to your site for content that is under-performing. Low-quality, out-dated or otherwise ‘stale’ content pages should be reworked, redirected or removed. This is normally the first thing we aim to do in any project we are employed in.
Often, content can be repackaged into a more compelling, user-focused topic based article that explores a topic and frequently asked questions about it. This topic page can be optimised with synonyms and related keywords, entities and concepts.
Ranking in Google is not about repeating keywords anymore – in fact – all efforts down that road will only lead to failure.
Ranking high in Google is about exploring a ‘CONCEPT’ on a web page in a way that HELPS INFORM users while still meeting Google’s keyword-based and entity based relevancy signals.
Specific Advice From Google on Pruning Content From Your Site
If you have a very low-quality site from a content point of view, just deleting the content (or noindexing it) is probably not going to have a massive positive impact on your rankings.
Ultimately the recommendation in 2018 is to focus on “improving content” as “you have the potential to go further down if you remove that content”.
QUOTE: “Ultimately, you just want to have a really great site people love. I know it sounds like a cliché, but almost [all of] what we are looking for is surely what users are looking for. A site with content that users love – let’s say they interact with content in some way – that will help you in ranking in general, not with Panda. Pruning is not a good idea because with Panda, I don’t think it will ever help mainly because you are very likely to get Panda penalized – Pandalized – because of low-quality content…content that’s actually ranking shouldn’t perhaps rank that well. Let’s say you figure out if you put 10,000 times the word “pony” on your page, you rank better for all queries. What Panda does is disregard the advantage you figure out, so you fall back where you started. I don’t think you are removing content from the site with potential to rank – you have the potential to go further down if you remove that content. I would spend resources on improving content, or, if you don’t have the means to save that content, just leave it there. Ultimately people want good sites. They don’t want empty pages and crappy content. Ultimately that’s your goal – it’s created for your users.” Gary Illyes, Google 2017
Specific Advice From Google On Low-Quality Content On Your Site
And remember the following, specific advice from Google on removing low-quality content from a domain:
QUOTE: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low-quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low-quality pages to a different domain could eventually help the rankings of your higher-quality content.” Google
They are not lying. Here is another example of taking multiple pages and making one better high-quality page in place of them:
Focus on Quality To Improve Conversion Rates
However you are trying to satisfy users, many think this is about terminating searches on your site (or via your site), or satisfying the ‘long click’.
However you do that, do it in an ethical manner (e.g. without breaking the back button on browsers); the main aim is to satisfy that user somehow.
You used to rank by being a virtual PageRank black hole. Now, you need to think about being a User black hole.
You want a user to click your result in Google, and not need to go back to Google to do the same search that ends with the user pogo-sticking to another result, apparently unsatisfied with your page.
The aim is to convert users into subscribers, returning visitors, sharing partners, paying customers or even just help them along on their way to learn something.
The success I have had in ranking pages and getting more traffic has largely revolved around optimising the technical framework of a site, crawl and indexing efficiency, removal of outdated content, content re-shaping, constant improvement of text content to meet its purpose better, internal links to relevant content, conversion optimisation and getting users to ‘stick around’ – or at least visit where I recommend they visit.
Mostly, I’ve focused on satisfying user intent, because Google isn’t going back on that.
Improve Your Content In A Way That Helps Your Page Achieve its ‘Purpose’ Better
The PURPOSE of A Page Is Extremely Important To Google Quality Raters.
Is the purpose of a page to “sell products or services”, “to entertain” or “to share information about a topic”?
MAKE THE PURPOSE OF YOUR PAGE SINGULAR and OBVIOUS to help quality raters and algorithms.
The name of the game in 2018 (if you’re not faking everything) is VISITOR SATISFACTION.
If a Google visitor lands on your page – have you satisfied WHY they are there?
How To Optimise Content For The “Long Click“
Content quality may influence Google rankings in more than one area, for instance:
Ranking could be based on a ‘duration metric’:
QUOTE: “The average duration metric for the particular group of resources can be a statistical measure computed from a data set of measurements of a length of time that elapses between a time that a given user clicks on a search result included in a search results web page that identifies a resource in the particular group of resources and a time that the given user navigates back to the search results web page. …Thus, the user experience can be improved because search results higher in the presentation order will better match the user’s informational needs.” High Quality Search Results based on Repeat Clicks and Visit Duration
Rankings could be based on a ‘duration performance score‘:
QUOTE: “The duration performance scores can be used in scoring resources and websites for search operations. The search operations may include scoring resources for search results, prioritizing the indexing of websites, suggesting resources or websites, protecting particular resources or websites from demotions, precluding particular resources or websites from promotions, or other appropriate search operations.” A Panda Patent on Website and Category Visit Durations
When it comes to rating user satisfaction, there are a few theories doing the rounds at the moment that I think are sensible. Google could be tracking user satisfaction by proxy. When a user uses Google to search for something, user behaviour from that point on can be a proxy for the relevance and relative quality of the actual SERP.
This was recently leaked from Google itself:
“So when search was invented, like when Google was invented many years ago, they wrote heuristics that had figure out what the relationship between a search and the best page for that search was. And those heuristics worked pretty well and continue to work pretty well.”
“But Google is now integrating machine learning into that process. So then training models on when someone clicks on a page and stays on that page, when they go back or when they are trying to figure out exactly on that relationship. So search is getting better and better and better because of advances in machine learning.” Head Google Brain Toronto/Canada via SERoundtable
What is a Long Click?
A user clicks a result and spends time on it, sometimes terminating the search.
What is a Short Click?
A user clicks a result and bounces back to the SERP, pogo-sticking between other results until a long click is observed. Google has this information if it wants to use it as a proxy for query satisfaction.
For more on this, I recommend this article on the time to long click.
Optimise Supplementary Content on the Page
Once you have high-quality main content on the page, you need to think about supplementary content and secondary links that help users on their journey of discovery or other website conversion goal.
That content CAN be links to your own content on other pages, but if you are really helping a user understand a topic, you should be LINKING OUT to other helpful resources, e.g. other websites.
A website that does not link out to ANY other website could accurately be interpreted as, at the least, self-serving. I can’t think of a website that is the true end-point of the web.
- TASK – On informational pages, LINK OUT to related pages on other sites AND on other pages on your own website where RELEVANT
- TASK – For e-commerce pages, ADD RELATED PRODUCTS.
- TASK – Create In-depth Content Pieces
- TASK – Keep Content Up to Date, Minimise Ads, Maximise Conversion, Monitor For broken, or redirected links
- TASK – Assign in-depth content to an author with some online authority, or someone with displayable expertise on the subject
- TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all this together in a single topic-centred page, helping a user to understand something related to what you sell.
UGC (“User-Generated Content”)
User-Generated Content Is Rated As Part of the Page in Quality Scoring
QUOTE: “when we look at a page, overall, or when a user looks at a page, we see these comments as part of the content design as well. So it is something that kind of all combines to something that users see, that our algorithms see overall as part of the content that you’re publishing.” John Mueller, Google
Bear in mind that:
QUOTE: “As a publisher, you are responsible for ensuring that all user-generated content on your site or app complies with all applicable programme policies.” Google, 2018
Google want user-generated content on your site to be moderated and kept as high quality as the rest of your site.
QUOTE: “Since spammy user-generated content can pollute Google search results, we recommend you actively monitor and remove this type of spam from your site.” Google, 2018
QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules).”
QUOTE: “When I look at the great forums & online communities that I frequent, one thing they have in common is that they (be it the owners or the regulars) have high expectations, and are willing to take action & be vocal when new users don’t meet those expectations.” John Mueller, Google
TIP: Moderate Comments
Some examples of spammy user-generated content include:
QUOTE: “Comment spam on blogs” Google, 2018
User-generated content (for instance, blog comments) is counted as part of the page, and these comments are taken into consideration when Google rates the page.
QUOTE: “For this reason there are many ways of securing your application and dis-incentivising spammers.
- Disallow anonymous posting.
- Use CAPTCHAs and other methods to prevent automated comment spamming.
- Turn on comment moderation.
- Use the “nofollow” attribute for links in the comment field.
- Disallow hyperlinks in comments.
- Block comment pages using robots.txt or meta tags.” Google
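Google’s “nofollow” recommendation above can be applied automatically when rendering user comments. A minimal sketch using only the standard library; a production site should use a proper HTML sanitiser, and the regex approach here is an illustrative shortcut, not a robust parser:

```python
import re

# Add rel="nofollow" to every anchor in user-submitted comment HTML, so
# links in comments are not 'vouched for' by your site. Regex-based HTML
# handling is fragile; this is a sketch, not production sanitisation.

def nofollow_links(comment_html: str) -> str:
    def add_rel(match: re.Match) -> str:
        tag = match.group(0)
        if "rel=" in tag:  # leave any existing rel attribute alone
            return tag
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", add_rel, comment_html)

print(nofollow_links('Nice post! <a href="http://example.com">my site</a>'))
# → Nice post! <a href="http://example.com" rel="nofollow">my site</a>
```

Most commenting platforms (WordPress included) do this by default, which is one reason comment spam earns spammers little.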
TIP – Moderate Forums
QUOTE: “common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google
If you have a forum plugin on your site, moderate:
QUOTE: “Spammy posts on forum threads” Google, 2018
It’s evident that Google wants forum administrators to work harder on managing user-generated content that Googlebot ‘rates’ as part of your site.
In a 2015 hangout, John Mueller said to “noindex untrusted post content” and went on to say: “posts by new posters who haven’t been in the forum before. threads that don’t have any answers. Maybe they’re noindexed by default.“
A very interesting statement was “how much quality content do you have compared to low-quality content“. That indicates Google is looking at this ratio. John says to identify “which pages are high-quality, which pages are lower quality so that the pages that do get indexed are really the high-quality ones.“
John mentions looking at “threads that don’t have any authoritative answers“.
I think that advice is relevant for any site with lots of content.
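Mueller’s advice above (noindex posts by new, untrusted posters and threads without answers) can be expressed as a simple rendering policy. The `Thread` shape and the trusted-poster threshold are illustrative assumptions; Google gives no specific numbers:

```python
from dataclasses import dataclass

# Decide whether a forum thread page should carry a robots 'noindex' tag,
# following the advice to noindex untrusted posts and unanswered threads.
# Field names and the post-count threshold are assumptions for this sketch.

TRUSTED_POST_COUNT = 10  # assumed: thread starters below this are 'new'

@dataclass
class Thread:
    author_post_count: int  # prior posts by the thread starter
    reply_count: int        # answers in the thread

def robots_meta(thread: Thread) -> str:
    """Return the robots meta tag to emit in this thread page's <head>."""
    untrusted = thread.author_post_count < TRUSTED_POST_COUNT
    unanswered = thread.reply_count == 0
    if untrusted or unanswered:
        return '<meta name="robots" content="noindex">'
    return '<meta name="robots" content="index,follow">'

print(robots_meta(Thread(author_post_count=2, reply_count=0)))
# → <meta name="robots" content="noindex">
```

The point is the ratio Mueller mentions: keeping thin, unanswered threads out of the index improves the proportion of indexed pages that are high quality.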
TIP: Moderate ANY User Generated Content On Your Site
You are responsible for what you publish.
No matter how you let others post something on your website, you must ensure a high standard:
QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules). When I look at the great forums & online communities that I frequent, one thing they have in common is that they (be it the owners or the regulars) have high expectations, and are willing to take action & be vocal when new users don’t meet those expectations.” John Mueller, Google 2016
QUOTE: “… it also includes things like the comments, includes the things like the unique and original content that you’re putting out on your site that is being added through user-generated content, all of that as well. So while I don’t really know exactly what our algorithms are looking at specifically with regards to your website, it’s something where sometimes you go through the articles and say well there is some useful information in this article that you’re sharing here, but there’s just lots of other stuff happening on the bottom of these blog posts. When our algorithms look at these pages, in an aggregated way across the whole page, then that’s something where they might say well, this is a lot of content that is unique to this page, but it’s not really high quality content that we want to promote in a very visible way. That’s something where I could imagine that maybe there’s something you could do, otherwise it’s really tricky I guess to look at specific changes you can do when it comes to our quality algorithms.” John Mueller, Google 2016
QUOTE: “Well, I think you need to look at the pages in an overall way, you should look at the pages and say, actually we see this a lot in the forums for example, people will say “my text is unique, you can copy and paste it and it’s unique to my website.” But that doesn’t make this website page a high quality page. So things like the overall design, how it comes across, how it looks like an authority, this information that is in general to webpage, to website, that’s things that all come together. But also things like comments where webmasters might say “this is user generated content, I’m not responsible for what people are posting on my website,” John Mueller, Google 2016
QUOTE: “If you have comments on your site, and you just let them run wild, you don’t moderate them, they’re filled with spammers or with people who are kind of just abusing each other for no good reason, then that’s something that might kind of pull down the overall quality of your website where users when they go to those pages might say, well, there’s some good content on top here, but this whole bottom part of the page, this is really trash. I don’t want to be involved with the website that actively encourages this kind of behavior or that actively promotes this kind of content. And that’s something where we might see that on a site level, as well.” John Mueller, Google 2016
QUOTE: “When our quality algorithms go to your website, and they see that there’s some good content here on this page, but there’s some really bad or kind of low quality content on the bottom part of the page, then we kind of have to make a judgment call on these pages themselves and say, well, some good, some bad. Is this overwhelmingly bad? Is this overwhelmingly good? Where do we draw the line?” John Mueller, Google 2016
There’s a key insight for many webmasters about modern SEO in understanding managing UGC.
Watch out for those who want to use your blog for financial purposes, both in terms of adding content to the site and linking to other sites.
QUOTE: “Think about whether or not this is a link that would be on your site if it weren’t for your actions…When it comes to guest blogging it’s a situation where you are placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a link building point of view. It can make sense to guest blog on other people’s sites to drive some traffic to your site… but you should use a nofollow.” John Mueller, Google 2013
TIP: You Must Moderate User-Generated Content If You Display Google Adsense:
QUOTE: “Consider where user-generated content might appear on your site or app, and what risks to your site or app’s reputation might occur from malicious user-generated content. Ensure that you mitigate those risks before enabling user-generated content to appear. Set aside some time to regularly review your top pages with user-generated content. Make sure that what you see complies with all our programme policies.” Google Adsense Policies, 2018
QUOTE: “If a post hasn’t been reviewed yet and approved, allow it to appear, but disable ad serving on that page. Only enable ad serving when you’re sure that a post complies with our programme policies.” Google Adsense Policies, 2018
QUOTE: “And we do that across the whole website to kind of figure out where we see the quality of this website. And that’s something that could definitely be affecting your website overall in the search results. So if you really work to make sure that these comments are really high quality content, that they bring value, engagement into your pages, then that’s fantastic. That’s something that I think you should definitely make it so that search engines can pick that up on.” John Mueller, Google 2016
You can also get a Google manual action penalty for user-generated spam on your site, which can affect parts of the site or the whole site.
QUOTE: “Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles. As a result, Google has applied a manual spam action to your site.” Google Notice of Manual Action
Should I Use Nofollow On User-Generated Content Links?
QUOTE: “The best-known use for nofollow is blog comment spam, but the mechanism is completely general. Nofollow is recommended anywhere that links can’t be vouched for. ” Matt Cutts, Google 2006
I was one of those guys spamming comments way back in the day, although I tried to be sensible about it (on the whole). I even let my own blog get spammed for a long time. Sometimes that was me as well!
QUOTE: “If an off-domain link is made by an anonymous or unauthenticated user, I’d use nofollow on that link. Once a user has done a certain number of posts/edits, or has been around for long enough to build up trust, then those nofollows could be removed and the links could be trusted. Anytime you have a user that you’d trust, there’s no need to use nofollow links.” Matt Cutts, Google 2006
If you have a commenting system (like Drupal, Joomla or WordPress) that allows for search-engine-friendly links (commonly called dofollow links) from your blog or site, you will probably, eventually, be the target of lots of spam, be implicated in tiered link schemes and potentially fall foul of Google’s webmaster guidelines on using the attribute in certain situations.
QUOTE: “‘Nofollow’ provides a way for Webmasters to tell search engines ‘Don’t follow links on this page’ or ‘Don’t follow this specific link.’”
You don’t need only to stick to one topic area on a website. That is a myth.
If you create high-quality pieces of informative content on your website page-to-page, you will rank.
The problem is that not many people are polymaths, and this will be reflected in blog posts that end up too thin to satisfy users (and, in time, Google), or in e-commerce sites that sell everything and have speciality and experience in little of it.
The only focus with any certainty in 2018 is whatever you do, stay high-quality with content, and avoid creating doorway pages.
For some sites, that will mean reducing pages on many topics to a few that can be focused on so that you can start to build authority in that subject area.
Your website is an entity. You are an entity. Explore concepts. Don’t repeat stuff. Be succinct.
You are what keywords are on your pages.
You rank as a result of others rating your writing.
Avoid toxic visitors. A page must meet its purpose well, without manipulation. Do people stay and interact with your page or do they go back to Google and click on other results? A page should be explicit in its purpose and focus on the user.
The number 1 ‘user experience’ signal you can manipulate with low risk is improving content until it is more useful or better presented than is found on competing pages for variously related keyword phrases.