SEO Management Basics: A Beginner's Guide


Disclosure: “This article is a personal opinion, based on research and my 20 years of experience. There is no third-party advertising on this page or monetised links of any kind. Any external links to third-party sites are moderated by me. Disclaimer.” Shaun Anderson, Hobo

  1. Only add high-quality 100% unique content to your site
  2. Create pages where the main purpose of the page is given priority
  3. Avoid annoying ads and pop-ups (especially on mobile)
  4. Register your website with Google Search Console
  5. Optimise your website core web vitals
  6. Have a responsive design
  7. Have a mobile theme that loads in under 3 seconds
  8. Don’t block search engines from crawling your site in robots.txt or meta tags (see the robots.txt check sketch after this list)
  9. Don’t confuse or annoy a website visitor
  10. Don’t block Google from crawling resources on your site or rendering specific elements on your page
  11. Optimise for customers local to your business, if that is important
  12. Geotarget your site in Search Console (you do not need to do this if you have a country-specific domain)
  13. Do not use copied content for pages (or even part of a page) – pages should stand on their own
  14. Use a simple navigation system on your site
  15. Develop websites that meet Google technical recommendations on (for example) canonicalisation, internationalisation and pagination best practices
  16. Ensure on average all ‘Main Content’ blocks of all pages on the site are high-quality
  17. Ensure old-school practices are cleaned up and removed from site
  18. Avoid implementing old-school practices in new campaigns (Google is better at detecting sites with little value-add)
  19. Consider disavowing any obvious low-quality links from previous link building activity
  20. Provide Clear website domain ownership, copyright and contact details on the site
  21. Share your content on the major social networks when it is good enough
  22. Get backlinks from real websites with real domain trust and authority
  23. Do not build unnatural links to your site
  24. Convert visitors (whatever that ‘conversion’ is)
  25. Monitor VERY CAREFULLY any user-generated content on your site, because it is rated as part of your own site content
  26. Pay attention to site security issues (implement secure https, for example)
  27. Beware evaporating efforts (AVOID old-school practices designed to manipulate rankings and REMOVE them when identified)
  28. Ensure erroneous conversion rate optimisation (CRO) practices are not negatively impacting organic traffic levels (don’t screw this part up!)
  29. Aim for a good ratio of ‘useful’ user-centered text to affiliate links. Be careful using affiliate links at all!
  30. Make visitors from Google happy
  31. On-Page, include your keyword phrase at least once in the Page Title Element
  32. On-Page, keep page titles succinct and avoid adding irrelevant keywords
  33. On-Page, include your keyword phrase at least once in the Main Content on the page (at least once in the page copy, in paragraph tags)
  34. On-Page, avoid keyword stuffing main content or any specific html element or attribute
  35. On-Page, optimise your meta description to have a clickable useful SERP snippet
  36. On-Page, ensure the Main Content of the page is high-quality and written by a professional (MOST OF YOUR EFFORT GOES HERE – If your content is not being shared organically, you may have a content quality problem)
  37. On-Page, Ensure the keywords you want to rank for are present on your page. The quality of competition for these rankings will determine how much effort you need to put in
  38. On-Page, use synonyms and common co-occurring words throughout your page copy but write naturally
  39. On-Page, add value to pages with ordered lists, images, videos and tables
  40. On-Page, optimise for increased ‘user intent’ satisfaction (e.g. increased dwell times on a page or site)
  41. On-Page, keep important content on the site updated a few times a year
  42. On-Page, trim outdated content
  43. On-Page, avoid publishing and indexing content-poor pages (especially affiliate sites)
  44. On-Page, disclose page modification dates in a visible format
  45. On-Page, do not push the main content down a page unnecessarily with ads or obtrusive CTA for even your own business
  46. On-Page, link to related content on your site with useful and very relevant anchor text
  47. On-Page, avoid keyword stuffing internal anchor text.
  48. On-Page, create pages that meet basic W3C recommendations on accessible HTML (H1, ALT text etc.)
  49. On-Page, create pages to meet basic usability best practices (Nielsen) – Pay attention to what ‘annoys’ website visitors
  50. On-Page, ensure fast delivery of web pages on mobile and desktop
  51. On-Page, provide clear disclosure of affiliate ads and non-intrusive advertising. Clear disclosure of everything, in fact, if you are focused on quality in all areas.
  52. On-Page, add high-quality and relevant external links (depending if the query is informational)
  53. On-Page, if you can, include the Keyword phrase in a short URL
  54. On-Page, use the Keyword phrase in internal anchor text pointing to this page (at least once)
  55. On-Page, use Headings, Lists and HTML Tables on pages to display appropriate data

… but, yes, it does get more complicated.
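
To make one of the checklist items above concrete (number 8, on not blocking crawlers), here is a minimal sketch, using only the Python standard library, of checking whether a robots.txt file blocks Googlebot from a given URL. The example.com addresses are placeholders for your own site, and note that a separate check of the meta robots tag would still be needed – robots.txt is only half of that item.

from urllib.robotparser import RobotFileParser

def can_google_crawl(page_url, robots_url):
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # fetches and parses the live robots.txt file
    return parser.can_fetch("Googlebot", page_url)

allowed = can_google_crawl(
    "https://www.example.com/important-page/",   # placeholder page
    "https://www.example.com/robots.txt",        # placeholder robots.txt
)
print("Googlebot allowed to crawl:", allowed)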

Table of Contents

What is SEO?

Search engine optimisation (SEO) is a technical, analytical and creative process to improve the visibility of a website in search engines.  In simple terms this is all about getting free traffic from Google, the most popular search engine in the world.

How To Learn SEO

The tips I present on this page will help you create a successful search engine-friendly website yourself:

I have 20 years of experience as a professional SEO (search engine optimiser).

This tutorial is a collection of the tips and best practices I use (and have used for years) to rank websites in Google (and Bing).

QUOTE: “Re: Hobo Web: This is by far the most complete free SEO tutorial for beginners I can find online.” Matthew Woodward, SEO Blogger, 2020

If you find it of use, please share it.

The best way to learn is to practice on a real website. Keep up with the latest industry news and follow search engine webmaster guidelines. Edit a website that ranks in search engines and watch how search engines respond to your changes. Monitor organic search engine traffic. Track rankings for individual keywords and pages. Do lots of tests.

QUOTE: “There aren’t any quick magical tricks…. so that your site ranks number one. It’s important to note that any …. potential is only as high as the quality of your business or website so successful SEO helps your website put your best foot forward.” Maile Ohye, Google 2017

What really matters is what you prioritise today so that in 3-6 months you can see improvements in the quality of your organic traffic, as we:

QUOTE: “will need four months to a year to help your business first implement improvements and then see potential benefit.” Maile Ohye, Google 2017

You will need to meet Google’s guidelines and recommendations in every area (and, if you are like me with this site, you eventually avoid bending any rule and just focus on serving the user useful and up-to-date content).

QUOTE: “I don’t think there’s one magic trick that that I can offer you that will make sure that your website stays relevant in the ever-changing world of the web so that’s something where you’ll kind of have to monitor that on your side and work out what makes sense for your site or your users or your business.” John Mueller, Google 2019

Introduction

Search engine optimisation is:

QUOTE: “often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but…. they could have a noticeable impact on your site’s user experience and performance in organic search results.” Google Starter Guide, 2020

This article is a beginner’s guide.

I deliberately steer clear of techniques that might be ‘grey hat’, as what is grey today is often ‘black hat’ tomorrow, or ‘shady practices’, as far as Google is concerned.

QUOTE: “Shady practices on your website […] result in a reduction in search rankings” Maile Ohye, Google 2017

No one-page guide can explore this complex topic in full. What you’ll read here are answers to questions I had when I was starting out in this field 20 years ago, now corroborated with confirmations from Google.

QUOTE: “My strongest advice ….. is to request if they corroborate their recommendation with a documented statement from Google” Maile Ohye, Google 2017

You will find I do.

The ‘Rules.’

Google insists webmasters adhere to their ‘rules’ and aims to reward sites with high-quality content and remarkable ‘white hat’ web marketing techniques with high rankings.

QUOTE: “Creating compelling and useful content will likely influence your website more than any of the other factors discussed here.” Google SEO Starter Guide, 2020

Conversely, it also needs to filter or penalise websites that manage to rank in Google by breaking these rules.

QUOTE: “Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.” Gary Illyes, Google 2016

These rules are not ‘laws’ but ‘guidelines’ for ranking in Google, laid down by Google. You should note, however, that some methods of ranking in Google are, in fact, illegal. Hacking, for instance, is illegal in many countries.

You can choose to follow and abide by these rules, bend them or ignore them – all with different levels of success (and levels of retribution, from Google’s web spam team).

White hats do it by the ‘rules’; black hats ignore the ‘rules’.

What you read in this article is perfectly within the laws and also within the guidelines and will help you increase the traffic to your website through organic, or natural search engine results pages (SERPs).

Opportunity

QUOTE: “It is the process of getting traffic from the “free,” “organic,” “editorial” or “natural” search results on search engines.” Search Engine Land, 2020

A professional, whether they practice in India, Asia, the Middle East or Europe, has an understanding of how search engine users search for things, and of what type of results Google wants to (or will) display to its users and under which conditions.

An SEO is:

QUOTE: “someone trained to improve your visibility on search engines.” Google Webmaster Guidelines, 2020

A professional has an understanding of how search engines like Google generate their natural SERPs to satisfy users’ navigational, informational and transactional keyword queries.

QUOTE: “One piece of advice I tend to give people is to aim for a niche within your niche where you can be the best by a long stretch. Find something where people explicitly seek YOU out, not just “cheap X” (where even if you rank, chances are they’ll click around to other sites anyway).” John Mueller, Google 2018

Risk Management

QUOTE: “Google doesn’t accept payment to crawl a site more frequently, or rank it higher. If anyone tells you otherwise, they’re wrong.” Google Webmaster Guidelines, 2020

A good search engine marketer has a good understanding of the short term and long term risks involved in optimising rankings in search engines, and an understanding of the type of content and sites Google (especially) WANTS to return in its natural SERPs.

The aim of any campaign is more visibility in search engines and this would be a simple process if it were not for the many pitfalls.

There are rules to be followed or ignored, risks to take, gains to make, and battles to be won or lost.

Free Traffic

QUOTE: “Google is “the biggest kingmaker on this Earth.” Amit Singhal, Google, 2010

A Mountain View spokesman once called the search engine a ‘kingmaker‘, and that’s no lie.

Ranking high in Google is VERY VALUABLE – it’s effectively ‘free advertising’ on the best advertising space in the world.

Traffic from Google natural listings is STILL the most valuable organic traffic to a website in the world, and it can make or break an online business.

The state of play is that you can STILL generate highly targeted leads, for FREE, just by improving your website and optimising your content to be as relevant as possible for a buyer looking for what you do.

As you can imagine, there’s a LOT of competition now for that free traffic – even from Google (!) in some niches.

You shouldn’t compete with Google.

You should focus on competing with your competitors.

The Process

Google supports a technical approach and supports us when we improve the web for users, and focus on:

QUOTE: “site content or structure – Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript – Content development – Management of online business development campaigns – Keyword research – …training – Expertise in specific markets and geographies.” Google Webmaster Guidelines, 2020

The process can be practiced, successfully, in a bedroom or a workplace, but it has traditionally involved mastering many skills, from website development to copywriting.

It takes a lot to rank a page on merit in Google in competitive niches, due to the amount of competition for those top spots.

Pagerank

  • Google still uses Pagerank
  • Pagerank is one of hundreds of signals Google uses to rank web pages (a sketch of the original calculation follows below)
  • Google does not tell us Pagerank scores (and hasn’t done so since 2013)
  • Google probably uses an updated version of Pagerank compared to the original patent (one they own outright)
  • Google probably uses the Reasonable Surfer model of Pagerank
  • A prominent link on a page probably passes more value than other links on the page
  • Pages with higher Pagerank will not necessarily rank higher than other pages with a lower Pagerank
  • Pages with a high Pagerank probably get crawled more often
  • Pages with a low Pagerank probably are crawled a lot less
  • You should not focus on Pagerank before building a high quality website

QUOTE: “The simple, most practical answer to increasing Google Pagerank is to get prominent, natural links on authoritative websites in your niche.” Shaun Anderson, Hobo 2020
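
The original Pagerank calculation is public (it is described in the original Stanford paper), so here is a minimal, illustrative Python sketch of the classic power-iteration version on a made-up four-page site. It is emphatically not Google’s current implementation – as noted above, Google has almost certainly moved on to something like the Reasonable Surfer model – and the damping factor and link graph are just example values.

damping = 0.85  # the damping factor used in the original Pagerank paper

# each page maps to the pages it links out to (a made-up four-page site)
links = {
    "home": ["about", "services", "blog"],
    "about": ["home"],
    "services": ["home", "blog"],
    "blog": ["home", "services"],
}

pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}  # start with an even split

for _ in range(50):  # iterate until the scores settle
    new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share  # each outgoing link passes an equal share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda item: -item[1]):
    print(page, round(score, 3))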

Evaporating SEO Efforts

Google ignores lots of your lower-quality efforts, and as Google evolves and makes very slight changes to its algorithms to further protect its systems, everybody’s rankings fluctuate.

When you spam a page or element of a page, you can expect the benefit you received from it to be at some point negated (if not penalised).

There are many examples of this… for instance, with Keyword stuffing:

QUOTE: “Let’s say you figure out if you put 10,000 times the word “pony” on your page, you rank better for all queries. What Panda does is disregard the advantage you figure out, so you fall back where you started.” Gary Illyes, Google 2017

and low-quality link building:

QUOTE: “(Penguin) doesn’t demote it will just discard the incoming spam toward the side and it will just ignore the spam and that’s it no penalty no demotion and it works in real time” Gary Illyes, Google 2016

and when we started manipulating the flow of Pagerank with ‘Pagerank sculpting’, Google changed the way Pagerank flowed:

QUOTE: “What happens when you have a page with “ten PageRank points” and ten outgoing links, and five of those links are nofollowed? Let’s leave aside the decay factor to focus on the core part of the question. Originally, the five links without nofollow would have flowed two points of PageRank each (in essence, the nofollowed links didn’t count toward the denominator when dividing PageRank by the outdegree of the page). More than a year ago, Google changed how the PageRank flows so that the five links without nofollow would flow one point of PageRank each.” Matt Cutts, Google 2009
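
To make the arithmetic in that quote concrete, here is a tiny sketch of the before-and-after calculation Matt Cutts describes: ten ‘points’ of Pagerank, ten outgoing links, five of them nofollowed, with the decay factor ignored exactly as in the quote.

page_rank_points = 10
total_links = 10
followed_links = 5  # the other five carry rel="nofollow"

# original behaviour: nofollowed links were excluded from the denominator,
# so the five followed links shared all ten points between them
old_per_followed_link = page_rank_points / followed_links   # 2.0 points each

# post-2009 behaviour: Pagerank is divided across ALL outgoing links,
# and the share assigned to nofollowed links simply evaporates
new_per_followed_link = page_rank_points / total_links      # 1.0 point each

print("Before the change:", old_per_followed_link, "points per followed link")
print("After the change: ", new_per_followed_link, "points per followed link")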

I also know of a few evaporating black spots in on-page elements.

For instance, in a recent test, if a page title element was longer than 12 words, all the keywords beyond the 12th word carried less weight than they did in the visible page text.

Image ALT text has a black spot too. Add more than 16 words to ALT text and the remaining words… evaporate. These keywords don’t count at all.

Anchor text, too, has a limit of 16 words and a black spot for keywords contained beyond the 16th word.
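
If you want to audit a page against those observed cut-offs, here is a minimal sketch, assuming the third-party requests and beautifulsoup4 packages. Remember these word limits come from my tests, not from any documented Google threshold, so treat them as rough guides rather than hard rules.

import requests
from bs4 import BeautifulSoup

TITLE_LIMIT, ALT_LIMIT, ANCHOR_LIMIT = 12, 16, 16  # observed in tests, not official

def word_count(text):
    return len(text.split())

def audit(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    if word_count(title) > TITLE_LIMIT:
        print(f"Title is {word_count(title)} words (over {TITLE_LIMIT}): {title}")

    for img in soup.find_all("img", alt=True):
        if word_count(img["alt"]) > ALT_LIMIT:
            print(f"ALT text over {ALT_LIMIT} words: {img['alt']}")

    for link in soup.find_all("a"):
        anchor = link.get_text(strip=True)
        if word_count(anchor) > ANCHOR_LIMIT:
            print(f"Anchor text over {ANCHOR_LIMIT} words: {anchor}")

audit("https://www.example.com/")  # placeholder URL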

Google has limits. Expect them to change, or be modified.

The more you spam elements of your page or link profile, the more Google’s algorithms go to work to “disregard the advantage you figure out, so you fall back where you started“. The more you take optimisation to the nth degree, the more susceptible you are to algorithm changes when things change.

If you are consistently and successfully abusing the system, and are caught, Google algorithms will flag your site for manual review, and a potential penalty called a “manual action”, where you will need to clean up the offending issues before you can rank high again.

But mostly, Google prefers to just ignore and devalue your low-quality efforts, as long as they are not too malicious.

You do not want to “chase the Google algorithm”.

Focus on making something useful that attracts high quality links to it instead.

This is pure speculation, but:

QUOTE: The manual actions team… can look at the labels on the on the links or a site gets. Basically, we have tons of link labels; for example, it’s a footer link, basically, that has a lot lower value than an in-content link. Then another label would be a Penguin real-time label. If they see that most of the links are Penguin real-time labelled, then they might actually take a deeper look and see what the content owner is trying to do.” Gary Illyes, Google 2016

This quote reminded me of a discussion back in 2011 on a black hat forum about the GSA (Google Search Appliance): someone got their hands on one, and some attempted to hack Google’s code.

What they reported back was…

QUOTE:  “tons of link labels”

e.g.

_LinkTags_NAMES = {
    0: "PAGE_GUESTBOOK",
    16: "PAGE_FORUM",
    2: "PAGE_MAILING_LIST",
    3: "PAGE_BLOG_COMMENTS",
    4: "PAGE_BLOG",
    5: "PAGE_PPC",
    19: "PAGE_SPAM_SIGNATURE_KILL",
    21: "PAGE_SPAM_SIGNATURE_NOPROP",
    7: "PAGE_AFFILIATE",
    # ... (excerpt only; the reported list of link labels continued)
}

It makes sense that this process of labelling is how you create a search engine results page out of the pages Pagerank (or something akin to it) identifies: identify spam, identify monetisation trends, and promote content-first pages and user-friendly content above others. You can also imagine that, over time, Google should get a lot better at working out quality SERPs for its users as it identifies more and more NEGATIVE ranking signals, thereby floating higher-quality pages to the top as a second-order effect. An end result of this could be that Google gets an amazing SERP for its users.

Google might have labels for EVERYTHING from low quality, spam, intent, site speed, ad placement, YMYL page, information page etc., and build out scoring principles and sub-labels for each label.

They might decide, for instance, not to show any affiliate sites on page 1, or only one.

All the above is speculation, of course.

What can be agreed upon is that we do not want a LOW QUALITY label on our pages OR links, either appointed by algorithm OR manual rater.

At least one former spam fighter has confirmed that at least some people are employed at Google to demote sites that fail to meet policy:

QUOTE: “I didn’t SEO at all, when I was at Google. I wasn’t trying to make a site much better but i was trying to find sites that were not ‘implementing Google policies'(?*) and not giving the best user experience.” Murat Yatağan, Former Google Webspam team, 2016

I think I see more of Google pulling pages and sites down the rankings (because of policy violation) than promoting them because of discovered ‘quality’. Perhaps the human quality raters are there to highlight quality. We don’t know what Google does with all this information.

I proceed thinking that in Google’s world, a site that avoids punishment algorithms, has verified independent links and has content favoured by users over time (which they are tracking) is a ‘quality page’ Google will rank highly.

So, for the long term, on primary sites, once you have cleaned all infractions up, the aim is to satisfy users by:

  • getting the click
  • keeping people on your site by delivering on purpose and long enough for them to terminate their search (without any techniques that would be frowned upon or contravene Google recommendations)
  • converting visitors to, at the very least, search terminators, returning visitors or actual sales

What Is A Successful Strategy?

Get relevant. Get trusted. Get popular. Help a visitor complete their task. Do not annoy users. Do not put CRO before a user’s enjoyment of content, e.g. do not interrupt the MC (Main Content) of a page with ads or CTAs (calls to action – in effect, adverts for your own business).

It is no longer just about manipulation.

Success comes from adding quality, and often useful, content to your website that together meets a PURPOSE and delivers USER SATISFACTION over the longer term.

If you are serious about getting more free traffic from search engines, get ready to invest time and effort in your website and online marketing.

Be ready to put Google’s users, and yours, FIRST, before Conversion, especially on information-type pages (articles, blog posts etc).

Content First Strategy

In short, it was too easy (for some) to manipulate Google’s rankings at the beginning of the decade. If you had enough ‘domain authority‘ you could use ‘thin content‘ to rank for anything. This is the definition of a ‘content farm‘.

Web spammers often used ‘unnatural‘ backlinks to build fake ‘domain authority‘ to rank this ‘thin‘ content.

I know I did, in the past.

Google Continually Raises The Bar For Us All To Meet

QUOTE: “Another problem we were having was an issue with quality and this was particularly bad (we think of it as around 2008 2009 to 2011) we were getting lots of complaints about low-quality content and they were right. We were seeing the same low-quality thing but our relevance metrics kept going up and that’s because the low-quality pages can be very relevant. This is basically the definition of a content farm in our in our vision of the world so we thought we were doing great our numbers were saying we were doing great and we were delivering a terrible user experience and turned out we weren’t measuring what we needed to so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality it’s not the same as relevance …. and it enabled us to develop quality related signals separate from relevant signals and really improve them independently so when the metrics missed something what ranking engineers need to do is fix the rating guidelines… or develop new metrics.” Paul Haahr, Google 2016

Google decided to rank HIGH-QUALITY documents in its results and force those who wish to rank high to invest in higher-quality content or a great customer experience that creates buzz and attracts editorial links from reputable websites.

These high-quality signals are in some way based on Google being able to detect a certain amount of attention and effort put into your site and Google monitoring over time how users interact with your site.

These types of quality signals are much harder to game than they were in 2011, for instance.

Essentially, the ‘agreement’ with Google is if you’re willing to add a lot of great content to your website and create a buzz about your company, Google will rank you high above others who do not invest in this endeavour.

QUOTE: “high quality content is something I’d focus on. I see lots and lots of ….blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2015

If you try to manipulate Google, it will penalise you for a period, and often until you fix the offending issue – which we know can LAST YEARS.

If you are a real business who intends to build a brand online and rely on organic traffic – you can’t use black hat methods. Full stop. It can take a LONG time for a site to recover from using black hat tactics and fixing the problems will not necessarily bring organic traffic back as it was before a penalty.

QUOTE: “Cleaning up these kinds of link issue can take considerable time to be reflected by our algorithms (we don’t have a specific time in mind, but the mentioned 6-12 months is probably on the safe side). In general, you won’t see a jump up in rankings afterwards because our algorithms attempt to ignore the links already, but it makes it easier for us to trust the site later on.” John Mueller, Google, 2018

Recovery from a Google penalty is a ‘new growth’ process as much as it is a ‘clean-up’ process.

Google Rankings Always Change

QUOTE: “Changing the layout of your pages can affect your search results. It can definitely affect SEO. And it can be a good thing, it can be a bad thing. It’s not that you need to avoid making these changes but rather when you make these changes make sure to double-check that you’re kind of doing everything really well.” John Mueller, Google 2020

Google Rankings are in constant Ever-Flux, too.

Graph: Typical Google rankings 'ever flux'

Put simply – it’s Google’s job to MAKE MANIPULATING SERPs HARD. It’s HARD to get to number 1 in Google for competitive keyword phrases.

So – the people behind the algorithms keep ‘moving the goalposts’, modifying the ‘rules’ and raising ‘quality standards’ for pages that compete for top ten rankings.

QUOTE: “Things can always change in search.” John Mueller, Google 2017

We have constant ever-flux in the SERPs – and that seems to suit Google and keep everybody guessing.

Fluctuating Rankings

QUOTE: “nobody will always see ranking number three for your website for those queries. It’s always fluctuating.” John Mueller, Google, 2015

This flux is not necessarily something to do with a problem per se and 

QUOTE: “fluctuations in search are normal and a sign that our algorithms & engineers are working hard.” John Mueller, Google 2016

Fluctuating upwards could be a good sign as he mentioned: “maybe this is really relevant for the first page, or maybe not.” – then again – the converse is true, one would expect.

 He says “a little bit of a push to make your website a little bit better, so that the algorithms can clearly say, yes, this really belongs on the first page.” which I thought was an interesting turn of phrase.  ‘First page’, rather than ‘number 1’.

Google is very secretive about its ‘secret sauce’ and offers sometimes helpful and sometimes vague advice – and some say offers misdirection – about how to get more valuable traffic from Google.

Google is on record as saying the engine is intent on ‘frustrating’ attempts to improve the amount of high-quality traffic to a website – at least when those attempts use (but are not limited to) low-quality strategies classed as web spam.

QUOTE: “If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.” Matt Cutts, Google 2013

At its core, Google search engine optimisation is still about keywords and links. It’s about relevance, reputation and trust. It is about quality and intent of pages, visitor satisfaction & increased user engagement. It is about users seeking your website out and completing the task they have to complete.

Good overall user experience is a key to winning – and keeping – the highest rankings in many verticals.

Relevance, Authority & Trust

QUOTE: “Know that ‘content’ and relevance’ are still primary.” Maile Ohye, Google 2010

Web page optimisation is STILL about making a web page relevant and trusted enough to rank for any given search query.

It’s about ranking for valuable keywords for the long term, on merit. You can play by ‘white hat’ rules laid down by Google, and aim to build this Authority and Trust naturally, over time, or you can choose to ignore the rules and go full-time ‘black hat’.

MOST tactics still work, for some time, on some level, depending on who’s doing them, and how the campaign is deployed.

Whichever route you take, know that if Google catches you trying to modify your rank using overtly obvious and manipulative methods, then they will class you as a web spammer, and your site will be penalised (you will not rank high for relevant keywords).

QUOTE: “Those practices, referred to in the patent as “rank-modifying spamming techniques,” may involve techniques such as: Keyword stuffing, Invisible text, Tiny text, Page redirects, Meta tags stuffing, and Link-based manipulation.” Bill Slawski, 2012

These penalties can last years if not addressed, as some penalties expire and some do not – and Google wants you to clean up any violations – even historic violations.

QUOTE: “If parts of your website don’t comply with our webmaster guidelines, and you want to comply with our webmaster guidelines, then it doesn’t matter how old those non-compliant parts are.” John Mueller, Google 2020

Google does not want you to try and modify where you rank, easily. Critics would say Google would prefer you paid them to do that using Google Adwords.

The problem for Google is – ranking high in Google organic listings is a real social proof for a business, a way to avoid PPC costs and still, simply, the BEST WAY to drive VALUABLE traffic to a site.

It’s FREE, too, once you’ve met the always-increasing criteria it takes to rank top.

UX; ‘User Experience‘ Does Matter

Is User Experience A Ranking Factor?

User experience is mentioned many times in the Quality Raters Guidelines but we have been told by Google it is not, per se, a classifiable ‘ranking factor‘ on desktop search, at least.

QUOTE: “On mobile, sure, since UX is the base of the mobile friendly update. On desktop currently no.” Gary Illyes, Google, May 2015

While UX (User Experience), we are told, is not literally a ‘ranking factor’, it is useful to understand exactly what Google calls a ‘poor user experience‘ because if any poor UX signals are identified on your website, that is not going to be a healthy thing for your rankings anytime soon.

The consistent advice from Matt Cutts (no longer with Google) was to focus on a very satisfying user experience.

What is Bad UX?

For Google – rating UX, at least from a quality rater’s perspective, revolves around marking the page down for:

  • misleading or potentially deceptive design
  • sneaky redirects
  • malicious downloads
  • spammy user-generated content (unmoderated comments and posts)
  • low-quality MC (main content of the page)
  • low-quality or distracting SC (supplementary content)

In short, nobody is going to advise you to create a poor UX, on purpose, in light of Google’s algorithms and human quality raters who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although on certain levels, what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way that Google’s mobile rating scores differ from, for instance, W3C mobile testing scores.

Google is still, evidently, more interested in rating the main content of the webpage in question and the reputation of the domain the page is on – relative to your site, and competing pages on other domains.

A satisfying UX can help your rankings, with second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google’s punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria e.g. lack of reputation or old-school stuff like keyword stuffing a site.

If you are improving user experience by focusing primarily on the quality of the MC of your pages and avoiding – even removing – old-school techniques – those certainly are positive steps to getting more traffic from Google – and the type of content performance Google rewards is in the end largely at least about a satisfying user experience.

When I hear Google talking about user experience, I often wonder…. is Google talking about AD EXPERIENCE?

In some cases, I think so!

QUOTE: “At Google we are aiming to provide a great user experience on any device, we’re making a big push to ensure the search results we deliver reflect this principle.” Google 2014

There is no single ‘user experience‘ ranking factor, we have been told, however poor user experience clearly does not lead to high rankings in Google.

QUOTE: “I don’t think we even see what people are doing on your website if they’re filling out forms or not if they’re converting to actually buying something so if we can’t really see that then that’s not something that we’d be able to take into account anyway. So from my point of view that’s not something I’d really treat as a ranking factor. Of course if people are going to your website and they’re filling out forms or signing up … for a newsletter then generally that’s a sign that you’re doing the right things.”. John Mueller, Google 2015

Note what Google labels ‘user experience‘ may differ from how others define it.

Take for instance a slow page load time, which is a poor user experience:

QUOTE: “We do say we have a small factor in there for pages that are really slow to load where we take that into account.” John Mueller, Google, 2015

or sites that don’t have much content “above-the-fold”:

QUOTE: “So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.” Google 2012

These are user experience issues Google penalises for, but as another Google spokesperson pointed out:

QUOTE: “Rankings is a nuanced process and there is over 200 signals.” Maile Ohye, Google 2010

If you expect to rank in Google organic listings you’d better have a quality offering, not based entirely on manipulation, or old school tactics.

Is a visit to your site a good user experience? Is it performing its task better than the competition?

If not – beware the manual ‘Quality Raters’ and beware the Google Panda/Site Quality algorithms that are looking to demote sites based on what we can easily point to as poor user experience signals and unsatisfying content when it is presented to Google’s users.

Google raising the ‘quality bar’, year on year, ensures a higher level of quality in online marketing in general (above the very low-quality we’ve seen over the last years).

Success in organic marketing involves investment in higher quality on-page content, better website architecture, improved usability, intelligent conversion to optimisation balance, and ‘legitimate’ internet marketing techniques.

QUOTE: “If you make a good website that works well for users then indirectly you can certainly see an effect in ranking.” John Mueller,  Google 2019

If you don’t take that route, you’ll find yourself chased down by Google’s algorithms at some point in the coming year.

This guide (and this entire website) is not about churn and burn type of webspam as that is too risky to deploy on a real business website, which is:

QUOTE: “ like walking into a dark alley, packed with used car salesmen, who won’t show you their cars.” Matt Cutts, Google 2014

UX; User Experience Across Multiple Devices & Screen Resolutions MATTERS

User Experience is important.

When thinking of designing a web-page you are going to have to consider where certain elements appear on that webpage, especially advertisements.

Google will (very infrequently, in my experience) tell you if the ads on your website are annoying users which may impact the organic traffic Google sends you.

Annoying ads on your web pages have long been a problem for users (probably) and Google, too. Even if they do make you money.

What ads are annoying?

  • ‘the kind that blares music unexpectedly’, or
  • ‘a pop-up on top of the one thing we’re trying to find’

Apparently ‘frustrating experiences can lead people to install ad blockers and when ads are blocked publishers can’t make money’.

The video goes on to say:

QUOTE: “a survey of hundreds of ad experiences by the Coalition for Better Ads has shown that people don’t hate all ads, just annoying ones. Eliminating these ads from your site can make a huge difference.” Google, 2017

The New Ad Experience Report In Google Search Console

The Ad Experience Report is part of Google Search Console, but I would NOT wait for any notification from Google before you review your own pages and determine if you have too many ads on your page.

The report:

QUOTE: ‘ makes it easy to find annoying ads on your site and replace them with user-friendly ones ‘. Google, 2017

I think there are ranking algorithms out there that deal with too many ads on the page.

Avoid them by NOT having too many ads on the page.

UX; Which type of Adverts Annoys Users?

Google states:

QUOTE: “The Ad Experience Report is designed to identify ad experiences that violate the Better Ads Standards, a set of ad experiences the industry has identified as being highly annoying to users. If your site presents violations, the Ad Experience Report may identify the issues to fix.” Google Webmaster Guidelines 2020

The Better Ads Standards people are focused on the following annoying ads:

Desktop Web Experiences

Which type of ads on desktop devices annoy users the most?

  • Pop-up Ads
  • Auto-playing Video Ads with Sound
  • Prestitial Ads with Countdown
  • Large Sticky Ads

Mobile Web Experiences

Which type of ads on mobile devices annoy users the most?

  • Pop-up Ads
  • Auto-playing Video Ads with Sound
  • Prestitial Ads
  • Postitial Ads with Countdown
  • Ad Density Higher Than 30%
  • Full-screen Scroll through Ads
  • Flashing Animated Ads
  • Large Sticky Ads

Google says in the video:

QUOTE: “Fixing the problem depends on the issue you have. For example, if it’s a pop-up, you’ll need to remove all the pop-up ads from your site. But if the issue is high ad density on a page, you’ll need to reduce the number of ads. Once you fix the issues, you can submit your site for a re-review. We’ll look at a new sample of pages and may find ad experiences that were missed previously. We’ll email you when the results are in.” Google, 2017

Google offers some solutions to using pop-ups if you are interested

QUOTE: “In place of a pop-up try a full-screen inline ad. It offers the same amount of screen real estate as pop-ups without covering up any content. Fixing the problem depends on the issue you have for example if it’s a pop-up you’ll need to remove all the pop-up ads from your site but if the issue is high ad density on a page you’ll need to reduce the number of ads” Google, 2017

Your Website Will Receive A LOW RATING If It Has Annoying Or Distracting Ads or Annoying Supplementary Content (SC)

Google has long warned about web page advertisements and distractions on a web page that results in a poor user experience.

The following specific examples are taken from the Google Search Quality Evaluator Guidelines 2017.

6.3 Distracting/Disruptive/Misleading Titles, Ads, and Supplementary Content

QUOTE: “Some Low-quality pages have adequate MC (main content on the page) present, but it is difficult to use the MC due to disruptive, highly distracting, or misleading Ads/SC. Misleading titles can result in a very poor user experience when users click a link only to find that the page does not match their expectations.” Google Search Quality Evaluator Guidelines 2015

6.3.1 Ads or SC that disrupt the usage of MC

QUOTE: “We expect Ads and SC to be visible. However, some Ads, SC, or interstitial pages (i.e., pages displayed before or after the content you are expecting) make it difficult to use the MC. Pages with Ads, SC, or other features that distract from or interrupt the use of the MC should be given a Low rating.” Google Search Quality Evaluator Guidelines 2019

Google gave some examples:

  • QUOTE: “‘Ads that actively float over the MC as you scroll down the page and are difficult to close. It can be very hard to use MC when it is actively covered by moving, difficult­-to-­close Ads.’
  • QUOTE: “‘An interstitial page that redirects the user away from the MC without offering a path back to the MC.’

6.3.2 Prominent presence of distracting SC or Ads

Google said:

QUOTE: “Users come to web pages to use the MC. Helpful SC and Ads can be part of a positive user experience, but distracting SC and Ads make it difficult for users to focus on and use the MC.

Some webpages are designed to encourage users to click on SC that is not helpful for the purpose of the page. This type of SC is often distracting or prominently placed in order to lure users to highly monetized pages.

Either porn SC or Ads containing porn on non­Porn pages can be very distracting or even upsetting to users. Please refresh the page a few times to see the range of Ads that appear, and use your knowledge of the locale and cultural sensitivities to make your rating. For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells bathing suits. However, an extremely graphic porn ad may warrant a Low (or even Lowest) rating.” Google Search Quality Evaluator Guidelines 2017

6.3.3 Misleading Titles, Ads, or SC

Google said:

QUOTE: “It should be clear what parts of the page are MC, SC, and Ads. It should also be clear what will happen when users interact with content and links on the webpage. If users are misled into clicking on Ads or SC, or if clicks on Ads or SC leave users feeling surprised, tricked or confused, a Low rating is justified.

  • At first glance, the Ads or SC appear to be MC. Some users may interact with Ads or SC, believing that the Ads or SC is the MC.
  • Ads appear to be SC (links) where the user would expect that clicking the link will take them to another page within the same website, but actually take them to a different website. Some users may feel surprised or confused when clicking SC or links that go to a page on a completely different website.
  • Ads or SC that entice users to click with shocking or exaggerated titles, images, and/or text. These can leave users feeling disappointed or annoyed when they click and see the actual and far less interesting content.
  • Titles of pages or links/text in the SC that are misleading or exaggerated compared to the actual content of the page. This can result in a very poor user experience when users read the title or click a link only to find that the page does not match their expectations. “ Google Search Quality Evaluator Guidelines 2017

The important thing to know here is:

QUOTE: “Summary: The Low rating should be used for disruptive or highly distracting Ads and SC. Misleading Titles, Ads, or SC may also justify a Low rating. Use your judgment when evaluating pages. User expectations will differ based on the purpose of the page and cultural norms.” Google Search Quality Evaluator Guidelines 2017

… and that Google does not send free traffic to sites it rates as low quality.

QUOTE: “Important: The Low rating should be used if the page has Ads, SC, or other features that interrupt or distract from using the MC.” Google Search Quality Evaluator Guidelines 2019

Recommendation: Remove annoying ADS/SC/CTA from your site. Be extremely vigilant that your own CTA for your own business doesn’t get in the way of a user consuming the main content either.

I believe your own CTA (Call-To-Action) on your own site can be treated much like ADs and SC are, and open to the same abuse and same punishments that Ads are, depending on the page-type we are talking about.

Balancing Conversions With Usability & User Satisfaction

Take pop-up windows or window pop-unders as an example:

According to usability expert Jakob Nielsen, 95% of website visitors hated unexpected or unwanted pop-up windows, especially those that contain unsolicited advertising.

In fact, Pop-Ups have been consistently voted the Number 1 Most Hated Advertising Technique since they first appeared many years ago.

QUOTE: “Some things don’t change — users’ expectations, in particular. The popups of the early 2000s have reincarnated as modal windows, and are hated just as viscerally today as they were over a decade ago. Automatically playing audio is received just as negatively today. The following ad characteristics remained just as annoying for participants as they were in the early 2000s: Pops up – Slow loading time – Covers what you are trying to see – Moves content around – Occupies most of the page – Automatically plays sound.” Therese Fessenden, Nielsen Norman Group 2017

Website accessibility aficionados will point out:

  • creating a new browser window should be the authority of the user
  • pop-up new windows should not clutter the user’s screen.
  • all links should open in the same window by default. (An exception, however, may be made for pages containing a links list. It is convenient in such cases to open links in another window so that the user can come back to the links page easily. Even in such cases, it is advisable to give the user a prior note that links would open in a new window).
  • Tell visitors they are about to invoke a pop-up window (using the link title attribute)
  • Popup windows do not work in all browsers.
  • They are disorienting for users
  • Provide the user with an alternative.

It is, however, an inconvenient truth for accessibility and usability aficionados to hear that pop-ups can be used successfully to vastly increase signup subscription conversions.

QUOTE: “While, as a whole, web usability has improved over these past several years, history repeats and designers make the same mistakes over and over again. Designers and marketers continuously need to walk a line between providing a good user experience and increasing advertising revenue. There is no “correct” answer or golden format for designers to use in order to flawlessly reach audiences; there will inevitably always be resistance to change and a desire for convention and predictability. That said, if, over the course of over ten years, users are still lamenting about the same problems, it’s time we start to take them seriously.”  Therese Fessenden, Nielsen Norman Group 2017

EXAMPLE: A TEST Using A Pop-Up Window

Pop-ups suck, everybody seems to agree. Here’s the little test I carried out on a subset of pages, an experiment to see if pop-ups work on this site to convert more visitors to subscribers.

I  tested it out when I didn’t blog for a few months and traffic was very stable.

Results:

Testing Pop Up Windows Results

Day | Wk1 (Pop-Up On) | Wk2 (Pop-Up Off) | Total % Change
Mon | 46 | 20 | 173%
Tue | 48 | 23 | 109%
Wed | 41 | 15 | 173%
Thu | 48 | 23 | 109%
Fri | 52 | 17 | 206%

That’s a fair increase in email subscribers across the board in this small experiment on this site. Using a pop up does seem to have an immediate impact.
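
Here is a quick sketch of how a percentage uplift like the one in the ‘% Change’ column can be calculated (shown for the Wednesday row); the formula is an assumption for illustration, not an official definition.

def percent_change(subscribers_on, subscribers_off):
    # uplift of the pop-up week over the no-pop-up week
    return (subscribers_on - subscribers_off) / subscribers_off * 100

# the Wednesday row from the table above: 41 signups vs 15 signups
print(round(percent_change(41, 15)))  # prints 173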

I have since tested it on and off for a few months and the results from the small test above have been repeated over and over.

I’ve tested different layouts and different calls to action without pop-ups, and they work too, to some degree, but they typically take a bit longer to deploy than activating a plugin.

I don’t really like pop-ups as they have been an impediment to web accessibility but it’s stupid to dismiss out-of-hand any technique that works.

In my tests, using pop-ups really seemed to kill how many people share a post in social media circles.

With Google now showing an interest in interstitials (especially on mobile versions of your site), I would be very nervous about employing a pop-up window that obscures the primary reason for visiting the page.

If Google detects any user dissatisfaction, this can be very bad news for your rankings.

QUOTE: “Interstitials that hide a significant amount of content provide a bad search experience” Google, 2015

I am, at the moment, using an exit pop-up window as, hopefully, by the time a user sees this device, they are FIRST satisfied with the content they came to read. I can recommend this as a way to increase your subscribers, at the moment, with a similar conversion rate to standard pop-ups – if NOT BETTER.

I think it is sensible to convert customers without using techniques that potentially negatively impact Google rankings.

Do NOT let conversion get in the way of the PRIMARY reason a Google visitor is on ANY PARTICULAR PAGE or you risk Google detecting relative dissatisfaction with your site, and that is not going to help you as Google gets better at working out what ‘quality’ actually means.

How To Fix Issues Found In the Google Ad Experience Report

Chances are, you will NOT receive a message in Search Console for this. I’ve happened across very few, even when I knew there were too many ads on pages, and there probably should be a message. I think the algorithm sorts this stuff out.

If you have a message from Google:

  • you will need to sign up for Google Search Console
  • review the Ad experience report
  • if your site hasn’t been reviewed or has passed review, the report won’t show anything
  • if your review status is ‘warning’ or ‘failing’, violations will be listed in the “what we found” column of the ad reviews report, based on a sample of pages from both desktop and mobile versions of your site
  • ‘if negative ad experiences are found they are listed separately in the report since a bad experience on mobile may not be as annoying on desktop’
  • Google will highlight ‘site design issues such as pop-ups or large sticky ads‘ and rather cleverly will show you ‘a video of the ad that was flagged‘
  • ‘Creative issues are shown on your site through ad tags like flashing animated ads or autoplay videos with sound’
  • remove annoying ads from your site
  • submit your site for a review of your ad experience in Search Console.

UX; Where You Place Ads On Your Page Can Impact Rankings

QUOTE: “For years, the user experience has been tarnished by irritating and intrusive ads. Thanks to extensive research by the Coalition for Better Ads, we now know which ad formats and experiences users find the most annoying. Working from this data, the Coalition has developed the Better Ads Standards, offering publishers and advertisers a road map for the formats and ad experiences to avoid.”  Kelsey LeBeau, Google 2019

I proceed thinking conversion points on a page (ads or CTAs) must be treated as an ad and placed in the most appropriate place on the page, following the principles Google has told us about.

I recommend ANY FUTURE CONVERSION PRINCIPLES added to your site should be checked against these guidelines:

ONLY Place conversion points (CTA – CALL TO ACTIONS) in the GREEN places and BEWARE placing any CTA in any bad ad placement area, on mobile or desktop.

UX; Google has a ‘Page-Heavy’ Penalty Algorithm

Present your most compelling material above the fold at any resolution – Google also has a ‘Page Heavy Algorithm’ – In short, if you have too many ads on your page, or if paid advertising obfuscates copy or causes an otherwise frustrating user experience for Google’s visitors, your page can be demoted in SERPs:

QUOTE: ‘sites that don’t have much content “above-the-fold” can be affected.’  Matt Cutts, Google 2012:

Keep an eye on where you put your ads, sponsored content or CTAs – if they get in the way of the main content of the page you are designing, you could see organic traffic decline.

UX; Google has an ‘Interstitial and Pop-Up‘ Penalty Algorithm

Bear in mind also that Google now (since January 2017) has an interstitial and pop-up ‘penalty’, so AVOID creating a marketing strategy that relies on these.

QUOTE: “Here are some examples of techniques that make content less accessible to a user:

(1) Showing a popup that covers the main content, either immediately after the user navigates to a page from the search results, or while they are looking through the page.

(2) Displaying a standalone interstitial that the user has to dismiss before accessing the main content.

(3) Using a layout where the above-the-fold portion of the page appears similar to a standalone interstitial, but the original content has been inlined underneath the fold.” Doantam Phan, Google 2016

EXIT POP-UPS (like the one I use on this site) do not seem to interfere with a reader’s enjoyment of, and access to, the primary content on a page. At the moment, this type of pop-up seems to be OK (and does increase subscriber signups, too). I have a feeling an exit pop-up might have a negative impact on some rankings, though; the jury is out.

UX; Core Web Vitals: “Page Experience” Is The New “User Experience”

I think this is a good move to call this “page experience” and not confuse it with “user experience”:

QUOTE: “We will introduce a new signal that combines Core Web Vitals with our existing signals for page experience to provide a holistic picture of the quality of a user’s experience on a web page.” Sowmya Subramanian, Google 2020

This is an incredibly important move by Google.

QUOTE: “Web Vitals is an initiative by Google to provide unified guidance for quality signals that are essential to delivering a great user experience on the web.” Philip Walton, Engineer at Google 2020

SEOs must now focus on these website issues:

  • LCP — Largest Contentful Paint
  • FID — First Input Delay
  • CLS — Cumulative Layout Shift

Optimise your core web vitals now. These become official ranking factors soon.
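
Google publishes ‘good’ and ‘needs improvement’ thresholds for each of these metrics on web.dev (roughly: LCP under 2.5 seconds, FID under 100 ms, CLS under 0.1). Here is a minimal sketch of classifying field measurements against those published thresholds – the sample values are placeholders you would replace with real field data, for example from the Chrome UX Report.

THRESHOLDS = {
    # metric: (upper bound for "good", upper bound for "needs improvement")
    "LCP (s)": (2.5, 4.0),
    "FID (ms)": (100, 300),
    "CLS": (0.1, 0.25),
}

def classify(metric, value):
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"

sample_page = {"LCP (s)": 2.1, "FID (ms)": 180, "CLS": 0.32}  # placeholder values

for metric, value in sample_page.items():
    print(f"{metric}: {value} -> {classify(metric, value)}")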

Google Is Switching to ‘Mobile First‘ Indexing

QUOTE: “Our initial plan was to enable mobile-first indexing for all sites in Search in September 2020. We realize that in these uncertain times, it’s not always easy to focus on work as otherwise, so we’ve decided to extend the timeframe to the end of March 2021. At that time, we’re planning on switching our indexing over to mobile-first indexing.” Google, 2020

Now that Google is determined to focus on ranking sites based on their mobile experience, the time is upon businesses to REALLY focus on delivering the fastest and most accessible DESKTOP and MOBILE friendly experience you can achieve.

Because if you DO NOT, your competition will, and Google may rank those pages above your own, in time.

QUOTE: “To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices. If you have a responsive site or a dynamic serving site where the primary content and markup is equivalent across mobile and desktop, you shouldn’t have to change anything.” Doantam Phan, Google 2017

Do not think you can just ignore the desktop version of your site, even if you are now mobile first, according to Google Search Console.

QUOTE: “Google uses two different crawlers for crawling websites: a mobile crawler and a desktop crawler. Each crawler type simulates a user visiting your page with a device of that type. Google uses one crawler type (mobile or desktop) as the primary crawler for your site. All pages on your site that are crawled by Google are crawled using the primary crawler. The primary crawler for all new websites is the mobile crawler. In addition, Google recrawls a few pages on your site with the other crawler type (mobile or desktop). This is called the secondary crawl, and is done to see how well your site works with the other device type.” Google Webmaster Guidelines, 2020

I recently vastly improved a site’s rankings by dealing with poor AD/CTA placement issues across a website’s DESKTOP version (primarily).

QUOTE: “Mobile-first indexing means Google predominantly uses the mobile version of the content for indexing and ranking. Historically, the index primarily used the desktop version of a page’s content when evaluating the relevance of a page to a user’s query. Since the majority of users now access Google Search with a mobile device, Googlebot primarily crawls and indexes pages with the smartphone agent going forward.” Google Official Documentation, 2020
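One practical way to check which way Google is crawling you is to look at your own server logs. Below is a rough, hypothetical sketch that classifies Googlebot hits as smartphone or desktop crawls based on Google’s published user-agent strings; for real verification you would also confirm the requesting IP with a reverse DNS lookup.

```typescript
// Rough sketch: classify Googlebot hits from an access log by crawler type.
// The substrings below come from Google's published Googlebot user agents;
// genuine verification should also reverse-DNS the requesting IP.

type CrawlerType = 'googlebot-smartphone' | 'googlebot-desktop' | 'other';

function classifyUserAgent(userAgent: string): CrawlerType {
  if (!userAgent.includes('Googlebot')) return 'other';
  // The smartphone crawler's user agent contains "Mobile"; the desktop one does not.
  return userAgent.includes('Mobile') ? 'googlebot-smartphone' : 'googlebot-desktop';
}

// Example usage with two illustrative (shortened) user-agent strings.
const smartphoneUA =
  'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X) Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';
const desktopUA =
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html) Chrome/99.0 Safari/537.36';

console.log(classifyUserAgent(smartphoneUA)); // googlebot-smartphone
console.log(classifyUserAgent(desktopUA));    // googlebot-desktop

// If the overwhelming majority of Googlebot hits are smartphone crawls,
// your site has effectively moved to mobile-first indexing.
```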

What Are YMYL Pages?

QUOTE: “Some types of pages or topics could potentially impact a person’s future happiness, health, financial stability, or safety. We call such pages “Your Money or Your Life” pages, or YMYL.” Google Search Quality Evaluator Guidelines 2019

YMYL pages are, I think, what a lot of Google algorithms ultimately focus on dealing with.

Google classifies web pages that “potentially impact a person’s future happiness, health, financial stability, or safety” as “Your Money or Your Life” pages (YMYL) and holds these types of pages to higher standards than, for instance, hobby and informational sites.

Essentially, if you are selling something to visitors or advising on important matters like finance, law or medicine – your page will be held to this higher standard.

YMYL is Google’s classification of certain types of pages, which Google explains:

QUOTE: “News and current events: news about important topics such as international events, business, politics, science, technology, etc. Keep in mind that not all news articles are necessarily considered YMYL (e.g., sports, entertainment, and everyday lifestyle topics are generally not YMYL). Please use your judgment and knowledge of your locale. ● Civics, government, and law: information important to maintaining an informed citizenry, such as information about voting, government agencies, public institutions…. and legal issues (e.g., divorce, child custody, adoption, creating a will, etc.). ● Finance: financial advice or information regarding investments, taxes, retirement planning, loans, banking, or insurance, particularly webpages that allow people to make purchases or transfer money online. ● Shopping: information…. related to research or purchase of goods….  particularly webpages that allow people to make purchases online. ● Health and safety: advice or information about medical issues, drugs, hospitals, emergency preparedness, how dangerous an activity is, etc. ● Groups of people: information about or claims related to groups of people, including but not limited to those grouped on the basis of race or ethnic origin, religion, disability, age, nationality, veteran status, sexual orientation, gender or gender identity. ● Other: there are many other topics related to big decisions or important aspects of people’s lives which thus may be considered YMYL, such as fitness and nutrition, housing information, choosing a college, finding a job, etc.” Google Search Quality Evaluator Guidelines 2019

As soon as you have YMYL INTENT pages on your site, Google will detect this, treat the site differently and treat all pages linking to those pages differently.

You have been warned!

What Is E.A.T.?

Google aims to rank pages where the author has some demonstrable expertise or experience in the subject-matter they are writing about. These ‘quality ratings’ (performed by human evaluators) are based on E.A.T. (also written EAT or E-A-T), which is simply the ‘Expertise, Authoritativeness, Trustworthiness’ of the ‘Main Content’ of a page.

QUOTE: “Expertise, Authoritativeness, Trustworthiness: This is an important quality characteristic. …. Remember that the first step of PQ rating is to understand the true purpose of the page. Websites or pages without some sort of beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating. For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important..” Google Search Quality Evaluator Guidelines 2019

and

QUOTE: “Keep in mind that there are high E-A-T pages and websites of all types, even gossip websites, fashion websites, humor websites, forum and Q&A pages, etc. In fact, some types of information are found almost exclusively on forums and discussions, where a community of experts can provide valuable perspectives on specific topics. ● High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis. ● High E-A-T news articles should be produced with journalistic professionalism—they should contain factually accurate content presented in a way that helps users achieve a better understanding of events. High E-A-T news sources typically have published established editorial policies and robust review processes (example 1, example 2). ● High E-A-T information pages on scientific topics should be produced by people or organizations with appropriate scientific expertise and represent well-established scientific consensus on issues where such consensus exists. ● High E-A-T financial advice, legal advice, tax advice, etc., should come from trustworthy sources and be maintained and updated regularly. ● High E-A-T advice pages on topics such as home remodeling (which can cost thousands of dollars and impact your living situation) or advice on parenting issues (which can impact the future happiness of a family) should also come from “expert” or experienced sources that users can trust. ● High E-A-T pages on hobbies, such as photography or learning to play a guitar, also require expertise. Some topics require less formal expertise. Many people write extremely detailed, helpful reviews of products or restaurants. Many people share tips and life experiences on forums, blogs, etc. These ordinary people may be considered experts in topics where they have life experience. If it seems as if the person creating the content has the type and amount of life experience to make him or her an “expert” on the topic, we will value this “everyday expertise” and not penalize the person/webpage/website for not having “formal” education or training in the field.” Google Search Quality Evaluator Guidelines 2019

Who links to you can also inform the E-A-T of your website.

Consider this:

QUOTE: “I asked Gary (Illyes from Google) about E-A-T. He said it’s largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that’s good. He recommended reading the sections in the QRG on E-A-T as it outlines things well.” Marie Haynes, Pubcon 2018

Google is still a ‘link-based’ search engine ‘under-the-hood’ but it takes so much more to stick a website at the top of search engine results pages (SERPs) than it used to.

What is the PURPOSE of your page?

This is incredibly important:

QUOTE: “The purpose of a page is the reason or reasons why the page was created. Every page on the Internet is created for a purpose, or for multiple purposes. Most pages are created to be helpful for users, thus having a beneficial purpose. Some pages are created merely to make money, with little or no effort to help users. Some pages are even created to cause harm to users. The first step in understanding a page is figuring out its purpose.” Google Search Quality Evaluator Guidelines 2019

Is it to “sell a product”, “to entertain” or “to share information about a topic”?

MAKE THE PURPOSE OF YOUR PAGE SINGULAR and OBVIOUS to help quality raters and algorithms.

The name of the game (if you’re not faking everything) is VISITOR SATISFACTION.

If a visitor lands on your page – are they satisfied and can they successfully complete WHY they are there? Are you trying to hard-sell visitors as soon as they land on your information-type page?

TIP: On information-type pages, test out different layouts where you place the content front and centre.

If A Page Exists Only To Make Money Online, The Page Is Spam, to Google

QUOTE: “If A Page Exists Only To Make Money, The Page Is Spam” Google Quality Rater Guide 2011

That statement above in the original quality rater guidelines stands out and should be a heads-up to any webmaster out there who thinks they are going to make a “fast buck” from Google organic listings.

It should, at least, make you think about the types of pages you are going to spend your valuable time making.

Without VALUE ADD for Google’s users – don’t expect to rank high for commercial keywords.

If you are making a page today with the sole purpose of making money from it – and especially with free traffic from Google – you obviously didn’t get the memo.

Consider this statement from a manual reviewer:

QUOTE: “…when they DO get to the top, they have to be reviewed with a human eye in order to make sure the site has quality.” Google Human Rater, 2011

It’s worth remembering:

  • If A Page Exists Only To Make Money, The Page Is Spam
  • If A Site Exists Only To Make Money, The Site Is Spam

This is how what you make will be judged – whether it is fair to you or not.

QUOTE: “What makes a page spammy?: “Hidden text or links – may be exposed by selecting all page text and scrolling to the bottom (all text is highlighted), disabling CSS/Javascript, or viewing source code. Sneaky redirects – redirecting through several URLs, rotating destination domains cloaking with JavaScript redirects and 100% frame. Keyword stuffing – no percentage or keyword density given; this is up to the rater. PPC ads that only serve to make money, not help users. Copied/scraped content and PPC ads. Feeds with PPC ads. Doorway pages – multiple landing pages that all direct user to the same destination. Templates and other computer-generated pages mass-produced, marked by copied content and/or slight keyword variations. Copied message boards with no other page content. Fake search pages with PPC ads. Fake blogs with PPC ads, identified by copied/scraped or nonsensical spun content. Thin affiliate sites that only exist to make money, identified by checkout on a different domain, image properties showing origination at another URL, lack of original content, different WhoIs registrants of the two domains in question. Pure PPC pages with little to no content. Parked domains” Miranda Miller, SEW, 2011

It isn’t all bad news…

QUOTE: “An infinite number of niches are waiting for someone to claim them. I’d ask yourself where you want to be, and see if you can find a path from a tiny specific niche to a slightly bigger niche and so on, all the way to your desired goal. Sometimes it’s easier to take a series of smaller steps instead of jumping to your final goal in one leap.” Matt Cutts, Google 2006

Google does level the playing field in some areas, especially if you are willing to:

  • Differentiate yourself
  • Be Remarkable
  • Be accessible
  • Add unique content to your site
  • Help users in an original way
  • Not over-emphasise conversion-rate practices on information-type content

Google doesn’t care about the vast majority of websites but the search engine giant DOES care about HELPING ITS OWN USERS.

So, if you are helping visitors who come from Google – and not by just directing them to another website – you are probably doing one thing right at least.

With this in mind – I am already building affiliate pages differently, for instance.

In my experience, INFORMATION-TYPE pages are treated differently from (even) INFORMATION-RICH YMYL pages, simply because of the obvious intent to bait-and-switch or monetise free traffic from Google.

I proceed thinking that monetising a page effectively turns it into a YMYL-Light page (which means you had better have a lot of E.A.T. to make it fly).

Doorway Pages Are Spam

Google algorithms consistently target sites with doorway pages in quality algorithm updates. The definition of a “doorway page” can change over time.

For example, in the images below (from 2011), all pages on the site seemed to be hit with a -50+ ranking penalty for every keyword phrase the website ranked for.

At first, Google rankings for commercial keyword phrases collapsed, which led to somewhat of a “traffic apocalypse”:

Google detected doorway pages penalty

The webmaster at the time received an email from Google via Search Console:

QUOTE: “Notice of detected doorway pages on xxxxxxxx – Dear site owner or webmaster of xxxxxxxx, We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines. Specifically, your site may have what we consider to be doorway pages – groups of “cookie cutter” or low-quality pages…. We believe that doorway pages typically create a frustrating user experience, and we encourage you to correct or remove any pages that violate our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.” Google Search Quality Team, 2011

At the time, I didn’t immediately class the pages on the affected sites in question as doorway pages. It’s evident Google’s definition of doorway pages changes over time.

And it has! Google has been very clear in recent years.

A lot of people do not realise they are building what Google classes as doorway pages, and it was indicative to me that what you intend to do with the traffic Google sends you may, in itself, be a ranking factor not too often talked about.

What Does Google Classify As Doorway Pages?

Google classes many types of pages as doorway pages.

Doorway pages can be thought of as lots of pages on a website designed to rank for very specific keywords using minimal original text content – location pages, for example, often end up looking like doorway pages.

In the recent past, location-based SERPs were often lower-quality, and so Google historically ranked location-based doorway pages in many instances.

There is some confusion for real businesses who THINK they SHOULD rank for specific locations where they are not geographically based and end up using doorway-type pages to rank for these locations.

What Google Says About Doorway Pages

Google said a few years ago:

QUOTE: “For example, searchers might get a list of results that all go to the same site. So if a user clicks on one result, doesn’t like it, and then tries the next result in the search results page and is taken to that same site that they didn’t like, that’s a really frustrating experience.” Brian White, Google 2015

A question about using content spread across multiple pages and targeting different geographic locations on the same site was asked in the recent Hangout with Google’s John Mueller:

QUOTE: “the content that will be presented in terms of the clinics that will be listed looking fairly similar right and the same I think holds true if you look at it from the location …… we’re conscious that this causes some kind of content duplication so the question is is this type … to worry about? ” Webmaster Question, 2017

Bearing in mind that (while it is not the optimal use of pages) Google does not ‘penalise’ a website for duplicating content across internal pages in a non-malicious way, John’s clarification of location-based pages on a site targeting different regions is worth noting:

QUOTE: “For the most part it should be fine I think the the tricky part that you need to be careful about is more around doorway pages in the sense that if all of these pages end up with the same business then that can look a lot like a doorway page but like just focusing on the content duplication part that’s something that for the most part is fine what will happen there is will index all of these pages separately because from  from a kind of holistic point of view these pages are unique they have unique content on them they might have like chunks of text on them which are duplicated but on their own these pages are unique so we’ll index them separately and in the search results when someone is searching for something generic and we don’t know which of these pages are the best ones we’ll pick one of these pages and show that to the user and filter out the other variations of that that page so for example if someone in Ireland is just looking for dental bridges and you have a bunch of different pages for different kind of clinics ….. and probably will pick one of those pages and show those in the search results and filter out the other ones.

But essentially the idea there is that this is a good representative of the the content from your website and that’s all that we would show to users on the other hand if someone is specifically looking for let’s say dental bridges in Dublin then we’d be able to show the appropriate clinic that you have on your website that matches that a little bit better so we’d know dental bridges is something that you have a lot on your website and Dublin is something that’s unique to this specific page so we’d be able to pull that out and to show that to the user like that so from a pure content duplication point of view that’s not really something I totally worry about.

I think it makes sense to have unique content as much as possible on these pages but it’s not not going to like sync the whole website if you don’t do that we don’t penalize a website for having this kind of deep duplicate content and kind of going back to the first thing though with regards to doorway pages that is something I definitely look into to make sure that you’re not running into that so in particular if this is like all going to the same clinic and you’re creating all of these different landing pages that are essentially just funneling everyone to the same clinic then that could be seen as a doorway page or a set of doorway pages on our side and it could happen that the web spam team looks at that and says this is this is not okay you’re just trying to rank for all of these different variations of the keywords and the pages themselves are essentially all the same and they might go there and say we need to take a manual action and remove all these pages from search so that’s kind of one thing to watch out for in the sense that if they are all going to the same clinic then probably it makes sense to create some kind of a summary page instead whereas if these are going to two different businesses then of course that’s kind of a different situation it’s not it’s not a doorway page situation.” John Mueller, Google 2017

The takeaway here is that if you have LOTS of location pages serving ONE SINGLE business in one location, then those are very probably classed as some sort of doorway pages, and old-school techniques for these types of pages will probably see them classed as lower-quality – or even spammy – pages.

Google has long warned webmasters about using doorway pages but many sites still employ them, because either:

  • their business model depends on it for lead generation
  • the alternative is either a lot of work or
  • they are not creative enough or
  • they are not experienced enough to avoid the pitfalls of having lower-quality doorway pages on a site or
  • they are not experienced enough to understand what impact they might be having on any kind of site quality score

Google has a doorway page algorithm which no doubt they constantly improve upon. Google warned:

QUOTE: “Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.” Google, 2015

If you have location pages that serve multiple locations or businesses, then those are not doorway pages and should be improved uniquely to rank better, according to John’s advice.

Are You Making Doorway Pages?

Search Engine Land offered this clarification from Google:

QUOTE: “How do you know if your web pages are classified as a “doorway page?” Google said asked yourself these questions:

  • Is the purpose to optimise for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site’s user experience?
  • Are the pages intended to rank on generic terms yet the content presented on the page is very specific?
  • Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
  • Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
  • Do these pages exist as an “island?” Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?” Barry Schwartz, 2015

Also:

QUOTE: “Well, a doorway page would be if you have a large collection of pages where you’re just like tweaking the keywords on those pages for that.

I think if you focus on like a clear purpose for the page that’s outside of just I want to rank for this specific variation of the keyword then that’s that’s usually something that leads to a reasonable result.

Whereas if you’re just taking a list of keywords and saying I need to make pages for each of these keywords and each of the permutations that might be for like two or three of those keywords then that’s just creating pages for the sake of keywords which is essentially what we look at as a doorway.” Barry Schwartz, 2015

Note I highlighted the following statement:

QUOTE: “focus on like a clear purpose for the page that’s outside of just I want to rank for this specific variation of the keyword.”

That is because sometimes – often, in fact – there is an alternative to doorway pages for location pages that achieves essentially the same thing for webmasters.

Naturally, business owners want to rank for lots of keywords in organic listings with their website. The challenge for webmasters is that Google doesn’t want business owners to rank for lots of keywords using auto-generated content especially when that produces A LOT of pages on a website using (for instance) a list of keyword variations page-to-page.

QUOTE: “7.4.3 Automatically ­Generated Main Content Entire websites may be created by designing a basic template from which hundreds or thousands of pages are created, sometimes using content from freely available sources (such as an RSS feed or API). These pages are created with no or very little time, effort, or expertise, and also have no editing or manual curation. Pages and websites made up of auto­generated content with no editing or manual curation, and no original content or value added for users, should be rated Lowest.” Google Search Quality Evaluator Guidelines, 2017

The end result is that webmasters create doorway pages without even properly understanding what they represent to Google and without realising Google will not index all these auto-generated pages:

QUOTE: “Doorway pages (bridge pages, portal pages, jump pages, gateway pages or entry pages) are web pages that are created for the deliberate manipulation of search engine indexes (spamdexing).” Wikipedia, 2020

Also:

QUOTE: “In digital marketing and online advertising, spamdexing (web spam)…. is the deliberate manipulation of search engine indexes” Wikipedia, 2020

It is interesting to note that Wikipedia might make clear distinctions between what a doorway page is, what spamdexing is and what a landing page is…

QUOTE: “Landing pages are regularly misconstrued to equate to Doorway pages within the literature. The former are content rich pages to which traffic is directed within the context of pay-per-click campaigns…” Wikipedia, 2020

For me, Google blurs that line here.

Google has declared:

QUOTE: “Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.

Here are some examples of doorways:

  • Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
  • Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
  • Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy”
    Google Webmaster Guidelines, 2020

I’ve made bold:

QUOTE: “Doorways are sites or pages created to rank highly for specific search queries” Google Webmaster Guidelines, 2020

Take note: It is not just location pages that are classed as doorway pages:

QUOTE: “For Google, that’s probably overdoing it and ends up in a situation you basically create a doorway site …. with pages of low value…. that target one specific query.” John Mueller 2018

If your website is made up of lower-quality doorway-type pages using old techniques (which are more and more labelled as spam) then Google will not index all of the pages and your website ‘quality score’ is probably going to be negatively impacted.

Google’s John Mueller advised someone that:

QUOTE: “he should not go ahead and build out 1,300 city based landing pages, with the strategy of trying to rank for your keyword phrase + city name. He said that would be a doorway page and against Google’s guidelines.” Barry Schwartz, 2019

Summary:

If you are making keyword-rich location pages for a single business website, there’s a risk these pages will be classed as doorway pages.

If you know you have VERY low-quality doorway pages on your site, you should remove them or rethink your entire strategy if you want to rank high in Google for the long term.

Location-based pages are suitable for some kinds of websites, and not others.

What Does Google Mean By “Low-Quality“?

Google has a history of classifying your site as some type of entity, and whatever that is, you don’t want a low-quality label on it – whether it is put there by algorithm or human. Manual evaluators might not directly impact your rankings, but any signal associated with Google marking your site as low-quality should probably be avoided.

If you are making websites to rank in Google without unnatural practices, you are going to have to meet Google’s expectations in the Quality Raters Guidelines.

Google says:

QUOTE: “Low-quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well.” Google Search Quality Evaluator Guidelines 2019

‘Sufficient Reason’

There is ‘sufficient reason’ in some cases to immediately mark the page down in some areas, and Google directs quality raters to do so:

  • QUOTE: “An unsatisfying amount of MC is a sufficient reason to give a page a Low-quality rating.
  • QUOTE: “Low-quality MC is a sufficient reason to give a page a Low-quality rating.
  • QUOTE: “Lacking appropriate E-A-T is sufficient reason to give a page a Low-quality rating.
  • QUOTE: “Negative reputation is sufficient reason to give a page a Low-quality rating.

What are low-quality pages?

When it comes to defining what a low-quality page is, Google is evidently VERY interested in the quality of the Main Content (MC) of a page:

Main Content (MC)

Google says MC should be the ‘main reason a page exists’.

  • QUOTE: “The quality of the MC is low.
  • QUOTE: “There is an unsatisfying amount of MC for the purpose of the page.
  • QUOTE: “There is an unsatisfying amount of website information.

POOR MC & POOR USER EXPERIENCE

  • QUOTE: “This content has many problems: poor spelling and grammar, complete lack of editing, inaccurate information. The poor quality of the MC is a reason for the Lowest+ to Low rating. In addition, the popover ads (the words that are double underlined in blue) can make the main content difficult to read, resulting in a poor user experience.
  • QUOTE: “Pages that provide a poor user experience….. should also receive low ratings, even if they have some images appropriate for the query.

DESIGN FOCUS NOT ON MC

  • QUOTE: “If a page seems poorly designed, take a good look. Ask yourself if the page was deliberately designed to draw attention away from the MC. If so, the Low rating is appropriate.
  • QUOTE: “The page design is lacking. For example, the page layout or use of space distracts from the MC, making it difficult to use the MC.

MC LACK OF AUTHOR EXPERTISE

  • QUOTE: “You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.
  • QUOTE: “There is no evidence that the author has medical expertise. Because this is a YMYL medical article, lacking expertise is a reason for a Low rating.
  • QUOTE: “The author of the page or website does not have enough expertise for the topic of the page and/or the website is not trustworthy or authoritative for the topic. In other words, the page/website is lacking E-A-T.

After page content, the following are given the most weight in determining if you have a high-quality page.

POOR SECONDARY CONTENT

  • QUOTE: “Unhelpful or distracting SC that benefits the website rather than helping the user is a reason for a Low rating.
  • QUOTE: “The SC is distracting or unhelpful for the purpose of the page.
  • QUOTE: “The page is lacking helpful SC.
  • QUOTE: “For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.

DISTRACTING ADVERTISEMENTS

  • QUOTE: “For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells bathing suits, however, an extremely distracting and graphic porn ad may warrant a Low rating.

GOOD HOUSEKEEPING

  • QUOTE: “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.
  • QUOTE: “The website is lacking maintenance and updates.

SERP SENTIMENT & NEGATIVE REVIEWS

  • QUOTE: “Credible negative (though not malicious or financially fraudulent) reputation is a reason for a Low rating, especially for a YMYL page.
  • QUOTE: “The website has a negative reputation.

LOWEST RATING

When it comes to Google assigning your page the lowest rating, you are probably going to have to go some way to hit this, but it gives you a clear direction you want to avoid at all costs.

Google says throughout the document, that there are certain pages that…

QUOTE: “should always receive the Lowest rating” Google, 2017

…and these are presented below. Note – these statements are spread throughout the raters’ document and not listed the way I have listed them here. I don’t think any context is lost presenting them like this, and it makes them more digestible.

Anyone familiar with Google Webmaster Guidelines will be familiar with most of the following:

  • True lack of purpose pages or websites.
    • Sometimes it is difficult to determine the real purpose of a page.
  • Pages on YMYL websites with completely inadequate or no website information.
  • Pages or websites that are created to make money with little to no attempt to help users.
  • Pages with extremely low or lowest quality MC.
    • If a page is deliberately created with no MC, use the Lowest rating. Why would a page exist without MC? Pages with no MC are usually lack of purpose pages or deceptive pages.
    • Webpages that are deliberately created with a bare minimum of MC, or with MC which is completely unhelpful for the purpose of the page, should be considered to have no MC
    • Pages deliberately created with no MC should be rated Lowest.
    • Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
  • Pages on YMYL (Your Money Or Your Life Transaction pages) websites with completely inadequate or no website information.
  • Pages on abandoned, hacked, or defaced websites.
  • Pages or websites created with no expertise or pages that are highly untrustworthy, unreliable, unauthoritative, inaccurate, or misleading.
  • Harmful or malicious pages or websites.
    • Websites that have extremely negative or malicious reputations. Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, Lowest+ may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: Negative reputation is sufficient reason to give a page a Low quality rating. Evidence of truly malicious or fraudulent behavior warrants the Lowest rating.
    • Deceptive pages or websites. Deceptive webpages appear to have a helpful purpose (the stated purpose), but are actually created for some other reason. Use the Lowest rating if a webpage is deliberately created to deceive and potentially harm users in order to benefit the website.
    • Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.
    • Sometimes, pages just don’t “feel” trustworthy. Use the Lowest rating for any of the following: Pages or websites that you strongly suspect are scams
    • Pages that ask for personal information without a legitimate reason (for example, pages which ask for name, birthdate, address, bank account, government ID number, etc.). Websites that “phish” for passwords to Facebook, Gmail…. Pages with suspicious download links, which may be malware.
  • Use the Lowest rating for websites with extremely negative reputations.

Websites ‘Lacking Care and Maintenance’ Are Rated ‘Low Quality’.

QUOTE: “Sometimes a website may seem a little neglected: links may be broken, images may not load, and content may feel stale or out-dated. If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.”

“Broken” or Non-Functioning Pages Classed As Low Quality

Google gives clear advice on creating useful 404 pages (a minimal code sketch of the last point appears after the list):

  • QUOTE: “Tell visitors clearly that the page they’re looking for can’t be found
  • QUOTE: “Use language that is friendly and inviting
  • QUOTE: “Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site.
  • QUOTE: “Consider adding links to your most popular articles or posts, as well as a link to your site’s home page.
  • QUOTE: “Think about providing a way for users to report a broken link.
  • QUOTE: “Make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested
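That last point is the one most often got wrong: a friendly error page that returns HTTP 200 is a ‘soft 404’. Here is a minimal, hypothetical sketch using Express on Node.js – the route, port and file path are illustrative assumptions, not anything from Google:

```typescript
// Minimal sketch: serve a friendly, branded 404 page while returning a real
// 404 HTTP status code (not a 200 "soft 404"). Express/Node.js assumed.
import express from 'express';
import path from 'path';

const app = express();

app.get('/', (_req, res) => {
  res.send('Home page');
});

// Catch-all middleware registered AFTER all real routes: anything unmatched
// gets the custom 404 template, with the correct status code.
app.use((_req, res) => {
  res.status(404).sendFile(path.join(__dirname, 'views', '404.html'));
});

app.listen(3000);
```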

What Are The Low-Quality Signals Google Looks For?

QUOTE: “Low quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well. These pages lack expertise or are not very trustworthy/authoritative for the purpose of the page.” Google Quality Evaluator Guidelines, 2017

These include but are not limited to:

  • Lots of spammy comments
  • Low-quality content that lacks EAT signals (Expertise + Authority + Trust)
  • NO Added Value for users
  • Poor page design
  • Malicious harmful or deceptive practices detected
  • Negative reputation
  • Auto-generated content
  • No website contact information
  • Fakery or INACCURATE information
  • Untrustworthy
  • Website not maintained
  • Pages just created to link to others
  • Pages lack purpose
  • Keyword stuffing
  • Inadequate customer service pages
  • Sites that use practices Google doesn’t want you to use

Pages can get a neutral rating too.

Pages that have “nothing wrong, but nothing special” about them don’t “display characteristics associated with a High rating”, which puts you in the middle ground – probably not a sensible place to be a year or so down the line.

Pages Can Be Rated ‘Medium Quality’, Too.

QUOTE: “Medium pages achieve their purpose and have neither high nor low expertise, authoritativeness, and trustworthiness. However, Medium pages lack the characteristics that would support a higher quality rating. Occasionally, you will find a page with a mix of high and low quality characteristics. In those cases, the best page quality rating may be Medium.” Google Quality Evaluator Guidelines, 2017

Quality raters will give content a Medium rating when the author or entity responsible for it is unknown.

If you have multiple editors contributing to your site, you had better have a HIGH EDITORIAL STANDARD.

One could take from all this that Google Quality raters are out to get you if you manage to get past the algorithms, but equally, Google quality raters could be friends you just haven’t met yet.

Somebody must be getting rated highly, right?

Impress a Google Quality rater and get a high rating.

If you are a spammer you’ll be pulling out the stops to fake this, naturally, but this is a chance for real businesses to put their best foot forward and HELP quality raters correctly judge the size and relative quality of your business and website.

Real reputation is hard to fake – so if you have it – make sure it’s on your website and is EASY to access from contact and about pages.

The quality raters handbook is a good training guide for looking for links to disavow, too.

It’s pretty clear.

Google organic listings are reserved for ‘remarkable’ and ‘reputable’ content, expertise and trusted businesses.

A high bar to meet – and one that is designed for you to never quite meet unless you are serious about competing, as there is so much work involved.

I think the implied message is to call your AdWords rep if you are an unremarkable business.

Thin Content Can Still Rank In Google, Sometimes

Ranking at the top depends on the query and the level of competition for the query.

Google’s high-quality recommendations are often for specific niches and specific searches as most of the web would not meet the very highest requirements.

Generally speaking – real quality will stand out, in any niche with a lack of it, at the moment.

The time it takes for this to happen (at Google’s end) leaves a lot to be desired in some niches, and time is something Google has an almost infinite supply of compared to 99% of the businesses on the planet.

You do not need a lot of text to rank in Google for most keywords.

Google Is Not Going To Rank Low-Quality Pages When It Has Better Options

QUOTE: “Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly.” Google, 2016

If you have exact-match instances of key phrases on low-quality pages, those pages mostly won’t have all the compound ingredients it takes to rank high in Google.

Google has many quality algorithms these days that demote content and websites that Google deems as providing a lower-quality user experience.

I was working to this long before I understood it even partially enough to write anything about it.

Here is an example of taking a standard page that did not rank for years and then turning it into a topic-oriented resource page designed to fully meet a user’s intent:

Graph: Example traffic levels to a properly optimised page

Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.

Google has algorithms that target low-quality content and these algorithms (I surmise) are actually trained in some part by human quality raters.

Quality Raters Do Not Directly Impact YOUR site

QUOTE: “Ratings from evaluators do not determine individual site rankings.” GOOGLE

While Google is on record as stating these quality raters do not directly influence where you rank (without more senior analysts making a call on the quality of your website, I presume?) – there are some things in this document, mostly of a user experience nature (UX) that webmasters should note going forward.

From my own experience, an unsatisfying page-experience signal can impact rankings even on a REPUTABLE DOMAIN and even with SOUND, SATISFYING CONTENT.

It is easy to imagine all these quality ratings are getting passed along to the engineers at Google in some form (at some stage) to improve future algorithms – and identify borderline cases. This is the ‘quality’ bar I’ve mentioned a couple of times in past posts. Google is always raising the bar – always adding new signals, sometimes, in time, taking signals away.

It helps Google:

  1. satisfy their users
  2. control the bulk of transactional web traffic.

That positioning has always been a win-win for Google – and a recognisable strategy from them after all these years.

Take unnatural links out of the equation (which have a history of trumping most other signals) and you are left with page level, site level and off-site signals.

All of these quality signals will need to be addressed to insulate against “Google Panda” (if that can ever be fully successful, against an algorithm that is modified to periodically “ask questions” of your site and overall quality score).

Google holds different types of sites to different standards for different kinds of keywords which would suggest not all websites need all signals satisfied to rank well in SERPs – not ALL THE TIME.

OBSERVATION – You can have the content and the links – but if your site falls short on even a single user satisfaction signal (even if it is picked up by the algorithm, and not a human reviewer) then your rankings for particular terms could collapse – OR – rankings can be held back – IF Google thinks your organisation, with its resources, or ‘reputation’, should be delivering a better user experience to visitors.

OBSERVATION: In the past, a site often rose (in terms of traffic numbers) in Google, before a Panda ‘penalty’. It may be the case (and I surmise this) that the introduction of a certain change initially artificially raised your rankings for your pages in a way that Google’s algorithms do not approve of, and once that problem is spread out throughout your site, traffic begins to deteriorate or is slammed in a future algorithm update.

Google says about the Search Quality Evaluator Guidelines:

QUOTE: “Many websites are eager to tell users how great they are. Some webmasters have read these rating guidelines and write “reviews” on various review websites. But for Page Quality rating, you must also look for outside, independent reputation information about the website. When the website says one thing about itself, but reputable external sources disagree with what the website says, trust the external sources. ” Google Search Quality Evaluator Guidelines 2019

Surely that’s NOT a bad thing – to make your site HIGHER QUALITY and correctly MARKET your business to customers – and to search quality raters, in the process.

Black Hats will obviously fake all that (which is why it would be self-defeating of me to publish a handy list of signals to manipulate SERPs that’s not just “unnatural links”).

Businesses that care about their performance in Google organic search should be noting ALL of the following points very carefully.

This isn’t about manipulating quality Raters – it is about making it EASY for them to realise you are a REAL business, with a GOOD REPUTATION, and have a LEVEL of EXPERTISE you wish to share with the world.

The aim is to create a good user experience, not fake it, just as the aim with link building is not to make your links look natural, but for them to be natural.

The Quality Needed To Rank Always Rises

It’s important to note your website quality is often judged relative to the quality of competing pages for a given keyword phrase. This is still a horse race. A lot of this is RELATIVE to what YOUR COMPETITION are doing.

How relative? Big sites v small sites? Sites with a lot of links v not a lot of links? Big companies with a lot of staff v small companies with a few staff? Do sites at the top of Google get asked more of? Algorithmically and manually? Just…. because they are at the top? Definitely!

Whether it’s algorithmic or manual – based on technical, architectural, reputation or content issues – Google can decide and will decide if your site meets its quality requirements to rank on page one.

The likelihood of you ranking stably at number one is almost non-existent in any competitive niche where you have more than a few players aiming to rank number one.

Not en masse, not unless you are bending the rules.

My own strategy for visibility over the last few years has been to avoid focusing entirely on ranking for particular keywords and rather improve the search experience of my entire website.

QUOTE: “Building a strong site architecture and providing clear navigation will help search engines index your site quickly and easily. This will also, more importantly, provide visitors with a good experience of using your site and encourage repeat visits. It’s worth considering that Google is increasingly paying attention to user experience.” Search Engine Watch, 2016

The entire budget of my time went on content improvement, content re-organisation, website architecture improvement, and lately, mobile experience improvement and CTA (Call-To-Action) optimisation.

I have technical improvements to core web vitals including speed, usability and accessibility in the pipeline.

In simple terms, I took thin content and made it fat to make old content perform better and defend rankings.

Unsurprisingly, ranking fat content comes with its own challenges as the years go by.

Website Reputation Matters

You can help quality raters EASILY research the reputation of your website especially if you have any positive history.

Make “reputation information about the website” easy to access for a quality rater, as judging the reputation of your website is a large part of what they do.

You will need to monitor, or influence, ‘independent’ reviews about your business – because if reviews are particularly negative – Google will “trust the independent sources”.

Consider a page that highlights your good press, if you have any.

  • Google will consider “positive user reviews as evidence of positive reputation” so come up with a way to get legitimate positive reviews – and Google itself would be a good place to start.
  • Google states, “News articles, Wikipedia articles, blog posts, magazine articles, forum discussions, and ratings from independent organizations can all be sources of reputation information” but they also state that boasts about a lot of internet traffic, for example, should not influence the quality rating of a web page. What should influence the reputation of a page is WHO has shared it on social media etc. rather than just raw numbers of shares. CONSIDER CREATING A PAGE with nofollow links to good reviews on other websites as proof of excellence.
  • Google wants quality raters to examine subpages of your site and often “the URL of its associated homepage” so ensure your homepage is modern, up to date, informative and largely ON TOPIC with your internal pages.
  • Google wants to know a few things about your website, including:
    • Who is moderating the content on the site
    • Who is responsible for the website
    • Who owns copyright of the content
    • Business details (which is important to have synced and accurate across important social media profiles)
    • When was this content updated?
  • Be careful syndicating other people’s content. Algorithmic duplicate problems aside…..if there is a problem with that content, Google will hold the site it finds content on as ‘responsible’ for that content.
  • If you take money online, in any way, you NEED to have an accessible and satisfying ‘customer service’ type page. Google says, “Contact information ….. are extremely important for websites that handle money, such as stores, banks, credit card companies, etc. Users need a way to ask questions or get help when a problem occurs. For shopping websites, we’ll ask you to do some special checks. Look for contact information—including the store’s policies on payment, exchanges, and returns. “ Google urges quality raters to be a ‘detective’ in finding this information about you – so it must be important to them.
  • Keep web pages updated regularly and let users know when the content was last updated. Google wants raters to “search for evidence that effort is being made to keep the website up to date and running smoothly.
  • Google quality raters are trained to be sceptical of any reviews found. It’s normal for all businesses to have mixed reviews, but “Credible, convincing reports of fraud and financial wrongdoing is evidence of extremely negative reputation“.
  • Google asks quality raters to investigate your reputation by searching, giving the example [“ibm.com” reviews –site:ibm.com]: “A search on Google for reviews of “ibm.com” which excludes pages on ibm.com.” – so do that search yourself and judge for yourself what your reputation is. Very low ratings on independent websites could play a factor in where you rank in the future, with Google stating clearly that it considers “very low ratings on the BBB site to be evidence for a negative reputation”. Other sites mentioned as places to review your business include YELP and Amazon. Often – using rich snippets containing schema.org information – you can get Google to display user ratings in the actual SERPs (a minimal markup sketch follows this list). I noted you can get ‘stars in SERPs’ within two days after I added the code (March 2014).
  • If you can get a Wikipedia page – get one! Keep it updated too. For the rest of us, we’ll just need to work harder to prove we are a real business that has earned its rankings.
  • If you have a lot of NEGATIVE reviews – expect to be treated as a business with an “Extremely negative reputation” – and back in 2013 – Google mentioned they had an algorithm for this, too. Google has said the odd bad review is not what this algorithm looks for, as bad reviews are a natural part of the web.
  • For quality raters, Google has a Page Quality Rating Scale with 5 rating options on a spectrum of “Lowest, Low, Medium, High, and Highest.”
  • Google says “High-quality pages are satisfying and achieve their purpose well” and have lots of “satisfying” content, written by an expert or authority in their field – they go on to include “About Us” information pages, and easy-to-access “Contact” or “Customer Service” information pages, etc.
  • Google is looking for a “website that is well cared for and maintained” so you need to keep content management systems updated, check for broken image links and HTML links. If you create a frustrating user experience through sloppy website maintenance – expect that to be reflected in some way with a lower quality rating. Google Panda October 2014 went for e-commerce pages that were optimised ‘the old way’ and are now classed as ‘thin content’.
  • Google wants raters to navigate your site and ‘test’ it out to see if it is working. They tell raters to check your shopping cart function is working properly, for instance.
  • Google expects pages to “be edited, reviewed, and updated on a regular basis” especially if they are for important issues like medical information, and states not all pages are held to such standards, but one can expect that Google wants information updated in a reasonable timescale. How reasonable this is, is dependent on the TOPIC and the PURPOSE of the web page RELATIVE to competing pages on the web.
  • Google wants to rank pages by expert authors, not from content farms.
  • You can’t have a great piece of content on a site with a negative reputation and expect it to perform well. A “High rating cannot be used for any website that has a convincing negative reputation.”
  • A very positive reputation can lift your content from “medium” to “high-quality“.
  • Google doesn’t care about ‘pretty‘ over substance and clearly instructs raters to “not rate based on how “nice” the page looks“.
  • Just about every webpage should have a CLEAR way to contact the site manager to achieve a high rating.
  • Highlighting ads in your design is BAD practice, and Google gives clear advice to rate the page LOW – Google wants you to optimise for A SATISFYING EXPERIENCE FIRST, CONVERSION SECOND!
  • Good news for many industries! Google clearly states, “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.” although it does stipulate again it’s horses for courses… if everybody else is crap, then you’ll still fly – though there aren’t many of those SERPs about these days.
  • If your intent is to deceive, be malicious or present pages with no purpose other than to monetise free traffic with no value add – Google is not your friend.
  • Domains that are ‘related’ in Whois can lead to a low-quality score, so be careful how you direct people around multiple sites you own.
  • Keyword stuffing your pages is not recommended, even if you do get past the algorithms.
  • Quality raters are on the lookout for content that is “copied with minimal alteration” and crediting the original source is not a way to get around this. Google rates this type of activity low-quality.
  • How can Google trust a page if it is blocked from it or from reading critical elements that make up that page? Be VERY careful blocking Google from important directories (blocking CSS and .js files is very risky these days). REVIEW your ROBOTS.txt and know exactly what you are blocking and why you are blocking it.
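As mentioned in the reviews point above, structured data is how you make user ratings eligible to show as stars in the SERP snippet. Below is a minimal, hypothetical sketch of schema.org AggregateRating markup serialised as JSON-LD; the product name and rating figures are made-up examples, and whether Google actually shows the stars is always at its discretion.

```typescript
// Minimal sketch: build schema.org AggregateRating markup as JSON-LD for a
// <script type="application/ld+json"> tag. All values below are illustrative.
const aggregateRatingJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'Product',
  name: 'Example Widget',
  aggregateRating: {
    '@type': 'AggregateRating',
    ratingValue: '4.6',
    bestRating: '5',
    ratingCount: '27',
  },
};

// Embed in the page so Google can consider showing review stars in the SERPs.
const scriptTag =
  `<script type="application/ld+json">${JSON.stringify(aggregateRatingJsonLd)}</script>`;

console.log(scriptTag);
```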

What Are The High-Quality Characteristics of a Web Page?

QUOTE: “High quality pages exist for almost any beneficial purpose…. What makes a High quality page? A High quality page should have a beneficial purpose and achieve that purpose well.” Google Search Quality Evaluator Guidelines, 2019

The following are examples of what Google calls ‘high-quality characteristics’ of a page and should be remembered:

  • A “satisfying or comprehensive amount of very high-quality” main content (MC)
  • Copyright notifications up to date
  • Functional page design
  • Page author has Topical Authority
  • High-Quality Main Content
  • Positive Reputation or expertise of website or author (Google yourself)
  • Very helpful SUPPLEMENTARY content “which improves the user experience”
  • Trustworthy
  • Google wants to reward ‘expertise’ and ‘everyday expertise’ or experience so you need to make this clear (perhaps using an Author Box or some other widget)
  • Accurate information
  • Ads can be at the top of your page as long as they do not distract from the main content on the page
  • Highly satisfying website contact information
  • Customised and very helpful 404 error pages
  • Awards
  • Evidence of expertise
  • Attention to detail

If Google can detect investment in time and labour on your site – there are indications that they will reward you for this (or at least – you won’t be affected when others are, meaning you rise in Google SERPs when others fall).

What Characteristics Do The Highest Quality Pages Exhibit?

QUOTE: “The quality of the MC is one of the most important criteria in Page Quality rating, and informs the E-A-T of the page. For all types of webpages, creating high quality MC takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. For news articles and information pages, high quality MC must be factually accurate for the topic and must be supported by expert consensus where such consensus exists.” Google Search Quality Evaluator Guidelines, 2019

You obviously want the highest quality ‘score’ possible but looking at the Search Quality Evaluator Guidelines that is a lot of work to achieve.

Google wants to rate you on the effort you put into your website, and how satisfying a visit is to your pages.

  • QUOTE: “Very high or highest quality MC, with demonstrated expertise, talent, and/or skill.”
  • QUOTE: “Very high level of expertise, authoritativeness, and trustworthiness (page and website) on the topic of the page.”
  • QUOTE: “Very good reputation (website or author) on the topic of the page.”

At least for competitive niches where Google intends to police this quality recommendation, Google wants to reward high-quality pages and “the Highest rating may be justified for pages with a satisfying or comprehensive amount of very high-quality” main content.

If your main content is very poor, with “grammar, spelling, capitalization, and punctuation errors“, or not helpful or trustworthy – ANYTHING that can be interpreted as a bad user experience – you can expect to get a low rating.

QUOTE: “The quality of the MC is an important consideration for PQ rating. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well…. Important: The Low rating should be used if the page has Low quality MC. ” Google Search Quality Evaluator Guidelines, 2019

Note – not ALL “content-light” pages are automatically classed low-quality.

If you can satisfy the user with a page “thin” on text content – you are ok (but probably susceptible to someone building a better page than yours, more easily, I’d say).

Google expects more from big brands than they do from a smaller store (but that does not mean you shouldn’t be aiming to meet ALL these high-quality guidelines above).

Note too, that if you violate Google Webmaster recommendations for performance in their indexes of the web – you automatically get a low-quality rating.

If your page has a sloppy design, low-quality main content and too many distracting ads your rankings are very probably going to take a nose-dive.

If a Search Quality Evaluator is subject to a sneaky redirect they are instructed to rate your site low.

Google Quality Algorithm Updates

QUOTE: “We have 3 updates a day in average. I think it’s pretty safe to assume there was one recently…” Gary Illyes, Google 2017

Google has many algorithm updates during a year.

These ‘quality updates’ are very reminiscent of Google Panda updates and often impact many websites at the same time – and often these focus on demoting similar ‘low-quality’ techniques we have been told Panda focuses on.

Usually, Google has 3 or 4 big updates in a year that focus on various things, but they also make changes daily. See this list for a comprehensive list of Google algorithm updates.

QUOTE: “Yes, we do make small changes each day to search. It’s a continual process of improvement. Our core updates only happen 2-4 times per year.” Danny Sullivan, Google 2020

Note also Google often releases multiple updates and changes to its own GUI (Graphical User Interface – the actual SERPs) at the same time, to keep us all guessing as to what is going on (and we do).

What Is Google Panda?

QUOTE: “Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly.” Google, 2016

Google Panda aims to rate the quality of your pages and website and is based on things about your site that Google can rate, or algorithmically identify.

QUOTE: “(Google Panda) measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages. So essentially, if you want a blunt answer, it will not devalue, it will actually demote. Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.”  Gary Illyes, Google 2016

We are told the current Panda is an attempt to basically stop low-quality thin content pages ranking for keywords they shouldn’t rank for.

Panda evolves – signals can come and go – Google can get better at determining quality, as a spokesman from Google has confirmed:

QUOTE: So it’s not something where we’d say, if your website was previously affected, then it will always be affected. Or if it wasn’t previously affected, it will never be affected.… sometimes we do change the criteria…. category pages…. (I) wouldn’t see that as something where Panda would say, this looks bad.… Ask them the questions from the Panda blog post….. usability, you need to work on.“ John Mueller, Google.

From my original notes about Google Panda, I list the original, somewhat abstract and metaphorical Panda ranking ‘factors’ published as a guideline for creating high-quality pages.

I also list these Panda points below (Google employees still publicly refer to this document):

(PS – I have emphasised two of the bullet points below, at the top and bottom, because I think it’s easier to understand these points as a question, how to work that question out, and ultimately, what Google really cares about – what their users think.)

  • “Would you trust the information presented in this article? (YES or NO)
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature? EXPERTISE
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations? LOW-QUALITY CONTENT/THIN CONTENT
  • Would you be comfortable giving your credit card information to this site? (HTTPS? OTHER TRUST SIGNALS (CONTACT / ABOUT / PRIVACY / COPYRIGHT / DISCLOSURES / DISCLAIMERS etc.) – especially relevant if your site is a YMYL page)
  • Does this article have spelling, stylistic, or factual errors? (SPELLING + GRAMMAR + CONTENT QUALITY – perhaps wrong dates in content, on old articles, for instance)
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines? (OLD TACTICS|DOORWAY PAGES)
  • Does the article provide original content or information, original reporting, original research, or original analysis? (UNIQUE CONTENT, ORIGINAL RESEARCH & SATISFYING CONTENT)
  • Does the page provide substantial value when compared to other pages in search results? (WHAT’S THE RELATIVE QUALITY OF COMPETITION LIKE FOR THIS TERM?)
  • How much is quality control done on content? (WHEN WAS THIS LAST EDITED? Is CONTENT OUTDATED? IS SUPPLEMENTARY CONTENT OUTDATED (External links and images?))
  • Does the article describe both sides of a story? (IS THIS A PRESS RELEASE?)
  • Is the site a recognized authority on its topic? (EXPERTISE, AUTHORITY, TRUST)
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care? (IS THIS CONTENT BOUGHT FROM A $5 per article content factory? Or is written by an EXPERT or someone with a lot of EXPERIENCE of the subject matter?)
  • Was the article edited well, or does it appear sloppy or hastily produced? (QUALITY CONTROL on EDITORIALS)
  • For a health related query, would you trust information from this site? (EXPERTISE NEEDED)
  • Would you recognize this site as an authoritative source when mentioned by name? (EXPERTISE NEEDED)
  • Does this article provide a complete or comprehensive description of the topic? (Is the page text designed to help a visitor or shake them down for their cash?)
  • Does this article contain insightful analysis or interesting information that is beyond obvious? (LOW QUALITY CONTENT – You know it when you see it)
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend? (Would sharing this page make you look smart or dumb to your friends? This should be reflected in social signals)
  • Does this article have an excessive amount of ads that distract from or interfere with the main content? (OPTIMISE FOR SATISFACTION FIRST – CONVERSION SECOND – do not let the conversion get in the way of satisfying the INTENT of the page. For example – if you rank with INFORMATIONAL CONTENT with a purpose to SERVE those visitors – the visitor should land on your destination page and not be deviated from the PURPOSE of the page – and that purpose was informational, in this example – to educate. SO – educate first – beg for social shares on those articles – and leave the conversion to merit and slightly more subtle influences rather than massive banners or whatever else annoys users). We KNOW ads (OR DISTRACTING CALLS TO ACTION) convert well at the top of articles – but Google says it is sometimes a bad user experience. You run the risk of Google screwing with your rankings as you optimise for conversion, so be careful and keep everything simple and obvious.
  • Would you expect to see this article in a printed magazine, encyclopedia or book? (Is this a HIGH-QUALITY article?)… no? then expect ranking problems.
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics? (Is this a LOW or MEDIUM QUALITY ARTICLE? LOW WORD COUNTS ACROSS PAGES?)
  • Are the pages produced with great care and attention to detail vs. less attention to detail? (Does this page impress?)
  • Would users complain when they see pages from this site? (WILL THIS PAGE MAKE GOOGLE LOOK STUPID IF IT RANKS TOP?)”

All that sits quite nicely with the information you can read in the Google Search Quality Evaluator Guidelines.

If you fail to meet these standards (even some of them) your rankings can fluctuate wildly (and often, as we are told Google updates Panda every month, and these updates can often be spotted rolling in).

It all probably correlates quite nicely too, with the type of sites you don’t want links from.

QUOTE: “I think there is probably a misunderstanding that there’s this one site-wide number that Google keeps for all websites and that’s not the case.  We look at lots of different factors and there’s not just this one site-wide quality score that we look at. So we try to look at a variety of different signals that come together, some of them are per page, some of them are more per site, but it’s not the case where there’s one number and it comes from these five pages on your website.” John Mueller, Google 2016

Google is raising the quality bar, and forcing content creators to spend HOURS, DAYS or WEEKS longer on websites if they ‘expect’ to rank HIGH in natural results.

If someone is putting the hours in to rank their site through legitimate efforts – Google will want to reward that – because it keeps the barrier to entry HIGH for most other competitors.

Critics will say the higher the barrier to entry is to rank high in Google natural listings the more attractive Google Adwords begins to look to those other businesses.

Google says a quality rater does not affect your site, but if your site gets multiple LOW-QUALITY notices from manual reviewers – that stuff is coming back to get you later, surely.

Example ‘High Quality’ E-commerce Site

Google has the search quality rating guidelines. After numerous ‘leaks’, this previously ‘secretive’ document has now been made available for anyone to download.

This document gives you an idea of the type of quality websites Google wants to display in its search engine results pages.

I use these quality rating documents and the Google Webmaster Guidelines as the foundation of my audits for e-commerce sites.

What are these quality raters doing?

Quality Raters are rating Google’s ‘experiments’ and manually reviewing web pages that are presented to them in Google’s search engine results pages (SERPs). We are told that these ratings don’t impact your site, directly.

QUOTE: “Ratings from evaluators do not determine individual site rankings, but are used help us understand our experiments. The evaluators base their ratings on guidelines we give them; the guidelines reflect what Google thinks search users want.” GOOGLE.

What Does Google class as a high-quality product page on an e-commerce site?

This page and site appear to check all the boxes Google wants to see in a high-quality e-commerce website these days.

This product page is an example of YMYL page exhibiting “A satisfying or comprehensive amount of very high-quality MC (main content)” and “Very high level of expertise, highly authoritative/highly trustworthy for the purpose of the page” with a “Very positive reputation“.

Highest Quality Rating: Shopping (YMYL Page Example)

What Is “Domain Authority“?

QUOTE: “So domain authority….. it’s not really something that we have here at Google.” John Mueller, Google 2017

Domain authority, whether or not it is something Google actually has, is an important concept to take note of. Essentially Google ‘trusts’ some websites more than others, and you will find that it is easier to rank using some websites than it is with others.

Google calls it ‘online business authority’:

QUOTE: “Amazon has a lot of “online business authority””…. (Official Google Webmaster Blog)

We conveniently call this effect ‘domain authority’ and it seemed to be related to ‘PageRank’ – the system Google started to rank the web with in 1998.

QUOTE: “PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B.” Google
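
The “vote” idea in that quote can be illustrated with a toy calculation. Below is a minimal sketch of the classic PageRank power iteration over a made-up three-page link graph; the 0.85 damping factor is the commonly cited value, and the graph itself is entirely hypothetical, so this is only an illustration of the principle, not how Google computes rankings today.

    # Toy PageRank power iteration over a hypothetical three-page link graph.
    # links[page] = pages that 'page' links to (each outlink is a "vote").
    links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
    damping, n = 0.85, len(links)
    rank = {page: 1.0 / n for page in links}

    for _ in range(50):  # iterate until the scores settle
        new_rank = {}
        for page in links:
            votes = sum(rank[src] / len(outs)
                        for src, outs in links.items() if page in outs)
            new_rank[page] = (1 - damping) / n + damping * votes
        rank = new_rank

    print(rank)  # page C ends up with the highest score in this toy graph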

Domain authority is an important ranking phenomenon in Google. Nobody knows exactly how Google calculates, ranks and rates the popularity, reputation, intent or trust of a website, outside of Google, but when I write about domain authority I am generally thinking of sites that are popular, reputable and trusted e.g. are also cited by popular, reputable and trusted sites.

QUOTE: “Last year, we studied almost one billion web pages and found a clear correlation between referring domains (links from unique websites) and organic search traffic.” Joshua Hardwick, AHrefs, 2020

Historically it is a useful metaphor and proxy for quality and sometimes you can use it to work out the likelihood of a site ranking for a particular keyword based on its relative score when compared to competing sites and pages.

Historically sites that had domain authority or online business authority had lots of links to them, hence why link building was so popular a tactic – and counting these links is generally how most 3rd parties still calculate a pseudo domain authority score for websites today.
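
As a rough illustration of that third-party approach (and only that – the real tools keep their formulas private and blend many more signals), a pseudo “authority” number could be produced by log-scaling a count of unique referring domains. Everything in this sketch, including the scaling ceiling, is an assumption for illustration.

    import math

    def pseudo_authority(referring_domains, ceiling=1_000_000):
        """Illustrative only: log-scale unique referring domains to a 0-100 range.
        Real third-party 'authority' metrics use undisclosed formulas and
        far more signals than a raw link count."""
        if referring_domains <= 0:
            return 0.0
        score = 100 * math.log10(referring_domains) / math.log10(ceiling)
        return round(min(score, 100.0), 1)

    for count in (3, 300, 30_000):
        print(count, "referring domains ->", pseudo_authority(count))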

Massive domain authority and ranking ‘trust’ was awarded to very successful sites that had gained a lot of links from credible sources, and other online business authorities too.

We more usually talk about domain trust and domain authority based on the number, type and quality of incoming links to a site.

Examples of trusted, authority domains include Wikipedia, the W3C and Apple – all very successful brands.

How did you take advantage of being an online business authority 10 years ago? You turned the site into a PageRank black hole to hoard the benefits of “domain authority” and published lots of content, sometimes with little thought to quality.

On any subject.

Because Google would rank it!

I think this ‘quality score’ Google has developed since at least 2010 could be Google’s answer to this sort of historical domain authority abuse.

QUOTE: “And we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons…” Matt Cutts, Google 2011

and

QUOTE: “The score is determined from quantities indicating user actions of seeking out and preferring particular sites and the resources found in particular sites. A site quality score for a particular site can be determined by computing a ratio of a numerator that represents user interest in the site as reflected in user queries directed to the site and a denominator that represents user interest in the resources found in the site as responses to queries of all kinds. The site quality score for a site can be used as a signal to rank resources, or to rank search results that identify resources, that are found in one site relative to resources found in another site.” Navneet Panda, Google, 2015
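
The patent describes that score as a simple ratio, which can be made concrete with a worked sketch. The numbers below are entirely made up, and this is only an illustration of the numerator/denominator idea in the quote, not Google’s actual implementation.

    # Worked sketch of the ratio described in the patent quote above,
    # using entirely made-up numbers.
    # Numerator: user interest in the site itself (e.g. brand/navigational queries).
    # Denominator: user interest in the site's resources across queries of all kinds.
    queries_seeking_site = 1_200
    queries_choosing_site_resources = 48_000

    site_quality_score = queries_seeking_site / queries_choosing_site_resources
    print(round(site_quality_score, 4))  # 0.025 in this hypothetical example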

An effect akin to ‘domain authority’ is still visible, but this new phenomenon is probably based on site quality scores, potential authorship value scores, user interest and other classifiers, as well as PageRank.

QUOTE: “The best thing that you can do here is really make sure that the rest of your website is really as high quality and as fantastic as possible. Because if we see that everything else on your website is really fantastic, then when we find something new, we’ll say, well, probably this is pretty good, too.” John Mueller, Google 2018

This takes a lot of work and a lot of time to create, or even mimic, such a site.

QUOTE: “Brands are the solution, not the problem, Brands are how you sort out the cesspool.” Eric Schmidt, Google 2008

Google is going to present users with sites that are recognisable to them. If you are a ‘brand’ in your space or well-cited site, Google wants to rank your stuff at the top as it won’t make Google look stupid.

Getting links from ‘Brands’ (or well-cited websites) in niches can also mean getting  ‘quality links’.

Easier said than done, for most, of course, but that is the point of link building – to get these types of links – the type of links Google will not ignore:

QUOTE: “Our algorithms will probably say well all of the links are links that the person place themselves so maybe we should kind of ignore those links.” John Mueller, Google 2020

Does Google Prefer Big Brands In Organic SERPs?

Well, yes. It’s hard to imagine that a system like Google’s was not designed exactly over the last few years to deliver the listings it does today – and it is often filled even with content that ranks high likely because of the domain the content is on.

Big brands also find it harder to take advantage of ‘domain authority’. It’s harder for most businesses because low-quality content on parts of a domain can negatively impact the rankings of an entire domain.

QUOTE: “I mean it’s kind of like we look at your web site overall. And if there’s this big chunk of content here or this big chunk kind of important wise of your content, there that looks really iffy, then that kind of reflects across the overall picture of your website. ”  John Mueller, Google

Google has introduced (at least) a ‘perceived’ risk to publishing lots of lower-quality pages on your site, in an effort to curb the production of keyword-stuffed content based on manipulating early search engine algorithms.

We are dealing with new algorithms designed to target old style tactics and that focus around the truism that DOMAIN ‘REPUTATION’ plus LOTS of PAGES equals LOTS of KEYWORDS equals LOTS of Google traffic.

A big site can’t just get away with publishing LOTS of lower quality content in the cavalier way they used to – not without the ‘fear’ of primary content being impacted and organic search traffic throttled negatively to important pages on the site.

Google still uses links and something akin to the original PageRank:

QUOTE: “DYK that after 18 years we’re still using* PageRank (and 100s of other signals) in ranking?” Gary Illyes, Google 2017

Google is very probably also using user metrics in some way to determine the ‘quality’ of your site:

QUOTE: “I think that’s always an option. Yeah. That’s something that – I’ve seen sites do that across the board, not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.” John Mueller, Google

Your online reputation is evidently calculated by Google from many metrics.

Small businesses can still build this ‘domain reputation’ over time if you focus on a content strategy based on depth and quality, rather than breadth when it comes to how content is published on your website.

I did, on my own site.

Instead of publishing LOTS of pages, focus on fewer pages that are of high quality. You can better predict your success in ranking for the long term for a particular keyword phrase this way.

Then you rinse and repeat.

Failure to meet these standards for quality content may impact rankings noticeably around major Google quality updates.

Is Domain Age An Important Google Ranking Factor?

No, not in isolation.

Having a ten-year-old domain that Google knows nothing about is almost the same as having a brand new domain.

A 10-year-old site that’s continually cited by, year on year, the actions of other, more authoritative, and trusted sites? That’s valuable.

But that’s not the age of your website address ON ITS OWN in-play as a ranking factor.

A one-year-old domain cited by authority sites is just as valuable if not more valuable than a ten-year-old domain with no links and no search-performance history.

Perhaps Domain age may come into play when other factors are considered – but I think Google works very much like this on all levels, with all ‘Google ranking factors’, and all ranking ‘conditions’.

I don’t think you can consider discovering ‘ranking factors’ without ‘ranking conditions’.

Other Ranking Factors:

  • Domain age; (NOT ON ITS OWN)
  • Length of site domain registration; (I don’t see much benefit ON ITS OWN, even knowing “Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year.” Paying for a domain in advance just tells others you don’t want anyone else using this domain name; it is not much of an indication that you’re going to do something Google cares about.)
  • Domain registration information was hidden/anonymous; (possibly, under human review if OTHER CONDITIONS are met like looking like a spam site)
  • Site top level domain (geographical focus); (YES)
  • Site top level domain (e.g. .com versus .info); (DEPENDS)
  • Subdomain or root domain? (DEPENDS)
  • Domain past records (how often it changed IP); (DEPENDS)
  • Domain past owners (how often the owner was changed) (DEPENDS)
  • Keywords in the domain; (DEFINITELY – ESPECIALLY AN EXACT KEYWORD MATCH – although Google has a lot of filters that mute the performance of an exact match domain)
  • Domain IP; (DEPENDS – for most, no)
  • Domain IP neighbours; (DEPENDS – for most, no)
  • Domain external mentions (non-linked citations) (perhaps)
  • Geo-targeting settings in Google Search Console (YES – of course)

On-Page; Create Compelling, Unique Content

  • Only add unique hand-written content to your web pages
  • Don’t add content to your pages that is found verbatim on other pages on the web
  • Do not ‘spin‘ text on pages on your site
  • Don’t create thin-pages with little value add to users
  • Implement the rel=”canonical” link element on all pages on your site to minimise duplicate content
  • Apply 301 permanent redirects where necessary
  • Use the URL parameter handling tool in Google Search Console where necessary
  • Reduce Googlebot crawl expectations.
  • Consolidate ranking equity & potential in high-quality canonical pages
  • Minimize boilerplate repetition across your site
  • Don’t block Google from your duplicate content, just manage it better
  • Sites with separate mobile URLs should just move to a responsive design
  • Canonical Link Elements can be ignored by Google
  • Avoid publishing stubs for text “coming soon”
  • Minimize similar content on your pages
  • Do NOT canonicalise component pages in a series to the first page.
  • Google doesn’t use link-rel-next/prev at all. You can use it if you want, but don’t rely on it alone.
  • Translated content is not duplicate content.
  • Block Google from crawling thin internal search results pages.
  • Check for content on your site that duplicates content found elsewhere.
  • An easy way to find duplicate content is to use a Google search using “quotes” (see the sketch after this list for a programmatic check)
  • Use Google Search Console to fix duplicate content issues on your site.
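
As referenced in the list above, a quick programmatic complement to the “quotes” search is to fingerprint blocks of your own text and flag any block that appears on more than one URL. This is a minimal sketch using only the Python standard library; the page contents are placeholders standing in for your own crawl or CMS export.

    # Minimal sketch: flag text blocks that appear verbatim on more than one page.
    # 'pages' would normally be populated from your own crawl or CMS export.
    import hashlib
    from collections import defaultdict

    pages = {
        "/red-widgets": "Our widgets are hand made. Delivery is free over £50.",
        "/blue-widgets": "Our widgets are hand made. Delivery is free over £50.",
        "/about": "Founded in 2004, we make widgets in Scotland.",
    }

    seen = defaultdict(list)
    for url, text in pages.items():
        for block in text.split(". "):
            fingerprint = hashlib.md5(block.strip().lower().encode()).hexdigest()
            seen[fingerprint].append(url)

    for urls in seen.values():
        if len(set(urls)) > 1:
            print("Duplicate block found on:", sorted(set(urls)))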

QUOTE: “Duplicated content is often not manipulative and is commonplace on the web and often free from malicious intent. It is not penalised, but it is not optimal. Copied content can often be penalised algorithmically or manually. Don’t be ‘spinning’ ‘copied’ text to make it unique!” Shaun Anderson, Hobo 2020

Google says:

QUOTE: “Creating compelling and useful content will likely influence your website more than any of the other factors.” Google, 2017

There is no one particular way to create web pages that successfully rank in Google but you must ensure:

QUOTE: “that your content kind of stands on its own” John Mueller, Google 2015

If you have an optimised platform on which to publish it, high-quality content is the number 1 user experience area to focus on across websites to earn traffic from Google.

If you have been impacted by Google’s content quality algorithms, your focus should be on ‘improving content’ on your site rather than deleting content.


SEO Copywriting

SEO copywriting is the art of writing high-quality content for search engines on-page in copy, in page titles, meta descriptions, SERP snippets and SERP featured snippets. It is not about keyword stuffing text.

Google Has Evolved and Content Marketing With It

QUOTE: “Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly.” Google

Google does not work only the way it used to work, and as a result, this impacts a lot of websites built a certain way to rank high in Google – and Google is a lot less forgiving these days.

Is the user going to be more satisfied with an exact match query on a low-quality website, OR a high-quality page closely related to the search query used, published by an entity Google trusts and rates highly?

Google is deciding more and more to go with the latter.

Optimisation must not get in the way of the main content of a page or negatively impact user experience.

Focus on ‘Things’, Not ‘Strings’

QUOTE: “We’ve been working on an intelligent model—in geek-speak, a “graph”—that understands real-world entities and their relationships to one another: things, not strings” Google 2012

Google is better at working out what a page is about, and what it should be about to satisfy the intent of a searcher, and it isn’t relying only on keyword phrases on a page to do that anymore.

Google has a Knowledge Graph populated with NAMED ENTITIES and in certain circumstances, Google relies on such information to create SERPs.

QUOTE: “The Knowledge Graph enables you to search for things, people or places that Google knows about—landmarks, celebrities, cities, sports teams, buildings, geographical features, movies, celestial objects, works of art and more—and instantly get information that’s relevant to your query. This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more like people do.” Amit Singhal, Google 2012

Google has plenty of options when rewriting the query in a contextual way, based on what you searched for previously, who you are, how you searched and where you are at the time of the search.

On-Page; How Much Text Do You Need To Write For Google?

How much text do you put on a page to rank for a certain keyword?

Well, as in so much of SEO theory and strategy, there is no optimal amount of text per page, and it is going to differ, based on the topic, and content type, and SERP you are competing in.

Instead of thinking about the quantity of the Main Content (MC) text, you should think more about the quality of the content on the page. Optimise this with searcher intent in mind.

There is no minimum amount of words or text to rank in Google. I have seen pages with 50 words out-rank pages with 100, 250, 500 or 1000 words. Then again I have seen pages with no text rank on nothing but inbound links or other ‘strategy’. Google is a lot better at hiding away those pages, though.

At the moment, I prefer long-form pages and a lot of text, still focused on a few related keywords and keyphrases to a page. Useful for long tail key phrases and easier to explore related terms.

Every site is different. Some pages, for example, can get away with 50 words because of a good link profile and the domain it is hosted on. For me, the important thing is to make a page relevant to a user’s search query.

I don’t care how many words I achieve this with and often I need to experiment with a site I am unfamiliar with. After a while, you get an idea how much text you need to use to get a page on a certain domain into Google.

One thing to note – the more text you add to the page, as long as it is unique, keyword rich and relevant to the topic, the more that page will be rewarded with more visitors from Google.

There is no optimal number of words on a page for placement in Google.

Every website – every page – is different from what I can see. Don’t worry too much about word count if your content is original and informative. Google will probably reward you on some level – at some point – if there is lots of unique text on all your pages.

Google said there is no minimum word count when it comes to gauging content quality.

QUOTE: “There’s no minimum length, and there’s no minimum number of articles a day that you have to post, nor even a minimum number of pages on a website. In most cases, quality is better than quantity. Our algorithms explicitly try to find and recommend websites that provide content that’s of high quality, unique, and compelling to users. Don’t fill your site with low-quality content, instead work on making sure that your site is the absolute best of its kind.” John Mueller Google, 2014

However, the quality rater’s guide does state:

6.2 Unsatisfying Amount of Main Content

Some Low quality pages are unsatisfying because they have a small amount of MC for the purpose of the page. For example, imagine an encyclopedia article with just a few paragraphs on a very broad topic such as World War II. Important: An unsatisfying amount of MC is a sufficient reason to give a page a Low quality rating.

On-Page; Do Keywords In Bold Or Italic Help?

QUOTE: “You’ll probably get more out of bolding text for human users / usability in the end. Bots might like, but they’re not going to buy anything.” John Mueller, Google 2017

Some webmasters claim putting your keywords in bold or putting your keywords in italics is a beneficial ranking factor in terms of search engine optimizing a page.

It is essentially impossible to test this, and I think these days, Google could well be using this (and other easy to identify on page optimisation efforts) to determine what to punish a site for, not promote it in SERPs.

Any item you can ‘optimise’ on your page – Google can use this against you to filter you out of results.

I use bold or italics these days specifically for users.

I only use emphasis if it’s natural or this is really what I want to emphasise!

Do not tell Google what to filter you for that easily.

I think Google treats websites they trust far different to others in some respect.

That is, more trusted sites might get treated differently than untrusted sites.

Keep it simple, natural, useful and random.

On-Page; Do You Need Lots of Text To Rank Pages In Google?

QUOTE: “Nobody at Google counts the words on a page. Write for your users.” John Mueller, Google 2019

NO, but:

QUOTE: “You always need textual content on-page, regardless of what other kinds of content you might have. If you’re a video-hosting site, you still need things like titles, headings, text, links, etc. The same goes for audio-hosting sites. Make it easy for search engines to understand your content & how it’s relevant to users, and they’ll be able to send you relevant traffic. If you make it hard for search engines to figure out what your pages are about, it would be normal for them to struggle to figure out how your site is relevant for users.” John Mueller, Google 2019

On-Page; Can I Just Write Naturally and Rank High in Google?

QUOTE: “There’s nothing to optimize for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged.” Danny Sullivan, Google 2019

Yes, you must write naturally (and succinctly), but if you have no idea of the keywords you are targeting, and no expertise in the topic, you will be left behind by those who can access this experience.

You can just ‘write naturally’ and still rank, albeit for fewer keywords than you would have if you optimised the page.

There are too many competing pages targeting the top spots not to optimise your content.

Naturally, how much text you need to write, how much you need to work into it, and where you ultimately rank, is going to depend on the domain reputation of the site you are publishing the article on.

On-Page; Optimising For ‘The Long Click’

When it comes to rating user satisfaction, there are a few theories doing the rounds at the moment that I think are sensible. Google could be tracking user satisfaction by proxy. When a user uses Google to search for something, user behaviour from that point on can be a proxy of the relevance and relative quality of the actual SERP.

What is a Long Click?

A user clicks a result and spends time on it, sometimes terminating the search.

What is a Short Click?

A user clicks a result and bounces back to the SERP, pogo-sticking between other results until a long click is observed. Google has this information if it wants to use it as a proxy for query satisfaction.

For more on this, I recommend this article on the time to long click.

Note: Ranking could be based on a ‘duration metric’

QUOTE: “The average duration metric for the particular group of resources can be a statistical measure computed from a data set of measurements of a length of time that elapses between a time that a given user clicks on a search result included in a search results web page that identifies a resource in the particular group of resources and a time that the given user navigates back to the search results web page. …Thus, the user experience can be improved because search results higher in the presentation order will better match the user’s informational needs.” High Quality Search Results based on Repeat Clicks and Visit Duration. Bill Slawski, Go Fish Digital, 2017

Note: Rankings could be based on a ‘duration performance score’

QUOTE: “The duration performance scores can be used in scoring resources and websites for search operations. The search operations may include scoring resources for search results, prioritizing the indexing of websites, suggesting resources or websites, protecting particular resources or websites from demotions, precluding particular resources or websites from promotions, or other appropriate search operations.” A Panda Patent on Website and Category Visit Durations. Bill Slawski, Go Fish Digital, 2017
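
To make the idea in those two patent quotes concrete, here is a minimal sketch of how an “average duration” style metric could be computed from click logs. The data, the session cap for visits that never return to the SERP, and the whole approach are assumptions for illustration – nobody outside Google knows how, or whether, such a metric is implemented in ranking.

    # Sketch: average time between clicking a result and returning to the SERP,
    # computed from hypothetical (click_time, return_time) pairs in seconds.
    # A missing return_time means the search ended on that page (a "long click").
    click_log = [
        (0, 8),      # bounced back after 8 seconds (a short click)
        (0, 95),     # stayed 95 seconds before returning
        (0, None),   # never came back - terminated the search
    ]

    SESSION_CAP = 600  # assumption: treat "never returned" as a capped long visit
    durations = [(ret - click) if ret is not None else SESSION_CAP
                 for click, ret in click_log]
    average_duration = sum(durations) / len(durations)
    print(round(average_duration, 1))  # 234.3 with these made-up numbers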

Google has many patents that help it determine the quality of a page. Nobody knows for sure which patents are implemented into the core search algorithms.

If you are focused on delivering high quality information, you can avoid the worst of these algorithms.

On-Page; Does Your Page Content Satisfy User Search Intentions?

User search intent is a way marketers describe what a user wants to accomplish when they perform a Google search.

SEOs have understood user search intent to fall broadly into the following categories and there is an excellent post on Moz, 2016 about this.

  1. Transactional – The user wants to do something like buy, signup, register to complete a task they have in mind.
  2. Informational – The user wishes to learn something
  3. Navigational – The user knows where they are going

The Google human quality rater guidelines modify these to simpler constructs:

  • Do 
  • Know
  • Go
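
As a rough illustration of these buckets, the sketch below classifies queries with a few trigger words and a small list of navigational brand names. Real systems are far more sophisticated; the word lists here are assumptions for illustration only.

    # Rough, rule-based illustration of Do / Know / Go query bucketing.
    # The trigger words and brand list below are assumptions, not Google's.
    DO_WORDS = ("buy", "order", "download", "signup", "book")
    KNOW_WORDS = ("what is", "how to", "why", "guide", "best")
    BRANDS = ("hobo seo", "bbc", "facebook")  # known navigational targets

    def classify(query: str) -> str:
        q = query.lower()
        if any(brand in q for brand in BRANDS):
            return "Go (navigational)"
        if any(word in q for word in DO_WORDS):
            return "Do (transactional)"
        if any(word in q for word in KNOW_WORDS):
            return "Know (informational)"
        return "Know (informational)"  # default assumption

    for q in ("buy blue widgets", "what is domain authority", "hobo seo"):
        print(q, "->", classify(q))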

SO – how do you optimise for all this?

You could rely on old-school SEO techniques, but Google doesn’t like thin pages, and you need higher quality unnatural links to power low-quality sites these days. That is all a risky investment.

Google has successfully made that way forward a minefield for smaller businesses.

A safer route with a guaranteed ROI, for a real business who can’t risk spamming Google, is to focus on satisfying user satisfaction signals Google might be rating favourably.

You do this by focusing on meeting exactly the intent of an individual keyword query.

The how, why and where-from of a user’s searches are going to be numerous and ambiguous, and this is an advantage for the page that balances this out better than competing pages in SERPs.

‘Know’

High-quality copywriting is not an easy ‘ask’ for every business, but it is a tremendous leveller.

Anyone can teach what they know and put it on a website if the will is there.

Some understand the ranking benefits of in-depth, curated content, for instance, that helps a user learn something. In-depth pages or long-form content is a magnet for long-tail key phrases.

High-quality text content of any nature is going to do well, in time, and copywriters should rejoice.

Copywriting has never been more important.

Offering high-quality content is a great place to start on your site.

It’s easy for Google to analyse and rate,  and it is also a sufficient barrier to entry for most competitors (at least, it was in the last few years).

Google is looking for high-quality content:

QUOTE: “High quality pages and websites need enough expertise to be authoritative and trustworthy on their topic.”

..or if you want it another way, Google’s algorithms target low-quality content.

But what if you can’t write to satisfy these KNOW satisfaction metrics?

Luckily – you do not need lots of text to rank in Google.

‘Go’

When a user is actively seeking your page out and selects your page in the SERP, they are probably training Google AI to understand this is a page on a site that satisfies the user intent.  This user behaviour is where traditional media and social media promotion is going to be valuable if you can get people to search your site out. This is one reason you should have a short, memorable domain or brand name if you can get one.

So, users should be using Google to seek your site out e.g. “Hobo SEO“.

‘Do’ Beats ‘Know’

If you can’t display E-A-T in your writing, you can still rank if you satisfy users who do search that query.

————-

Last year I observed Google rank a page with 50 words of text on it instead of a page with 5000 words and lots of unique images that target the same term on the same domain.

While there might be something at fault with the ‘optimised’ 5000-word page I have overlooked, the main difference between the two pages was time spent on the page and task completion ‘rate’.

I’ve witnessed Google flip pages on the same domain for many reasons, but it did get me thinking that perhaps Google thinks users are more satisfied with the DO page (an online tool) with better task completion metrics than the KNOW page (a traditional informational page).

In the end, I don’t need to know why Google is flipping the page, just that it is.

So that means that you don’t always need ‘text-heavy content’ to rank for a term.

You never have of course.

Popular pages that pick up links and shares have, and always will, rank high.

I only offer one example where I’ve witnessed Google picking the DO page over the KNOW page, and it surprised me when it did.

It has evidently surprised others too.

There is a post on Searchmetrics that touches on pages with only a little text-content ranking high in Google:

QUOTE: “From a classical SEO perspective, these rankings can hardly be explained. There is only one possible explanation: user intent. If someone is searching for “how to write a sentence” and finds a game such as this, then the user intention is fulfilled. Also the type of content (interactive game) has a well above average time-on-site.” SearchMetrics 2016

That’s exactly what I think I have observed, too, though I wouldn’t ever say there is only ‘one possible explanation‘ to anything to do with Google.

For instance – perhaps other pages on the site help the page with no content rank, but when it comes to users being satisfied, Google shows the page with better usage statistics instead, because it thinks it is a win for everyone involved.

This is speculation, of course, and I have witnessed Google flipping pages in SERPs if they have a problem with one of them, for instance, for years.

This was news in January 2016, but I saw it last year and some time before that. It isn’t entirely ‘new’ in the wild, but it might be more noticeable in more niches.

How Much Text Do You Need To Rank?

None, evidently, if you can satisfy the query in an unusual manner without the text.

On-Page; Optimise For User Intent & Satisfaction

Graph: Traffic performance of a page in Google

QUOTE: “Basically you want to create high-quality sites following our webmaster guidelines, and focus on the user, try to answer the user, try to satisfy the user, and all eyes will follow.” Gary Illyes, Google 2016

When it comes to writing SEO-friendly text for Google, we must optimise for user intent, not simply what a user typed into Google.

Google will send people looking for information on a topic to the highest quality, relevant pages it knows about, often BEFORE it relies on how Google ‘used‘ to work e.g. relying on finding near or exact match instances of a keyword phrase on any one page, regardless of the actual ‘quality’ of that page.

Google is constantly evolving to better understand the context and intent of user behaviour, and it doesn’t mind rewriting the query used to serve high-quality pages to users that more comprehensively deliver on user satisfaction e.g. explore topics and concepts in a unique and satisfying way.

Of course, optimising for user intent, even in this fashion, is something a lot of marketers had been doing long before query rewriting and  Google Hummingbird came along.

QUOTE: “Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.” Danny Sullivan, Google 2013

Long Tail Traffic Is Hiding In Long-Form Content


Google didn’t kill the long tail of traffic, though since 2010 they have been reducing the amount of such traffic they will send to certain sites.

In part, they shifted a lot of long tail visitors to pages that Google thought may satisfy their user query, RATHER than just rely on particular exact and near match keywords repeated on a particular page.

At the same time, Google was hitting old school SEO tactics and particularly thin or overlapping pages. So – an obvious strategy and one I took was to identify the thin content on a site and merge it into long-form content and then rework that to bring it all up-to-date.
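
A simple way to start that exercise is to pull main-content word counts from a crawl and flag the shortest pages as merge candidates for manual review. The crawl data and the 300-word threshold below are assumptions for illustration; word count alone does not equal thin content, so treat the output as a review list, not a verdict.

    # Sketch: flag potentially "thin" pages from a hypothetical crawl export.
    # The 300-word threshold is an arbitrary assumption - always review manually.
    crawl = {
        "/widgets-red": 120,
        "/widgets-blue": 140,
        "/widgets-guide": 2400,
    }  # url -> word count of main content

    THIN_THRESHOLD = 300
    thin_pages = [url for url, words in crawl.items() if words < THIN_THRESHOLD]
    print("Candidates to merge or improve:", thin_pages)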

Long-form content is a magnet for long tail searches and helps you rank for much more popular (head) keywords. The more searchers and visitors you attract, the more you can ‘satisfy’ and the better chance you can rank higher in the long run.

Do you NEED long-form pages to rank?

No – but it can be very useful as a base to start a content marketing strategy if you are looking to pick up links and social media shares.

And be careful. The longer a page is, the more you can dilute it for a specific keyword phrase, and it’s sometimes a challenge to keep it updated.

Google seems to penalise stale or unsatisfying content.

How To SEO In-depth Content

From a strategic point of view, if you can explore a topic or concept in an in-depth way you must do it before your competition. Especially if this is one of the only areas you can compete with them on.

Here are some things to remember about creating topic-oriented in-depth content:

  • In-depth content needs to be kept updated. Every six months, at least. If you can update it a lot more often than that – it should be updated more
  • In-depth content can reach tens of thousands of words, but the aim should always be to make the page as concise as possible, over time
  • In-depth content can be ‘optimised’ in much the same way as content has always been optimised
  • In-depth content can give you authority in your topical niche
  • Pages must MEET THEIR PURPOSE WITHOUT DISTRACTING ADS OR CALLS TO ACTION. If you are competing with an information page – put the information FRONT AND CENTRE. Yes – this impacts negatively on conversions in the short term. BUT – these are the pages Google will rank high. That is – pages that help users first and foremost complete WHY they are on the page (what you want them to do once you get them there needs to be a secondary consideration when it comes to Google organic traffic).
  • You need to balance conversions with user satisfaction unless you don’t want to rank high in Google.

Optimising For Topics And Concepts

Old SEO was, to a large extent, about repeating text. New SEO is about user satisfaction.

Google’s punishment algorithms designed to target SEO are all over that practice these days. And over a lot more, besides.

  • Google’s looking for original text on a subject matter that explores the concept that the page is about, rather than meets keyword relevance standards of yesteryear.
  • If your page rings these Google bells in your favour, Google knows your page is a good resource on anything to do with a particular concept – and will send visitors to it after invisibly rewriting the actual search query that the user made. Google is obfuscating the entire user intent journey.

For us at the receiving end, it all boils down to writing content that meets a specific user intent and does it better than competing pages.

We are not trying to beat Google or RankBrain, just the competition.

Pages looking to, genuinely, help people are a good user experience. At the page level, satisfying informational search intent is still going to be about keyword analysis at some level.

SEO is about understanding topics and concepts as search engines try to.

A well-optimised topic/concept oriented page that meets high relevance signals cannot really fail to pick up search traffic and, if it’s useful to people, pick up UX signals that will improve rankings in Google (I include links in that).

Is Poor Grammar A Google Ranking Factor?

NO – this is evidently NOT a ranking signal. I’ve been blogging for nine years and most complaints I’ve had in that time have been about my poor grammar and spelling in my posts.

My spelling and grammar may be atrocious but these shortcomings haven’t stopped me ranking lots of pages over the years.

Google historically has looked for ‘exact match’ instances of keyword phrases on documents and SEOs have, historically, been able to optimise successfully for these keyword phrases – whether they are grammatically correct or not.

So how could bad grammar carry negative weight in Google’s algorithms?

That being said, I do have Grammarly, a spelling and grammar checking plugin installed on my browser to help me catch the obvious mistakes.

Advice From Google

John Mueller from Google said in a recent hangout that it was ‘not really’ but that it was ‘possible‘ but very ‘niche‘ if at all, that grammar was a positive ranking factor. Bear in mind – most of Google’s algorithms (we think) demote or de-rank content once it is analysed – not necessarily promote it – not unless users prefer it.

Another video I found, of a Google spokesman talking about inadequate grammar as a ranking factor or page quality signal, was from a few years ago.

In this video, we are told, by Google, that grammar is NOT a ranking factor.

Not, at least, one of the 200+ quality signals Google uses to rank pages.

And that rings true, I think.

Google’s Matt Cutts did say though:

QUOTE: “It would be fair to use it as a signal…The more reputable pages do tend to have better grammar and better spelling. ” Matt Cutts, Google

Google Panda & Content Quality

Google is on record as saying (metaphorically speaking) their algorithms are looking for signals of low quality when it comes to rating pages on Content Quality.

Some possible examples could include:

QUOTE: “1. Does this article have spelling, stylistic, or factual errors?”

and

QUOTE: “2. Was the article edited well, or does it appear sloppy or hastily produced?”

and

QUOTE: “3. Are the pages produced with great care and attention to detail vs. less attention to detail?”

and

QUOTE: “4. Would you expect to see this article in a printed magazine, encyclopedia or book?”

Altogether – Google is rating content on overall user experience as it defines and rates it, and bad grammar and spelling equal a poor user experience.

At least on some occasions.

Google aims to ensure organic search engine marketing is a significant investment in time and budget for businesses. Critics will say this is to make Adwords a more attractive proposition.

Google aims to reward quality signals that:

  1. take time to build and
  2. the vast majority of sites will not, or cannot meet without a lot of investment.

NO website in a competitive market gets valuable traffic from Google without a lot of work. Technical work and content curation.

It’s an interesting aside.

Fixing the grammar and spelling on a page can be a time-consuming process.

It’s clearly a machine-readable and detectable – although potentially noisy – signal and Google IS banging on about Primary MAIN Content Quality and User Experience.

Who knows?

Grammar as a ranking factor could be one for the future – but at the moment, I doubt grammar is taken much into account (on an algorithmic level, at least, although users might not like your grammar, and that could have a second-order impact if it causes high abandonment rates, for instance).

Is Spelling A Google Ranking Factor?

Poor spelling has always had the potential to be a NEGATIVE ranking factor in Google. IF the word that is incorrect on the page is unique on the page and of critical importance to the search query.

Although – back in the day – if you wanted to rank for misspellings, you optimised for them – so poor spelling could be a POSITIVE ranking factor, looking back not that long ago.

Now, that kind of optimisation effort is fruitless, with changes to how Google presents these results.

Google will favour “Showing results for” results over presenting SERPs based on a common spelling error.

Testing to see if ‘bad spelling’ is a ranking factor is still easy on a granular level, bad grammar is not so easy to test.

I think Google has better signals to play with than ranking pages on spelling and grammar. It’s not likely to penalise you for the honest mistakes most pages exhibit, especially if you have met more important quality signals – like useful main content.

And I’ve seen clear evidence of pages ranking very well with both bad spelling and bad grammar. My own!

I still have Grammarly installed, though.

Google is policing their SERPs.

Put simply Google’s views on ‘site quality’ and ‘user satisfaction’ do NOT automatically correlate to you getting more traffic.

This endeavour is supposed to be a benchmark – a baseline to start from (when it comes to keywords with financial value).

Everybody, in time, is supposed to hit this baseline to expect to have a chance to rank – and for the short to medium term, this is where the opportunity for those who take it can be found.

If you don’t do it, someone else will, and Google will rank them, in time, above you.

Google has many human quality raters rating your offering, as well as algorithms targeting old-style SEO techniques and engineers specifically looking for sites that do not meet technical guidelines.

How To Do “SEO Copywriting”

Good Content Will Still Need ‘Optimised’

The issue is, original “compelling content” – so easy to create isn’t it(!) – on a site with no links and no audience and no online business authority is as useful as boring, useless content – to Google – and will be treated as such by Google – except for long tail terms (if even).

It usually won’t be found by many people and won’t be READ and won’t be ACTED upon – not without a few good links pointing to the site – NOT if there is any competition for the term.

Generalisations make for excellent link bait and while good, rich content is very important, sayings like ‘content is everything’ are not telling you the whole story.

The fact is – every single site is different, sits in a niche with a different level of competition for every keyword or traffic stream, and needs a strategy to tackle this.

There’s no one size fits all magic button to press to get traffic to a site. Some folk have a lot of domain authority to work with, some know the right people, or have access to an audience already – indeed, all they might need is a copywriter – or indeed, some inspiration for a blog post.

They, however, are in the minority of sites.

Most of the clients I work with have nothing to start with and are in a relatively ‘boring’ niche few reputable blogs write about.

In one respect, Google doesn’t even CARE what content you have on your site (although it’s better these days at hiding this).

Humans do care, of course, so at some point, you will need to produce that content on your pages.

You Can ALWAYS Optimise Content To Perform Better In Google

An SEO can always get more out of content in organic search than any copywriter, but there’s not much more powerful than a copywriter who can lightly optimise a page around a topic, or an expert in a topic that knows how to – continually, over time – optimise a page for high rankings in Google.


If I wanted to rank for “How To Write For Google“? – for instance – in the old days you used to put the key phrase in the normal elements like the Page Title Element and ALT text and then keyword stuffed your text to make sure you repeated “How To Write For Google” enough times in a block of low-quality text.

Using variants and synonyms of this phrase helped to add to the ‘uniqueness’ of the page, of course.

Throwing in any old text would beef the word count up.

Now if I want to rank high in Google for that kind of term – I would still rely on old SEO best practices like a very focused page title – but now the text should explore a topic in a much more informative way.

Writing for Google and meeting the query intent means an SEO copywriter would need to make sure page text included ENTITIES AND CONCEPTS related to the MAIN TOPIC of the page you are writing about and the key phrase you are talking about.

If I wanted a page to rank for this term, I would probably need to explore concepts like Google Hummingbird, Query Substitution, Query Reformation and Semantic Search i.e. I need to explore a topic or concept in full – and as time goes on – more succinctly – than competing pages.
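
One crude way to surface candidate related terms and entities is to count the non-trivial words that recur across competing pages for the target query. The snippet below is only an illustration using made-up text and the standard library; proper topic modelling and entity extraction go well beyond simple word counts.

    # Simple illustration: surface candidate related terms by counting
    # non-trivial words that recur across competing pages (made-up snippets).
    import re
    from collections import Counter

    competitor_text = [
        "Google Hummingbird rewrites queries using semantic search and entities.",
        "Query substitution and query reformulation feed semantic search results.",
        "Hummingbird looks at entities and concepts, not just keywords.",
    ]

    STOPWORDS = {"and", "the", "not", "just", "using", "looks", "feed"}
    words = Counter()
    for text in competitor_text:
        for word in re.findall(r"[a-z]+", text.lower()):
            if word not in STOPWORDS and len(word) > 3:
                words[word] += 1

    print(words.most_common(8))  # candidate related terms to cover on the page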

If you want to rank for a SPECIFIC search term – you can still do it using the same old, well-practised keyword targeting practices. The main page content itself just needs to be high-quality enough to satisfy Google’s quality algorithms in the first place.

This is still a land grab.

How To Improve Your Website Content

This is no longer about repeating keywords. ANYTHING you do to IMPROVE the page is going to be a potential SEO benefit. That could be:

  • creating fresh content
  • removing doorway-type pages
  • cleaning up or removing thin-content on a site
  • adding relevant keywords and key phrases to relevant pages
  • constantly improving pages to keep them relevant
  • fixing poor grammar and spelling mistakes
  • adding synonyms and related key phrases to text
  • reducing keyword stuffing
  • reducing the ratio of duplicated text on your page to unique text
  • removing old outdated links or out-of-date content
  • rewording sentences to take out sales or marketing fluff and focusing more on the USER INTENT (e.g. give them the facts first including pros and cons – for instance – through reviews) and purpose of the page.
  • merging many old stale pages into one, fresh page, which is updated periodically to keep it relevant
  • Conciseness, while still maximising relevance and keyword coverage
  • Improving important keyword phrase prominence throughout your page copy (you can have too much, or too little, and it is going to take testing to find out what the optimal presentation will be)
  • Topic modelling

A great writer can get away with fluff but the rest of us probably should focus on being concise.

Low-quality fluff is easily discounted by Google these days – and can leave a toxic footprint on a website.

How To Get Featured Snippets on Google

QUOTE: “When a user asks a question in Google Search, we might show a search result in a special featured snippet block at the top of the search results page. This featured snippet block includes a summary of the answer, extracted from a webpage, plus a link to the page, the page title and URL” Google 2018

Any content strategy should naturally be focused on creating high-quality content and should also revolve around earning Google FEATURED SNIPPETS, which Google shows only when it wants to – and intermittently – depending on the nature of the query.

Enhanced snippet; Google Answer Box Example

Regarding the above image, where a page on Hobo is promoted to number 1 – I used traditional competitor keyword research and old-school keyword analysis and keyword phrase selection, albeit focused on the opportunity in long-form content, to accomplish that, proving that you can still use this keyword research experience to rank a page.

Despite all the obfuscation, time delay, keyword rewriting, manual rating and selection bias Google goes through to match pages to keyword queries, you still need to optimise a page to rank in a niche, and if you do it sensibly, you unlock a wealth of long-tail traffic over time (a lot of which is as useless as it always was, but which RankBrain might clean up given time).

Note:

  • Google is only going to produce more of these direct answers or answer boxes in future (they have been moving in this direction since 2005).
  • Focusing on triggering these will focus your content creators on creating exactly the type of pages Google wants to rank. “HOW TO” guides and “WHAT IS” guides are IDEAL and the VERY BEST type of content for this exercise.
  • Google is REALLY rewarding these articles – and the search engine is VERY probably going to keep doing so for the foreseeable future.
  • Google Knowledge Graph offers another exciting opportunity – and indicates the next stage in organic search.
  • Google is producing these ANSWER BOXES that can promote a page from anywhere on the front page of Google to number 1.
  • All in-depth content strategy on your site should be focused on this new aspect of Google Optimisation. The bonus is you physically create content that Google is ranking very well even without taking knowledge boxes into consideration.
  • Basically – you are feeding Google EASY ANSWERS to scrape from your page. This all ties together very nicely with organic link building. The MORE ANSWER BOXES you UNLOCK – the more chance you have of ranking number one FOR MORE AND MORE TERMS – and as a result – more and more people see your utilitarian content and as a result – you get social shares and links if people care at all about it.
  • You can end up sharing an Enhanced Snippet (or Google Answer Box, as they were first called by SEOs) with competitors. Sometimes you are featured and sometimes it is a competitor URL that is featured. All you can do in this case is to continue to improve the page until you squeeze your competitor out.

We already know that Google likes ‘tips’ and “how to” and expanded FAQ but this Knowledge Graph ANSWER BOX system provides a real opportunity and is CERTAINLY what any content strategy should be focused around to maximise exposure of your business in organic searches.

Unfortunately, this is a double-edged sword if you take a long-term view. Google is, after all, looking for easy answers so, eventually, it might not need to send visitors to your page.

To be fair, these Google Enhanced Snippets, at the moment, appear complete with a reference link to your page and can positively impact traffic to the page. SO – for the moment – it’s an opportunity to take advantage of.

Focus on Quality To Improve Conversion Rates

However you are trying to satisfy users, many think this is about terminating searches via your site or on your site, or satisfying the ‘long click’.

However you do that in an ethical manner (e.g. not breaking the back button on browsers), the main aim is to satisfy that user somehow.

You used to rank by being a virtual PageRank black hole. Now, you need to think about being a User black hole.

You want a user to click your result in Google, and not need to go back to Google to do the same search that ends with the user pogo-sticking to another result, apparently unsatisfied with your page.

The aim is to convert users into subscribers, returning visitors, sharing partners, paying customers or even just help them along on their way to learn something.

The success I have had in ranking pages and getting more traffic has largely revolved around optimising the technical framework of a site, crawl and indexing efficiency, removal of outdated content, content re-shaping, constant improvement of text content to meet its purpose better, internal links to relevant content, conversion optimisation and getting users to ‘stick around’ – or at least visit where I recommend they visit.

Mostly – I’ve focused on satisfying user intent because Google isn’t going back on that.

You don’t need only to stick to one topic area on a website. That is a myth.

If you create high-quality pieces of informative content on your website page-to-page, you will rank.

The problem is – not many people are polymaths – and this will be reflected in blog posts that end up too thin to satisfy users and, in time, Google – or in e-commerce sites that sell everything but have speciality and experience in little of it.

The only certainty is that, whatever you do, you should keep content high-quality and avoid creating doorway pages.

For some sites, that will mean reducing pages on many topics to a few that can be focused on so that you can start to build authority in that subject area.

Your website is an entity. You are an entity. Explore concepts. Don’t repeat stuff. Be succinct.

You are what keywords are on your pages.

You rank as a result of others rating your writing.

Avoid toxic visitors. A page must meet its purpose well, without manipulation. Do people stay and interact with your page or do they go back to Google and click on other results? A page should be explicit in its purpose and focus on the user.

The number 1 ‘user experience’ signal you can manipulate with low risk is improving content until it is more useful or better presented than is found on competing pages for variously related keyword phrases.

Ranking Factors

QUOTE: “Rankings is a nuanced process and there is over 200 signals.” Maile Ohye, Google 2010

Google has HUNDREDS of ranking factors with signals that can change daily, weekly, monthly or yearly to help it work out where your page ranks in comparison to other competing pages in SERPs.

You will not ever find every ranking factor. Many ranking factors are on-page or on-site and others are off-page or off-site. Some ranking factors are based on where you are, or what you have searched for before.

QUOTE: “Google crawls the web and ranks pages. Where a page ranks in Google is down to how Google rates the page. There are hundreds of ranking signals….” Shaun Anderson, Hobo 2020

I’ve been in online marketing for over 20 years. In that time, a lot has changed. I’ve learned to focus on aspects that offer the greatest return on investment of your labour.

Learn The Basics

QUOTE: “While search engines and technology are always evolving, there are some underlying foundational elements that have remained unchanged from the earliest days….” Search Engine Journal 2020

Here are a few simple tips to begin with:

Can I Fool Google Into Ranking Me Top?

If you are just starting out, don’t think you can fool Google about everything all the time. Google has VERY probably seen your tactics before. So, it’s best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a healthy, satisfying visitor experience. If you are just starting out – you may as well learn how to do it within Google’s Webmaster Guidelines first. Make a decision, early, if you are going to follow Google’s guidelines, or not, and stick to it. Don’t be caught in the middle with an important project. Do not always follow the herd.

If your aim is to deceive visitors from Google, in any way, Google is not your friend. Google is hardly your friend at any rate – but you don’t want it as your enemy. Google will send you lots of free traffic though if you manage to get to the top of search results, so perhaps they are not all that bad.

A lot of techniques that are in the short term effective at boosting a site’s position in Google are against Google’s guidelines. For example, many links that may have once promoted you to the top of Google, may, in fact, today be hurting your site and its ability to rank high in Google. Keyword stuffing might be holding your page back. You must be smart, and cautious, when it comes to building links to your site in a manner that Google *hopefully* won’t have too much trouble with, in the FUTURE. Because they will punish you in the future.

Don’t expect to rank number 1 in any niche for a competitive keyword phrase without a lot of investment and work. Don’t expect results overnight. Expecting too much too fast might get you in trouble with the Google webspam team.

You don’t pay anything to get into Google, Yahoo or Bing natural, or free, listings. It’s common for the major search engines to find your website pretty quickly by themselves within a few days. This is made so much easier if your CMS actually ‘pings’ search engines when you update content (via XML sitemaps or RSS, for instance).

To be listed and rank high in Google and other search engines, you really should consider and mostly abide by search engine rules and official guidelines for inclusion. With experience and a lot of observation, you can learn which rules can be bent, and which tactics are short-term and perhaps, should be avoided.

Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from one page to another page is viewed in Google’s “eyes” as a vote for the page the link points to. The more votes a page gets, the more trusted a page can become, and the higher Google will rank it – in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.

I’ve always thought if you are serious about ranking – do so with ORIGINAL COPY. It’s clear – search engines reward good content it hasn’t found before. It indexes it blisteringly fast, for a start (within a second, if your website isn’t penalised!). So – make sure each of your pages has enough text content you have written specifically for that page – and you won’t need to jump through hoops to get it ranking.

If you have original, quality content on a site, you also have a chance of generating inbound quality links (IBL). If your content is found on other websites, you will find it hard to get links, and it probably will not rank very well as Google favours diversity in its results. If you have original content of sufficient quality on your site, you can then let authority websites – those with online business authority – know about it, and they might link to you – this is called a quality backlink.

Search engines need to understand that ‘a link is a link’ that can be trusted. Links can be designed to be ignored by search engines with the rel nofollow attribute.

Search engines can also find your site by other websites linking to it. You can also submit your site to search engines directly, but I haven’t submitted any site to a search engine in the last ten years – you probably don’t need to do that. If you have a new site, I would immediately register it with Google Search Console these days.

Google and Bing use crawlers (Googlebot and Bingbot) that spider the web looking for new links to find. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages outwith an XML sitemap.

Many think that Google won’t allow new websites to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while then disappears for months. A “honeymoon period” to give you a taste of Google traffic, perhaps, or a period to better gauge your website quality from an actual user perspective.

Google WILL classify your site when it crawls and indexes your site – and this classification can have a DRASTIC effect on your rankings. It’s important for Google to work out WHAT YOUR ULTIMATE INTENT IS – do you want to be classified as a thin affiliate site made ‘just for Google’, a domain holding page or a small business website with a real purpose? Ensure you don’t confuse Google in any way by being explicit with all the signals you can – to show on your website you are a real business, and your INTENT is genuine – and even more important today – FOCUSED ON SATISFYING A VISITOR.

NOTE – If a page exists only to make money from Google’s free traffic – Google calls this spam. I go into this more, later in this guide.

The transparency you provide on your website in text and links about who you are, what you do, and how you’re rated on the web or as a business is one signal that Google could use (algorithmically and manually) to ‘rate’ your website. Note that Google has a HUGE army of quality raters and at some point they will be on your site if you get a lot of traffic from Google.

To rank for specific keyword phrase searches, you usually need to have the keyword phrase or highly relevant words on your page (not necessarily all together, but it helps) or in links pointing to your page/site.

Ultimately what you need to do to compete is largely dependent on what the competition for the term you are targeting is doing. You’ll need to at least mirror how hard they are competing if a better opportunity is hard to spot.

As a result of other quality sites linking to your site, the site now has a certain amount of real PageRank that is shared with all the internal pages that make up your website, and that will in future help provide a signal as to where those pages rank.

Yes, you need to attract links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Google Search Console if you sign up.

Link building is not JUST a numbers game, though. One link from a “trusted authority” site in Google could be all you need to rank high in your niche. Of course, the more “trusted” links you attract, the more Google will trust your site. It is evident you need MULTIPLE trusted links from MULTIPLE trusted websites to get the most from Google.

Try and get links within page text pointing to your site with relevant, or at least, natural looking, keywords in the text link – not, for instance, in blogrolls or site-wide links. Try to ensure the links are not obviously “machine generated” e.g. site-wide links on forums or directories. Get links from pages, that in turn, have a lot of links to them, and you will soon see benefits.

Onsite, consider linking to your other pages by linking to pages within main content text. I usually only do this when it is relevant – often, I’ll link to relevant pages when the keyword is in the title elements of both pages. I don’t go in for auto-generating links at all. Google has penalised sites for using particular auto link plugins, for instance, so I avoid them.

Linking to a page with actual key phrases in the link helps a great deal in all search engines when you want to feature for specific key terms. For example, “keyword phrase” as opposed to a naked URL or “click here”. Beware, though, that Google is punishing manipulative anchor text very aggressively, so be sensible – and stick to brand mentions and plain URL links that build authority with less risk. I do not optimise for grammatically incorrect terms these days (especially with links).

I think anchor text links in internal navigation are still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever, keyword-rich internal link architecture and be sure to understand, for instance, how many words Google counts in a link, but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.

Search engines like Google ‘spider’ or ‘crawl’ your entire site by following all the links on your site to new pages, much as a human would click on the links to your pages. Google will crawl and index your pages, and within a few days usually, begin to return your pages in SERPs.

After a while, Google will know about your pages, and keep the ones it deems ‘useful’ – pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content.

Ideally, you will have unique pages, with unique page titles and unique page meta descriptions. Google does not seem to use the meta description when ranking your page for specific keyword searches if it is not relevant, and unless you are careful you might end up just giving spammers free original text for their site and not yours, once they scrape your descriptions and put the text in the main content on their site. I don’t worry about meta keywords these days as Google and Bing say they either ignore them or use them as spam signals.

Google will take some time to analyse your entire site, examining text content and links. This process is taking longer and longer these days but is ultimately determined by your domain reputation and real PageRank.

If you have a lot of duplicate low-quality text already found by Googlebot on other websites it knows about; Google will ignore your page. If your site or page has spammy signals, Google will penalise it, sooner or later. If you have lots of these pages on your site – Google will ignore most of your website.

You don’t need to keyword stuff your text to beat the competition.

You optimise a page for more traffic by increasing the frequency of the desired key phrase, related key terms, co-occurring keywords and synonyms in links, page titles and text content. There is no ideal amount of text – no magic keyword density. Keyword stuffing is a tricky business, too, these days.

I prefer to make sure I have as many UNIQUE relevant words on the page that make up as many relevant long tail queries as possible.

If you link out to irrelevant sites, Google may ignore the page, too – but again, it depends on the site in question. Who you link to, or HOW you link to, REALLY DOES MATTER – I expect Google to use your linking practices as a potential means by which to classify your site. Affiliate sites, for example, don’t do well in Google these days without some good quality backlinks and higher quality pages.

Many search engine marketers think who you link out to (and who links to you) helps determine a topical community of sites in any field or a hub of authority. Quite simply, you want to be in that hub, at the centre if possible (however unlikely), but at least in it. I like to think of this one as a good thing to remember in the future as search engines get even better at determining topical relevancy of pages, but I have never actually seen any granular ranking benefit (for the page in question) from linking out.

I’ve got by, by thinking external links to other sites should probably be on single pages deeper in your site architecture, with those pages receiving all your Google Juice once it’s been “soaked up” by the higher pages in your site structure (the home page, your category pages). This tactic is old school but I still follow it. I don’t think you need to worry about that too much.

Original content is king and will attract a “natural link growth” – in Google’s opinion. Too many incoming links too fast might devalue your site, but again, I usually err on the safe side – I always aimed for massive diversity in my links – to make them look ‘more natural’. Honestly, I go for natural links full stop, for this website.

Google can devalue whole sites, individual pages, template generated links and individual links if Google deems them “unnecessary” and a ‘poor user experience’.

Google knows who links to you, the “quality” of those links, and whom you link to. These – and other factors – help ultimately determine where a page on your site ranks. To make it more confusing – the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for this term. Once Google has worked out your domain authority – sometimes it seems that the most relevant page on your site Google HAS NO ISSUE with will rank.

Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages and ensuring at least one page is well optimised amongst the rest of your pages for your desired key phrase. Always remember Google does not want to rank ‘thin’ pages in results – any page you want to rank – should have all the things Google is looking for. That’s a lot these days!

It is important you spread all that real ‘PageRank’ – or link equity – to your keyword- and phrase-rich sales pages, while ensuring as much as possible remains with the rest of the site pages, so Google does not ‘demote’ pages into oblivion – or ‘supplemental results’ as we old-timers knew them back in the day. Again – this is slightly old school – but it gets me by, even today.

Consider linking to important pages on your site from your home page, and other important pages on your site.

Focus on RELEVANCE first. Then, focus your marketing efforts and get REPUTABLE. This is the key to ranking ‘legitimately’ in Google.

Google changes its algorithm to mix things up. Google Panda and Google Penguin are two such updates, but the important thing is to understand Google changes its algorithms constantly to control its listings pages (over 600 changes a year we are told).

The art of rank modification is to rank without tripping these algorithms or getting flagged by a human reviewer – and that is tricky!

Focus on improving website download speeds at all times. The web is changing very fast, and a fast website is a good user experience.

Search Engine Submission

QUOTE: “You don’t pay to get into any of the big search engines natural (free or organic) listings. Google has ways of submitting your web pages directly to their index. Most search engines do.” Shaun Anderson, Hobo 2020

Don’t get scammed: you do not need to pay a third party to ‘submit’ your website to the major search engines.

On-Page; Technical SEO

If you are doing a professional audit for a real business, you are going to have to think like a Google Search Quality Rater AND a Google search engineer to provide real long-term value to a client.

When making a site for Google you really need to understand that Google has a long list of things it will mark sites down for, and that’s usually old-school tactics which are now classed as ‘webspam‘.

Conversely, sites that are not marked “low-quality” are not demoted and so will improve in rankings. Sites with higher rankings often pick up more organic links, and this process can float high-quality pages on your site quickly to the top of Google.

So the sensible thing for any webmaster is to NOT give Google ANY reason to DEMOTE a site. Tick all the boxes Google tells you to tick, so to speak.

On-Page; Canonical Link Element

QUOTE: “The canonical link element is extremely powerful and very important to include on your page. Every page on your site should have a canonical link element, even if it is self referencing. It’s an easy way to consolidate ranking signals from multiple versions of the same information. Note: Google will ignore misused canonicals given time.” Shaun Anderson, Hobo 2020
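For illustration, here is a minimal sketch of a self-referencing canonical link element placed in the head of a page (example.com is a placeholder domain, not a recommendation):

<!-- In the <head> of https://www.example.com/seo-tutorial/ -->
<link rel="canonical" href="https://www.example.com/seo-tutorial/">

A duplicate or parameterised version of the same content would point its canonical link element at this preferred URL instead of at itself.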

On-Page; Website Page Speed

  • In short, your website should load as fast as possible!
  • Ideal website load time for mobile sites is 1-2 seconds.
  • 53% of mobile site visits are abandoned if pages take longer than 3 seconds to load.
  • A 2-second delay in load time resulted in abandonment rates of up to 87%.
  • Google itself aims for under half-a-second load time
  • A VERY SLOW SITE can be a NEGATIVE Google Ranking factor.
  • The average load time for mobile sites is 19 seconds over 3G connections. Models predict that publishers whose mobile sites load in 5 seconds earn up to 2x more mobile ad revenue than those whose sites load in 19 seconds.
  • People would not return to websites that took longer than four seconds to load and formed a “negative perception” of a company with a badly put-together site or would tell their family and friends about their experiences.
  • Slow load times are a primary reason visitors abandon a checkout process.
  • In studies, as page load time goes from 1s to 3s, the probability of bounce increases 32%.
  • In studies, as page load time goes from 1s to 5s, the probability of bounce increases 90%.
  • In studies, as page load time goes from 1s to 6s, the probability of bounce increases 106%.
  • In studies, as page load time goes from 1s to 10s, the probability of bounce increases 123%.
  • In a recent study, the average load time for a web page was 3.21s.
  • In a recent study, the average load time for a mobile web page is 22 seconds.

QUOTE: “The mobile version of a website should ideally load in under 3 seconds and the faster the better. A VERY SLOW SITE can be a NEGATIVE Ranking factor. There is no set threshold or speed score to meet, just to make your page as fast as possible.” Shaun Anderson, Hobo 2020

On-Page; Mobile-Friendly, Responsive Web Pages

  • Design for desktop displays from 1024×768 through 1920×1080
  • Design for mobile displays from 360×640 through 414×896
  • Design for tablet displays from 601×962 through 1280×800
  • Check Google Analytics and optimise for your target audience’s most common resolution sizes
  • Do not design for one monitor size or screen resolution. Screen sizes and browser window state vary among visitors.
  • Design should be responsive and fast. Use a liquid or responsive layout that transforms to the current user’s window size.
  • Monitor Google Search Console mobile-friendly and usability alerts

QUOTE: “There’s no best screen size to design for. Websites should transform responsively and fast on multiple resolutions on different browsers and platforms. Mobile-friendly and accessible. Design for your audience, first.” Shaun Anderson, Hobo 2020
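As a small, hedged illustration of the responsive approach (assuming a standard HTML5 template; the class name is a placeholder), a viewport meta tag plus fluid CSS widths is usually the starting point:

<!-- In the <head>: let the layout adapt to the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Fluid container rather than a fixed pixel width */
  .content { max-width: 960px; width: 100%; margin: 0 auto; }
</style>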

On-Page; Page Title Element

  • Be descriptive & relate specifically to the page
  • Avoid keyword stuffing
  • Avoid adding irrelevant keywords to page titles on your site
  • Do not repeat page titles throughout your site
  • Maximise usability across devices with a concise title element of up to 60 characters
  • Avoid long titles. Observations could indicate that you penalise yourself with long titles.
  • Optimise for searcher intent using informational, commercial, navigational and transactional principles
  • Use only one title per page (Google combines multiple titles into one)
  • Add the important keyphrase to the title element once, and preferably in the first 8 words
  • When writing longer title tags ensure the first 50-60 characters include the primary keyword phrase and that this sentence makes perfect sense (because it will be truncated)
  • Use interrogative adverbs, adjectives and pronouns in page titles to form questions (Who, Where, Why, What, How etc)
  • Avoid extraneous boilerplate title elements across a site with many pages (site-wide keyword stuffing)

QUOTE: “To maximise usability across devices, stick to a shorter concise title of between 50-60 characters. Expect Google to count up to 12 words maximum in a title element. There should only be one page title element on a page.” Shaun Anderson, Hobo 2020
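A hypothetical example of a concise page title element, with the primary keyword phrase placed first and the whole thing kept well under 60 characters (the phrase is a placeholder):

<title>How To Write For Google: SEO Copywriting Tips</title>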

On-Page; Alt Tags & ALT Attribute Text

  • Googlebot is blind, too, like 2 million people in the UK
  • Be descriptive and concise as possible. Avoid keyword stuffing for search engines
  • Accessibility practitioners recommend a maximum of 80 characters, so use common sense and be succinct with ALT text
  • JAWS, a popular screen reader, breaks ALT text into 125 character “chunks” for visually impaired users
  • Avoid optimising ALT text “for search engines” as manipulating Alt Attribute text for SEO benefits often leads to accessibility and ranking problems
  • Google will count alternative text found in Image Alt Text when ranking pages and that Alt text is counted as part of the on-page text
  • ALT text is a very light ranking signal compared to other stronger signals, but the text in ALT text is very useful for some longtail searches
  • ALT attribute text is meant to describe images for accessibility purposes, not just search engine optimisation
  • A text equivalent for every image should be provided for visually impaired visitors.
  • Every image on a page should have an ALT attribute. Failure to include ALT tags with images represents a Priority 1 WCAG error and would mean your website would not be able to comply with basic UK DDA and SECTION 508 (in the US) recommendations
  • Use empty alt tags – (alt="") (NULL ALT) – for spacer images or graphic elements on a page used only for design purposes
  • Alt text helps Google understand what an image is about and as Googlebot does not see images directly, Google uses information provided in the “alt” attribute
  • Google will count 16 words maximum as part of image ALT attribute Text (use Longdesc for complex images – although this is deprecated in HTML5)
  • Use ALT attribute text for descriptive text that helps visitors. Alt attribute should be used to describe the image.
  • Don’t repeat text already on the page, making the ALT text redundant or superfluous
  • Don’t use “image of ..” or “graphic of ..” to describe the image in ALT text
  • If the content of any image is already presented in context, (alt="")  – a null value – is appropriate
  • Don’t use ALT Attribute text alongside aria- attribute
  • Google is on record as labeling missing ALT text and Keyword stuffing as “bad”

QUOTE: “Keep ALT Attribute text on web pages to between 80-125 characters maximum for accessibility purposes. Expect Google to count a maximum of 16 words (approximately 125+ characters) as part of your ALT text that will be a signal used to rank your page for a relevant query.” Shaun Anderson, Hobo 2020
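Two hedged examples of the ALT attribute in use – descriptive ALT text for a meaningful image and a null ALT for a purely decorative one (filenames and wording are placeholders):

<!-- Meaningful image: concise, descriptive ALT text -->
<img src="border-collie.jpg" alt="Border collie catching a ball on the beach">

<!-- Decorative spacer or design graphic: null ALT so screen readers skip it -->
<img src="divider.png" alt="">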

On-Page; Meta Keywords Tag

QUOTE: “In all the years I’ve been testing, Google ignores keywords in meta keywords. Google has also confirmed they ignore meta keywords. You can safely ignore optimising meta keywords across a site. I often remove them from a site, too.” Shaun Anderson, Hobo 2020

On-Page; Meta Description Tag

QUOTE: “Create a 50-word (300 character) summary of your page using short sentences. Add it to the article and repeat it in the meta description. Avoid repetition. Be succinct. Do not keyword stuff. Make meta tags unique to the page. Review your SERP snippet. Programmatically generate meta descriptions on larger sites.” Shaun Anderson, Hobo 2020
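A sketch of a unique, succinct meta description for a hypothetical page (the wording is illustrative only):

<meta name="description" content="A plain-English guide to SEO copywriting: how to research keywords, structure a page and write main content that satisfies visitors and Google.">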

On-Page; Robots Meta Tag

QUOTE: “There are various ways you can use a Robots Meta Tag but remember Google by default will index and follow links, so you have no need to include that as a command – you can leave the robots meta out completely – and probably should if you don’t know anything about it.” Shaun Anderson, Hobo 2020
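If you do need to override the default behaviour, a minimal example of the robots meta tag would be the following – used only on pages you want kept out of the index:

<!-- "index, follow" is the default, so this is only needed to override it -->
<meta name="robots" content="noindex, follow">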

On-Page; H1-H6: Page Headings

QUOTE: “Heading tags have influence when it comes to ranking in Google. Though it doesn’t matter to Google, I stick to one ‘H1’ on the page and use headings ‘H2’ and ‘H3’ where appropriate. Keep headings in order. Avoid using headings for design elements. Write naturally with keywords in Headings if relevant. Avoid keyword stuffing.” Shaun Anderson, Hobo 2020
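A minimal sketch of a sensible heading structure on a page (the topic and wording are placeholders):

<h1>How To Write For Google</h1>
<h2>Researching The Query</h2>
<h2>Writing The Main Content</h2>
<h3>Using Related Entities And Synonyms</h3>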

On-Page; Keyword Rich, Search Engine Friendly URLs

QUOTE: “Keywords in URLS are a tiny ranking factor and user-friendly. Keep under 50 characters to display fully on desktop search. Long URLS are truncated. I use them on new sites. I avoid changing URLs on an otherwise perfectly functional website.” Shaun Anderson, Hobo 2020

On-Page; Absolute Or Relative URLs

QUOTE: “My advice would be to keep it consistent whatever you decide to use. Listen to your developers.” Shaun Anderson, Hobo 2020

On-Page; Keyword Density

  • There is no single best keyword density to rank high in Google or Bing.
  • Optimal keyword density differs from page to page, phrase to phrase
  • Write naturally and include the keyword phrase once or twice on-page.
  • Avoid demotion in Google by avoiding repeating keyword phrases in text content.
  • Focus on creating high-quality engaging content instead.

QUOTE: “There is no ‘best’ keyword density. Write naturally and include the keyword phrase once or twice to rank in Google and avoid demotion. If you find you are repeating keyword phrases you are probably keyword stuffing your text.” Shaun Anderson, Hobo 2020

On-Page; Keyword Stuffing (Irrelevant Keywords)

QUOTE: “Keyword stuffing is simply the process of repeating the same keyword or key phrases over and over in a page. It’s counterproductive. It is a signpost of a very low-quality spam site and is something Google clearly recommends you avoid.” Shaun Anderson, Hobo 2020

On-Page; Link Title Attributes, Acronym & ABBR Tags

Does Google Count Text in The Acronym Tag?

From my tests, no. From observing how my test page ranks – Google is ignoring keywords in the acronym tag.

My observations from a test page include:

  • Link Title Attribute – no benefit passed via the link either to another page, it seems. Potentially useful for image search.
  • ABBR (Abbreviation Tags) – No
  • Image File Name – No
  • Wrapping words (or at least numbers) in SCRIPT – Sometimes. Google is better at understanding what it can render.

It’s clear many invisible elements of a page are completely ignored by Google.

Some invisible items are (still) apparently supported:

  • NOFRAMES – Yes
  • NOSCRIPT – Yes
  • ALT Attribute – Yes

Unless you really have cause to focus on any particular invisible element, I think the P (paragraph) tag is the most important tag to optimise!

On-Page; Supplementary Content (SC)

I think this is incredibly important to note, even though it was de-emphasised in more recent versions of the quality rater guidelines:

QUOTE: “With this version we see some interesting changes.  Most noticeably is the de-emphasis of supplementary content, surprising since previous versions have stressed the importance of the additional supplementary content there is on the page – or the negative impact that content has.” Jennifer Slegg, 2016

An example of “supplementary” content is “navigation links that allow users to visit other parts of the website” and “footers” and “headers.”

Also consider your CTA (Call To Actions) on page and WHICH pages on your site you are sending visitors to from the target page. Remember this – low quality pages can negatively impact the rankings of other pages on your site.

These guidelines have been removed from the rater guidelines.

What is SC (supplementary content)?

QUOTE: “Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is controlled by webmasters and is an important part of the user experience. One common type of SC is navigation links that allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page. Sometimes the easiest way to identify SC is to look for the parts of the page that are not MC or Ads.” Google Search Quality Evaluator Guidelines 2019

When it comes to a web page and positive UX, Google talks a lot about the functionality and utility of Helpful Supplementary Content – e.g. helpful navigation links for users (that are not, generally, MC or Ads).

Most of this advice is relevant to the desktop version of your site, and has actually been removed from recent quality rater guidelines but I think this is still worth noting, even with mobile first indexing.

QUOTE: “To summarize, a lack of helpful SC may be a reason for a Low quality rating, depending on the purpose of the page and the type of website. We have different standards for small websites which exist to serve their communities versus large websites with a large volume of webpages and content. For some types of “webpages,” such as PDFs and JPEG files, we expect no SC at all.” Google Search Quality Evaluator Guidelines 2015

It is worth remembering that Good supplementary content cannot save Poor main content from a low-quality page rating:

QUOTE: “Main Content is any part of the page that directly helps the page achieve its purpose“. Google Search Quality Evaluator Guidelines 2020

Good SC seems to certainly be a sensible option. It always has been.

Key Points about SC

  • Supplementary Content can be a large part of what makes a High-quality page very satisfying for its purpose.
  • Helpful SC is content that is specifically targeted to the content and purpose of the page.
  • Smaller websites such as websites for local businesses and community organizations, or personal websites and blogs, may need less SC for their purpose.
  • A page can still receive a High or even Highest rating with no SC at all.

Here are the specific quotes containing the term SC:

  • Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose.
  • SC is created by Webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.
  • SC which contributes to a satisfying user experience on the page and website. – (A mark of a high-quality site – this statement was repeated 5 times)
  • However, we do expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.
  • However, some pages are deliberately designed to shift the user’s attention from the MC to the Ads, monetized links, or SC. In these cases, the MC becomes difficult to read or use, resulting in a poor user experience. These pages should be rated Low.
  • Misleading or potentially deceptive design makes it hard to tell that there’s no answer, making this page a poor user experience.
  • Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these “sneaky redirects.” Sneaky redirects are deceptive and should be rated Lowest.
  • However, you may encounter pages with a large amount of spammed forum discussions or spammed user comments. We’ll consider a comment or forum discussion to be “spammed” if someone posts unrelated comments which are not intended to help other users, but rather to advertise a product or create a link to a website. Frequently these comments are posted by a “bot” rather than a real person. Spammed comments are easy to recognize. They may include Ads, download, or other links, or sometimes just short strings of text unrelated to the topic, such as “Good,” “Hello,” “I’m new here,” “How are you today,” etc. Webmasters should find and remove this content because it is a bad user experience.
  • The modifications make it very difficult to read and are a poor user experience. (Lowest quality MC (copied content with little or no time, effort, expertise, manual curation, or added value for users))
  • Sometimes, the MC of a landing page is helpful for the query, but the page happens to display porn ads or porn links outside the MC, which can be very distracting and potentially provide a poor user experience.
  • The query and the helpfulness of the MC have to be balanced with the user experience of the page.

On-Page; Optimise Supplementary Content on the Page

This is important for desktop but mobile too. Once you have the content, you need to think about supplementary content and secondary links that help users on their journey of discovery.

That content CAN be on links to your own content on other pages, but if you are really helping a user understand a topic – you should be LINKING OUT to other helpful resources e.g. other websites.

A website that does not link out to ANY other website could accurately be interpreted as being, at the least, self-serving. I can’t think of a website that is the true end-point of the web.

  • TASK – On informational pages, LINK OUT to related information pages on other sites AND on other pages on your own website where RELEVANT
  • TASK – For e-commerce pages, ADD RELATED PRODUCTS.
  • TASK – Create In-depth Content Pieces with on-page navigation arrays to named anchors on the page (see the sketch after this list)
  • TASK – Keep Content Up to Date, Minimise Ads, Maximise Conversion, Monitor For broken, or redirected links
  • TASK – Assign in-depth content to an author with some online authority, or someone with displayable expertise on the subject
  • TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all this together in a single topic-centred page, helping a user to understand something related to what you sell.
  • TASK – Do not over-optimise the link-relationship between your YMYL pages and your INFO-type article content.
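A minimal sketch of the on-page navigation to named anchors mentioned in the task list above (section names are placeholders):

<!-- Jump links at the top of a long, in-depth page -->
<nav>
  <a href="#research">Keyword Research</a> |
  <a href="#writing">Writing The Content</a> |
  <a href="#updating">Keeping It Up To Date</a>
</nav>

<h2 id="research">Keyword Research</h2>
<h2 id="writing">Writing The Content</h2>
<h2 id="updating">Keeping It Up To Date</h2>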

On-Page; Main Content (MC)

QUOTE: “(Main CONTENT) is (or should be!) the reason the page exists.” Google Search Quality Evaluator Guidelines 2019

What Is Google Focused On?

Google is concerned with the PURPOSE of a page, the MAIN CONTENT (MC) of a page, the SUPPLEMENTARY CONTENT of a page and HOW THAT PAGE IS monetised, and if that monetisation impacts the user experience of consuming the MAIN CONTENT.

Webmasters need to be careful when optimising a website for CONVERSION first if that gets in the way of a user’s consumption of the main content on the page.

Google also has a “Page Layout Algorithm” that demotes pages with a lot of advertising “above the fold” or that forces users to scroll past advertisements to get to the Main Content of the page.

High-quality supplementary content should “(contribute) to a satisfying user experience on the page and website.” and it should NOT interfere or distract from the MC.

Google says,“(Main CONTENT) is (or should be!) the reason the page exists.” so this is probably the most important part of the page, to Google.

On-Page; Tell Visitors When Content Was Published or Last Updated

QUOTE: “The date should be unambiguous and easy to understand for maximum accessibility, hence why I always include the month in words rather than only numeric values. If I did rely only on numbers for dates I could end up displaying modification dates as 11/11/11.” Shaun Anderson, Hobo 2020

Googlers may have recently indicated perhaps the best way to add dates to your documents:

QUOTE: “DD/MM/YYYY is hard to misunderstand and it’s easy to parse.“. Gary Illyes, Google 2020
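On a plain PHP page (an assumption about your setup – WordPress has its own template functions for this), one minimal way to print an unambiguous ‘last updated’ date with the month in words is:

<!-- Prints, for example: Last updated: 26 January 2020 -->
Last updated: <?php echo date("j F Y", getlastmod()); ?>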

 

On-Page; Keyword Research

QUOTE: “Adding one word to your page can make all the difference between ranking No1 in Google and not ranking at all. Customers won’t find you unless YOU optimise for the phrases THEY use.” Shaun Anderson, Hobo 2020

On-Page; Broken Links

QUOTE: “Broken links are a waste of link power and could hurt your site, drastically in some cases, if a poor user experience is identified by Google. Google is a link based search engine – if your links are broken, you are missing out on the benefit you would get if they were not broken.” Shaun Anderson, Hobo 2020

On-Page; Point Internal Links To Relevant Pages

  • Ensure all pages on your site are linked to from other pages on your site.
  • Keep internal navigation simple, consistent and clear.
  • Do not keyword stuff internal anchor text.
  • Do not use rel=nofollow on internal links.
  • Do not link to low-quality pages on your site or off-site.
  • Fix broken links on your site. Broken links also mess up Pagerank and anchor text flow to pages on your site. Broken links on your site are often a different issue than the 404 errors Google shows in Webmaster tools. When 404s are present on your site, they hurt. When they are just 404s in Webmaster tools and not present on your pages these are less of an issue to worry about.
  • Pages do NOT need to be three clicks away from the landing page, but it is useful, I think, to bear in mind the concept of the ‘3 click rule’ when designing navigation around your site. The simpler the better.
  • There is a benefit to linking to important pages often, but just because a page is linked to a LOT in an internal architecture will not necessarily make the page rank much better, even with more Google Pagerank pumped into it. Relevance algorithms, page quality and site quality algorithms are all designed to float unique or satisfying pages to the top of the SERPs. As a direct result of this observation, I prefer to maximise the contextual value of internal links on smaller sites (rather than just make a page ‘link popular’). I go into ‘contextual value’ below.
  • If you believe in ‘first link priority’, you are going to have to take it into account when creating your main navigation system that appears on every page, and where that sits in the template.
  • Keep anchor text links within the limit of 16 keywords max. Keywords above the 16-word threshold limit seem to ‘evaporate’ in terms of any demonstrable value that I can show they pass.

QUOTE: “We do use internal links to better understand the context of content of your sites” John Mueller, Google 2015

I link to relevant internal pages in my site when necessary.
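For illustration, a contextual internal link in main content with descriptive (not stuffed) anchor text – the URL and wording are placeholders:

<p>Before writing the page, it is worth doing some basic <a href="/keyword-research/">keyword research</a> to find the phrases customers actually use.</p>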

QUOTE: “Internal link building is the age-old art of getting pages crawled and indexed by Google.” Shaun Anderson, Hobo 2020

I expect the full benefit of anchor text to flow from info-to-info-type articles, not info-to-YMYL pages.

QUOTE: “I’d forget everything you read about “link juice.” It’s very likely all obsolete, wrong, and/or misleading. Instead, build a website that works well for your users.” John Mueller, Google 2020

On-Page; Link Out To Related Sites

QUOTE: “I regularly link out to other quality relevant pages on other websites where possible and where a human would find it valuable.” Shaun Anderson, Hobo 2020

On-Page; Nofollow Links

  • Use rel=”sponsored” or rel=”nofollow” for paid links
  • Use rel=”ugc” or rel=”nofollow” for user generated content links
  • Use nofollow on widgets, themes and infographic links
  • Don’t use nofollow on every external link on your website
  • Don’t use nofollow on internal links
  • Link out normally to useful resources without using nofollow

QUOTE: “Google recommends that webmasters qualify outbound links so that these type of links do not artificially affect rankings. Simply: Use rel=”sponsored” or rel=”nofollow” for paid links. Use rel=”ugc” or rel=”nofollow” for user generated content links.” Shaun Anderson, Hobo 2020
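A hedged sketch of how those link attributes look in practice (all URLs are placeholders):

<!-- Paid or affiliate link -->
<a href="https://partner.example.com/product" rel="sponsored">Partner product</a>

<!-- Link added by a visitor in a comment or forum post -->
<a href="https://example.org/" rel="ugc">Commenter's website</a>

<!-- Ordinary editorial link to a useful resource: no nofollow needed -->
<a href="https://www.w3.org/WAI/">Web Accessibility Initiative</a>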

On-Page; Website Accessibility

QUOTE: “It makes sense to create websites that are accessible to as many people as possible…. The W3c lay down the general guidelines for websites that most people adhere to.” Shaun Anderson, Hobo 2020

On-Page; Valid HTML & CSS

QUOTE: “Well, a lot of websites can have invalid code but actually render just fine, because a lot of modern browsers do a good job dealing with bad code. And so, it’s not so much that the code has to be absolutely perfect, but whether or not the page is going to render well for the user in general.” Danny Sullivan, Google 2011

Does Google rank a page higher because of valid code? The short answer is no.

QUOTE: “While there are benefits, Google doesn’t really care if your page is valid HTML and valid CSS. This is clear – check any top ten results in Google and you will probably see that most contain invalid HTML or CSS.” Shaun Anderson, Hobo 2020

On-Page; Which Is Better For Google? PHP, HTML or ASP?

Google doesn’t care. As long as it renders as a browser-compatible document, it appears Google can read it these days. I prefer PHP these days, even with flat documents, as it is easier to add server-side code to that document if I want to add some sort of function to the site. Beware of relying on client-side JavaScript to render pages, as some content served using client-side JavaScript is often not readable by Googlebot.

On-Page; Javascript

QUOTE: “The first important challenge to note about Javascript is that not every search engine treats JS the way Google does. The sensible thing is to keep things as simple as possible for maximum effectiveness.” Shaun Anderson, Hobo 2020

On-Page; Rich Snippets

Rich Snippets and Schema Markup can be intimidating if you are new to them – but important data about your business can actually be very simply added to your site:

QUOTE: “Google Search works hard to understand the content of a page. You can help us by providing explicit clues about the meaning of a page to Google by including structured data on the page. Structured data is a standardized format for providing information about a page and classifying the page content; for example, on a recipe page, what are the ingredients, the cooking time and temperature, the calories, and so on.” Google Developer Guides, 2020
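As a hedged illustration, here is a very small JSON-LD structured data block describing a local business – the organisation details are placeholders and the schema.org type you choose should match your own business:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example SEO Company",
  "url": "https://www.example.com/",
  "telephone": "+44 141 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Glasgow",
    "addressCountry": "GB"
  }
}
</script>

You can check markup like this with Google’s Rich Results Test before deploying it.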

On-Page; Check How Your Page Renders Using GSC Url Inspection Feature

QUOTE: “It is important that what Google (Googlebot) sees is (exactly) what a visitor would see if they visit your site. Blocking Google can sometimes result in a real ranking problem for websites.” Shaun Anderson, Hobo 2020

  • Pro Tip: The fetch and render Url Inspection feature is also a great way to submit your new content to Google for indexing so that the page may quickly appear in Google search results.
  • Pro Tip: Check the actual cache results in Google SERPs to double-check how Google renders your web page.

On-Page; Check How Your Page Renders On Other Browsers

QUOTE: “Your website CAN’T look the same in ALL of these browsers, but if it looks poor in most of the popular browsers, then you might have a problem.” Shaun Anderson, Hobo 2020

On-Site; Redirect Non-WWW To WWW (or Vice Versa)

QUOTE: “This is a MUST HAVE best practice. Basically, you are redirecting all ranking signals to one canonical version of a URL. It keeps it simple when optimising for Google. Do not mix the two types of www/non-www on-site when linking your internal pages.” Shaun Anderson, Hobo 2020

On-Site; 301 Redirects

QUOTE: “301 Redirects are an incredibly important and often overlooked area of search engine optimisation….. You can use 301 redirects to redirect pages, sub-folders or even entire websites and preserve Google rankings that the old page, sub-folder or website(s) enjoyed. Rather than tell Google via a 404, 410 or some other instruction that a page isn’t here anymore, you can permanently redirect an expired page to a relatively similar page to preserve any link equity that old page might have.” Shaun Anderson, Hobo 2020
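Redirects are usually best handled at the web server level, but as a minimal sketch – assuming a plain PHP page you control and a placeholder destination URL – a permanent redirect can be issued like this before any other output is sent:

<?php
// Expired page: permanently (301) redirect visitors and search engines
// to the closest relevant replacement page.
header("Location: https://www.example.com/replacement-page/", true, 301);
exit;
?>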

On-Site; Have A Useful 404 Page

QUOTE: “It is incredibly important to create useful and proper 404 pages. This will help prevent Google recording lots of autogenerated thin pages on your site (both a security risk and a rankings risk).” Shaun Anderson, Hobo 2020

On-Site; XML Sitemap

QUOTE: “An XML Sitemap is a file on your server with which you can help Google easily crawl & index all the pages on your site. This is evidently useful for very large sites that publish lots of new content or update content regularly.” Shaun Anderson, Hobo 2020

On-Site; Robots.txt File

QUOTE: “A robots.txt file is a file on your webserver used to control bots like Googlebot, Google’s web crawler. You can use it to block Google and Bing from crawling parts of your site.” Shaun Anderson, Hobo 2020

On-Site; Website Indexation Challenges

QUOTE: “Make sure Google can crawl your website, index and rank all your primary pages by only serving Googlebot high-quality, user friendly and fast loading pages to index.” Shaun Anderson, Hobo 2020

On-Site; Enough Satisfying Website Information for the Purpose of the Website

QUOTE: “If a page has one of the following characteristics, the Low rating is usually appropriate…..(if) There is an unsatisfying amount of website information for the purpose of the website.” Google Quality Rater Guide, 2017

Google wants evaluators to find out who owns the website and who is responsible for the content on it.

Your website also needs to meet the legal requirements necessary to comply with laws in your country. It’s easy to just incorporate this required information into your footer.

QUOTE: “Companies…. must include certain regulatory information on their websites and in their email footers before 1st January 2007 or they will breach the Companies Act and risk a fine.” Outlaw, 2006

Here’s what you need to know regarding website and email footers to comply with the Companies Act:

———————————-

  • The company name
  • Physical geographic address (A PO Box is unlikely to suffice as a geographic address; but a registered office address would – If the business is a company, the registered office address must be included.)
  • The company’s registration number should be given and, under the Companies Act, the place of registration should be stated
  • Email address of the company (It is not sufficient to include a ‘contact us’ form without also providing an email address and geographic address somewhere easily accessible on the site)
  • The name of the organisation with which the customer is contracting must be given. This might differ from the trading name. Any such difference should be explained
  • If your business has a VAT number, it should be stated even if the website is not being used for e-commerce transactions.
  • Prices on the website must be clear and unambiguous. Also, state whether prices are inclusive of tax and delivery costs.

———————————-

The above information does not need to feature on every page, just on a clearly accessible page. However – with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post) – ANY signal you can send to an algorithm or human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).

Note: If the business is a member of a trade or professional association, membership details, including any registration number, should be provided.

Consider also the Distance Selling Regulations which contain other information requirements for online businesses that sell to consumers (B2C, as opposed to B2B, sales).

On-Page; Dynamic PHP Copyright Notice

QUOTE: “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.” Google Quality Rater Guidelines

If you run a legitimate online business you’ll want to ensure your website never looks obviously out of date. A dynamic copyright notice is an option.

While you are editing your footer – ensure your copyright notice is dynamic and will change year to year – automatically.

It’s simple to display a dynamic date in your footer in WordPress, for instance, so you never need to change your copyright notice on your blog when the year changes.

This little bit of code will display the current year. Just add it in your theme’s footer.php and you can forget about making sure you don’t look stupid, or give the impression your site is out of date and unused, at the beginning of every year.

&copy; Copyright 2000 - <?php echo date("Y") ?>

A simple and elegant PHP copyright notice for WordPress blogs.
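
If you prefer the notice to show a single year until the site is more than a year old, a slightly longer sketch (the 2000 start year is just an example – use your own launch year) can go in footer.php instead:

  <?php
  // Hypothetical launch year - change to the year your site went live
  $start = 2000;
  $current = date("Y");
  echo '&copy; Copyright ' . ($current > $start ? $start . ' - ' . $current : $current);
  ?>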

On-Page; Blogging

QUOTE: “I used it to power visits and sales for my company a bit before it was commonplace. Google (I include Google Feedburner in this) has sent this site almost 8 million unpaid organic visitors since I started blogging. There’s no single way to blog but here’s what I learned and how I did it if you are totally new to blogging.” Shaun Anderson, Hobo 2020

On-Page; How To Manage UGC (“User-Generated Content”)

QUOTE – “So it’s not that our systems will look at your site and say oh this was submitted by a user therefore the site owner has like no no control over what’s happening here but rather we look at it and say well this is your website this is what you want to have indexed you kind of stand for the content that you’re providing there so if you’re providing like low quality user-generated content for indexing then we’ll think well this website is about low quality content and spelling errors don’t necessarily mean that it’s low quality but obviously like it can go in all kinds of weird directions with user-generated content…” John Mueller, Google 2019

It’s evident that Google wants forum administrators to work harder on managing user-generated content that Googlebot ‘rates’ as part of your page and your site.

In a 2015 hangout, John Mueller said to “noindex untrusted post content”, going on to mention posts by new posters who haven’t been in the forum before and threads that don’t have any answers – maybe those should be noindexed by default.

A very interesting statement was “how much quality content do you have compared to low-quality content”. That indicates Google is looking at this ratio. John says to identify “which pages are high-quality, which pages are lower quality, so that the pages that do get indexed are really the high-quality ones”.

John also mentions looking at “threads that don’t have any authoritative answers”.

I think that advice is relevant for any site with lots of UGC.

Google wants you to moderate any user generated content you publish, and will rate the page on it. Moderate comments for quality and apply rel=”nofollow” or rel=”ugc” to links in comments.
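
In HTML terms, a link left in a blog comment would carry one (or both) of those attribute values, for example:

  <!-- a link submitted in a comment, marked as user-generated and not endorsed -->
  <a href="https://example.com/" rel="ugc nofollow">example.com</a>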

On-Page; User-Generated Content Is Rated As Part of the Page in Quality Scoring

QUOTE: “when we look at a page, overall, or when a user looks at a page, we see these comments as part of the content design as well. So it is something that kind of all combines to something that users see, that our algorithms see overall as part of the content that you’re publishing.” John Mueller, Google

Bear in mind that:

QUOTE: “As a publisher, you are responsible for ensuring that all user-generated content on your site or app complies with all applicable programme policies.” Google, 2018

Google wants user-generated content on your site to be moderated and kept as high quality as the rest of your site.

They explain:

QUOTE: “Since spammy user-generated content can pollute Google search results, we recommend you actively monitor and remove this type of spam from your site.” Google, 2018

and

QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules).” John Mueller, Google 2016

and

QUOTE: “When I look at the great forums & online communities that I frequent, one thing they have in common is that they (be it the owners or the regulars) have high expectations, and are willing to take action & be vocal when new users don’t meet those expectations.” John Mueller, Google

On-Page; Moderate Comments

Some examples of spammy user-generated content include:

QUOTE: “Comment spam on blogs” Google, 2018

User-generated content (for instance blog comments) is counted as part of the page, and these comments are taken into consideration when Google rates the page.

QUOTE: “For this reason there are many ways of securing your application and dis-incentivising spammers. 

  • Disallow anonymous posting.
  • Use CAPTCHAs and other methods to prevent automated comment spamming.
  • Turn on comment moderation.
  • Use the “nofollow” attribute for links in the comment field.
  • Disallow hyperlinks in comments.
  • Block comment pages using robots.txt or meta tags.” Google

On-Page; Moderate Forums

QUOTE: “common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google

If you have a forum plugin on your site, moderate:

QUOTE: “Spammy posts on forum threads” Google, 2018

It’s evident that Google wants forum administrators to work harder on managing user-generated content that Googlebot ‘rates’ as part of your site.

In a 2015 hangout, John Mueller said to “noindex untrusted post content”, going on to mention posts by new posters who haven’t been in the forum before and threads that don’t have any answers – maybe those should be noindexed by default.

A very interesting statement was “how much quality content do you have compared to low-quality content”. That indicates Google is looking at this ratio. John says to identify “which pages are high-quality, which pages are lower quality, so that the pages that do get indexed are really the high-quality ones”.

John mentions looking at “threads that don’t have any authoritative answers”.

I think that advice is relevant for any site with lots of content.

On-Page; Moderate ANY User Generated Content

You are responsible for what you publish.

No matter how you let others post something on your website, you must ensure a high standard:

QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules). When I look at the great forums & online communities that I frequent, one thing they have in common is that they (be it the owners or the regulars) have high expectations, and are willing to take action & be vocal when new users don’t meet those expectations.” John Mueller, Google 2016

and

QUOTE: “… it also includes things like the comments, includes the things like the unique and original content that you’re putting out on your site that is being added through user-generated content, all of that as well. So while I don’t really know exactly what our algorithms are looking at specifically with regards to your website, it’s something where sometimes you go through the articles and say well there is some useful information in this article that you’re sharing here, but there’s just lots of other stuff happening on the bottom of these blog posts. When our algorithms look at these pages, in an aggregated way across the whole page, then that’s something where they might say well, this is a lot of content that is unique to this page, but it’s not really high quality content that we want to promote in a very visible way. That’s something where I could imagine that maybe there’s something you could do, otherwise it’s really tricky I guess to look at specific changes you can do when it comes to our quality algorithms.” John Mueller, Google 2016

and

QUOTE: “Well, I think you need to look at the pages in an overall way, you should look at the pages and say, actually we see this a lot in the forums for example, people will say “my text is unique, you can copy and paste it and it’s unique to my website.” But that doesn’t make this website page a high quality page. So things like the overall design, how it comes across, how it looks like an authority, this information that is in general to webpage, to website, that’s things that all come together. But also things like comments where webmasters might say “this is user generated content, I’m not responsible for what people are posting on my website,”  John Mueller, Google 2016

and

QUOTE: “If you have comments on your site, and you just let them run wild, you don’t moderate them, they’re filled with spammers or with people who are kind of just abusing each other for no good reason, then that’s something that might kind of pull down the overall quality of your website where users when they go to those pages might say, well, there’s some good content on top here, but this whole bottom part of the page, this is really trash. I don’t want to be involved with the website that actively encourages this kind of behavior or that actively promotes this kind of content. And that’s something where we might see that on a site level, as well.” John Mueller, Google 2016

and

QUOTE: “When our quality algorithms go to your website, and they see that there’s some good content here on this page, but there’s some really bad or kind of low quality content on the bottom part of the page, then we kind of have to make a judgment call on these pages themselves and say, well, some good, some bad. Is this overwhelmingly bad? Is this overwhelmingly good? Where do we draw the line?” John Mueller, Google 2016

There’s a key insight for many webmasters managing UGC.

Watch out for those who want to use your blog for financial purposes, both in terms of adding content to the site and in linking out to other sites.

QUOTE: “Think about whether or not this is a link that would be on your site if it weren’t for your actions…When it comes to guest blogging it’s a situation where you are placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a link building point of view. It can make sense to guest blog on other people’s sites to drive some traffic to your site… but you should use a nofollow.” John Mueller, Google 2013

On-Page; You Must Moderate User-Generated Content If You Display Google Adsense

QUOTE: “Consider where user-generated content might appear on your site or app, and what risks to your site or app’s reputation might occur from malicious user-generated content. Ensure that you mitigate those risks before enabling user-generated content to appear. Set aside some time to regularly review your top pages with user-generated content. Make sure that what you see complies with all our programme policies.” Google Adsense Policies, 2018

and

QUOTE: “If a post hasn’t been reviewed yet and approved, allow it to appear, but disable ad serving on that page. Only enable ad serving when you’re sure that a post complies with our programme policies.” Google Adsense Policies, 2018

and

QUOTE: “And we do that across the whole website to kind of figure out where we see the quality of this website. And that’s something that could definitely be affecting your website overall in the search results. So if you really work to make sure that these comments are really high quality content, that they bring value, engagement into your pages, then that’s fantastic. That’s something that I think you should definitely make it so that search engines can pick that up on.”  John Mueller, Google 2016

You can also get a Google manual action penalty for user-generated spam on your site, which can affect parts of the site or the whole site.

QUOTE: “Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles. As a result, Google has applied a manual spam action to your site.” Google Notice of Manual Action

Off-Site; Unnatural Links

QUOTE: “Clean up the links you commissioned or made yourself.” Shaun Anderson, Hobo 2020

Off-Site; Link Building

Use content marketing:

  • Research a topic relevant to the subject matter of your website
  • Create a high-quality article on your website including 100% original research on the topic
  • Curate the best up-to-date advice out there on the topic from other experts
  • Cite and link to your sources
  • Publish the article on your website and to your newsletter
  • Optimise your post for featured snippets for primary keyword phrases
  • Automatically syndicate your blog post to Twitter, Facebook and Linkedin to boost social activity
  • Keep it updated and republish when ready next year
  • If you want natural links, create ‘linkable assets’, e.g. pages on your website that are a ‘destination’ users engage with
  • Get links from real sites to build real ‘domain authority’ or rather ‘domain reputation’

QUOTE: “Link building is the process of earning links on other websites. Earned natural links directly improve the reputation of a website and where it ranks in Google. Self-made links are unnatural.  Google penalises unnatural links.” Shaun Anderson, Hobo 2020

Off-Site; Social Media

QUOTE: “While social media links themselves hold little value to Google because they are ‘nofollowed’, the buzz made possible by social media can lead to organic links and other positive ranking signals …” Shaun Anderson, Hobo 2020

How To Get Google To Crawl & Index Your Website

Google Search Console Indexation Report For Website

Make sure Google can crawl your website, index and rank all your primary pages by only serving Googlebot high-quality, user friendly and fast loading pages to index.

Check out the new Indexation Report In Google Search Console 

This is sure to be an invaluable addition to Google Search Console for some larger sites.

If you submit an XML sitemap file in Search Console, Google will help you better understand why certain pages are not indexed.

As you can see, Google goes to great lengths to help you to identify indexation problems on your website, including, in this example:

Status | Reason | Validation | Pages
Error | Submitted URL marked ‘noindex’ | Started | 5
Error | Server errors (5xx) | N/A | 0
Error | Submitted URL blocked by robots.txt | N/A | 0
Error | Submitted URL seems to be a Soft 404 | N/A | 0
Excluded | Excluded by ‘noindex’ tag | N/A | 81,984
Excluded | Page with redirect | N/A | 5,982
Excluded | Duplicate page without canonical tag | N/A | 4,908
Excluded | Crawled – currently not indexed | N/A | 2,082
Excluded | Discovered – currently not indexed | N/A | 1,520
Excluded | Blocked by robots.txt | N/A | 647
Excluded | Alternate page with proper canonical tag | N/A | 201
Excluded | Soft 404 | N/A | 147
Excluded | Submitted URL not selected as canonical | N/A | 34
Valid | Indexed, not submitted in sitemap | N/A | 221,004
Valid | Submitted and indexed | N/A | 2,144

How To Use The Indexation Report In Google Search Console

QUOTE: “Find out which of your pages have been crawled and indexed by Google. In this episode of Search Console Training…. how to use the Index Coverage report to assess its status and identify issues.” GoogleWMC, 2020

Will Google Index Every Page on Your Website?

No.

QUOTE: “We never index all known URLs, that’s pretty normal. I’d focus on making the site awesome and inspiring, then things usually work out better“. John Mueller, 2018

Some URLs are not that important to Google, some are duplicates, some have conflicting indexation instructions and some pages are low-quality or even spammy.

Low-Quality Content On Part of A Web Site Can Affect Rankings For The Same Website On More Important Keyword Rankings

Moz has a good video on the Google organic quality score theory. You should watch it. It goes into a lot of stuff I (and others) have been blogging about for the last few years.

One thing that could have been explained better in the video is that Moz has worldwide topical authority for ‘Google SEO’ terms, which is why they can rank so easily for ‘organic quality score’.

But the explanation of the quality score is a good introduction for beginners.

I am in the camp that believes this organic quality score has been in place for a long time, and more and more sites are feeling the results of it.

This is also quite relevant to a question answered last week in the Google Webmaster Hangout which was:

“QUESTION – Is it possible that if the algorithm doesn’t particularly like our blog articles as much that it could affect our ranking and quality score on the core Content?”

resulting in an answer:

“ANSWER: JOHN MUELLER (GOOGLE): Theoretically, that’s possible. I mean it’s kind of like we look at your web site overall. And if there’s this big chunk of content here or this big chunk kind of important wise of your content, there that looks really iffy, then that kind of reflects across the overall picture of your website. But I don’t know in your case, if it’s really a situation that your blog is really terrible.”

Google has introduced (at least) a ‘perceived’ risk to publishing lots of lower-quality pages on your site, in an effort to curb the production of old-style SEO-friendly content based on manipulating early search engine algorithms.

We are dealing with algorithms designed to target old style SEO – that focus on the truism that DOMAIN ‘REPUTATION’ plus LOTS of PAGES equals LOTS of Keywords equals LOTS of Google traffic.

A big site can’t just get away with publishing LOTS of lower quality content in the cavalier way they used to – not without the ‘fear’ of primary content being impacted and organic search traffic throttled negatively to important pages on the site.

Google is very probably using user metrics in some way to determine the ‘quality’ of your site.

QUESTION – “I mean, would you recommend going back through articles that we posted and if there’s ones that we don’t necessarily think are great articles, that we just take them away and delete them?”

The reply was:

JOHN MUELLER: I think that’s always an option. Yeah. That’s something that – I’ve seen sites do that across the board, not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.

Deleting content is not always the optimal way to handle MANY types of low-quality content – far from it, in fact. Nuking it is the last option unless the pages really are ‘dead‘ content.

Any clean-up should go hand in hand with giving Google something it is going to value on your site, e.g. NEW high-quality content.

The final piece of advice is interesting, too.

It gives us an insight into how Google might actually deal with your site:

JOHN MUELLER: “Then maybe that’s something where you can collect some metrics and say, well, everything that’s below this threshold, we’ll make a decision whether or not to significantly improve it or just get rid of it.”

E.g. (paraphrasing!)

You can probably rely on Google to ‘collect some metrics and say, well, everything that’s below this threshold, we’ll…’ (insert punishment spread out over time).

Google probably has a quality score of some sort, and your site probably has a rating of some kind (and if you get any real traffic from Google, often a manual rating, too).

If you have a big site, certain parts of your site will be rated more useful than others to Google.

Improving the quality of your content certainly works to improve traffic, as does intelligently managing your content across the site. Positive results from this process are NOT going to happen overnight. I’ve blogged about this sort of thing for many years, now.

Google is getting better at rating sites that meet its guidelines for ‘quality’ and ‘user satisfaction’ – I am putting such things in quotes to highlight the slightly Orwellian doublespeak we have to work with.

If you are creating content for blogging purposes, consider the following:

Google says:

QUOTE: “Creating compelling and useful content will likely influence your website more than any of the other factors.” Google, 2017

There is no one particular way to create web pages that successfully rank in Google but you must ensure:

QUOTE: “that your content kind of stands on its own” John Mueller, Google 2015

If you have an optimised platform on which to publish it, high-quality content is the number 1 user experience area to focus on across websites to earn traffic from Google.

If you have been impacted by Google’s content quality algorithms, your focus should be on ‘improving content’ on your site rather than deleting content:

If you want to learn how to optimise your website content then read on.

When You Add Pages To A Website, “you spread the value across the whole set, and all of them get a little bit less.”

This statement from Google is incredibly important:

QUOTE: “Usually you can’t just arbitrarily expand a site and expect the old pages to rank the same, and the new pages to rank as well. All else being the same, if you add pages, you spread the value across the whole set, and all of them get a little bit less. Of course, if you’re building up a business, then more of your pages can attract more attention, and you can sell more things, so usually “all else” doesn’t remain the same. At any rate, I’d build this more around your business needs — if you think you need pages to sell your products/services better, make them.” John Mueller, Google 2018

When you expand the content on a website, you apparently dilute the ‘value’ you already have, “all else being the same“. This is a good statement to have. I have long thought this was the direction we were heading in.

This certainly seems to be an ‘answer‘ to ‘domain authority‘ abuse and ‘doorway page‘ abuse. It is also going to make webmasters think twice about the type of “SEO friendly content” they publish.

If your pages are low-quality and you add more pages of a similar quality to your site, then the overall quality value Google assigns to your website is lowered still further. That value is probably based on content relevance and E.A.T. (high-quality backlinks and expertise), and is generally a measure of how well you deliver for Google visitors.

We already know that low-quality content on one part of a website can impact rankings for other keywords on other (even high-quality) pages on the same website. I go into this later.

When making ‘SEO-friendly’ content you need to ask yourself:

  • Is this new ‘SEO-friendly’ content going to pick up links? Do it.
  • Is this new content going to be useful for your current website visitors? Do it, carefully.
  • Is this new content autogenerated across many pages? Think twice. Expect value to be diluted in some way.
  • Does the site already have lots of low-quality content? Deal with that, first, by improving it. Get rid of stale and irrelevant content.
  • Do I have the keywords I want to rank for on the site? Get those on pages you already have without keyword stuffing and prepare to make a high-quality site PLUS high-quality content (as these are 2 different factors) to target specific keyword phrases.

I’ve advised clients for a long time it is incredibly worth investing in some higher-quality long-form in-depth content for a few pages on a site. It is a great way to get new links AND to add keywords to your site in a way that is useful to Google algorithms and human visitors and without the risk of adding low-quality pages to your site.

This concept is a bit like a leaky or reversed version of Pagerank applied ON-SITE. In the original patent, I believe pages did not ‘lose’ PR (in a general sense), they ‘donated’ PR to other pages in the ‘set’. More pages used to result in more PR. If we think ‘value’ is the new PR score, then, the way Google now measures your site ‘score’, the more pages you add to your website (all else remaining the same), the lower that quality score goes.

You also need to ask yourself WHY Google would rank 10,000 or 100,000 of your pages for free, just because you tweaked a few keywords? Herein lies the answer to many ranking and indexing problems.

TAKEAWAY: Don’t add lots of pages to your site where the single purpose is to meet known Google keyword relevance signals. You may get de-valued for it. Do add content to your site that has a clear purpose for your users.

Is Your Content ‘useful for Google users‘?

QUOTE: “Our Webmaster Guidelines advise you to create websites with original content that adds value for users.” Google, 2018

Content quality is one area to focus on if you are to avoid demotion in Google.

QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2016

This article aims to cover the most significant challenges of writing ‘SEO-friendly’ text and web page copy for Google. High-quality content is one aspect of a high-quality page on a high-quality site.

SEO is NO LONGER about adding keywords to pages with 300 words of text. In fact, that practice can be toxic to a site.

Your content needs to be useful to Google users.

If you run an affiliate website or have content that appears on other sites, this is even more important.

QUOTE: “This is particularly important for sites that participate in affiliate programs. Typically, affiliate websites feature product descriptions that appear on sites across that affiliate network. As a result, sites featuring mostly content from affiliate networks can suffer in Google’s search rankings, because they do not have enough added value content that differentiates them from other sites on the web.” Google, 2018

and

QUOTE: “Google believes that pure, or “thin,” affiliate websites do not provide additional value for web users, especially (but not only) if they are part of a program that distributes its content across a network of affiliates. These sites often appear to be cookie-cutter sites or templates the same or similar content replicated within the same site, or across multiple domains or languages. Because a search results page could return several of these sites, all with the same content, thin affiliates create a frustrating user experience.”  Google, 2018

An example of a ‘thin-affiliate‘ site is a site where “product descriptions and reviews are copied directly from the original merchant without any original content or added value” and “where the majority of the site is made for affiliation and contains a limited amount of original content or added value for users”.

Google says that “Good affiliates add value, for example by offering original product reviews, ratings, navigation of products or categories, and product comparisons“.

Google offers us the following advice when dealing with sites with low-value content:

  • QUOTE: “Affiliate program content should form only a minor part of the content of your site if the content adds no additional features.” Google, 2018

  • QUOTE: “Ask yourself why a user would want to visit your site first rather than visiting the original merchant directly. Make sure your site adds substantial value beyond simply republishing content available from the original merchant.” Google, 2018

  • QUOTE: “The more targeted the affiliate program is to your site’s content, the more value it will add and the more likely you will be to rank better in Google search results.”Google, 2018

  • QUOTE: “Keep your content updated and relevant. Fresh, on-topic information increases the likelihood that your content will be crawled by Googlebot and clicked on by users.”Google, 2018

 

Google Demotion Algorithms Target Low-Quality Content

Optimising (without improving) low-quality content springs traps set by ever-improving core quality algorithms.

What this means is that ‘optimising’ low-quality pages is very much swimming upstream.

Optimising low-quality pages without value-add is self-defeating, now that the algorithms – and manual quality rating efforts –  have got that stuff nailed down.

If you optimise low-quality pages using old school SEO techniques, you will be hit with a low-quality algorithm (like the Quality Update or Google Panda).

You must avoid boilerplate text, spun text or duplicate content when creating pages – or you are Panda Bamboo – as Google hinted at in the 2015 Quality Rater’s Guide.

QUOTE: “6.1 Low Quality Main Content One of the most important considerations in PQ rating is the quality of the MC. The quality of the MC is determined by how much time, effort, expertise, and talent/skill have gone into the creation of the page. Consider this example: Most students have to write papers for high school or college. Many students take shortcuts to save time and effort by doing one or more of the following:

  • Buying papers online or getting someone else to write for them
  • Making things up.
  • Writing quickly with no drafts or editing.
  • Filling the report with large pictures or other distracting content.
  • Copying the entire report from an encyclopedia, or paraphrasing content by changing words or sentence structure here and there.
  • Using commonly known facts, for example, “Argentina is a country. People live in Argentina. Argentina has borders. Some people like Argentina.”
  • Using a lot of words to communicate only basic ideas or facts, for example, “Pandas eat bamboo. Pandas eat a lot of bamboo. It’s the best food for a Panda bear.

Unfortunately, the content of some webpages is similarly created. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well. Important: Low quality MC is a sufficient reason to give a page a Low quality rating.”

Google rewards uniqueness or punishes the lack of it.

The number 1 way to do ‘SEO copywriting‘ will be to edit the actual page copy to continually add unique content and improve its accuracy, uniqueness, relevance, succinctness, and use.

Low-Quality Content Is Not Meant To Rank High in Google.

Page Quality Score From Google Search Quality Rater's Guidelines

A Google spokesman said not that long ago that Google Panda was about preventing types of sites that shouldn’t rank for particular keywords from ranking for them.

QUOTE: “(Google Panda) measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages. So essentially, if you want a blunt answer, it will not devalue, it will actually demote. Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.”  Gary Illyes – Search Engine Land, 2017

When Google demotes your page for duplicate content practices, and there’s nothing left in the way of unique content to continue ranking you for – your web pages will mostly be ignored by Google.

The way I look at it – once Google strips away all the stuff it ignores (duplicate text) – what’s left? In effect, that’s what you can expect Google to reward you for. If what is left is boilerplate synonymised text content – that’s now being classed as web spam – or ‘spinning’.

NOTE – The ratio of duplicate content on any page is going to hurt you if you have more duplicate text than unique content. A simple check of the pages, page to page, on the site is all that’s needed to ensure each page is DIFFERENT (regarding text) page-to-page.

If you have large sections of duplicate text page-to-page – that is a problem that should be targeted and removed.

It is important to note:

  1. The main text content on the page must be unique to avoid Google’s page quality algorithms.
  2. Verbose text must NOT be created or spun automatically
  3. Text should NOT be optimised to a template as this just creates a footprint across many pages that can be interpreted as redundant or manipulative boilerplate text.
  4. Text should be HIGHLY descriptive, unique and concise
  5. If you have a lot of pages to address, the main priority is to create a UNIQUE couple of paragraphs of text, at least, for the MC (Main Content). Pages do not need thousands of words to rank. They just need to MEET A SPECIFIC USER INTENT and not TRIP ‘LOW-QUALITY’ FILTERS. A page with just a few sentences of unique text still meets this requirement (150-300 words) – for now.
  6. When it comes to out-competing competitor pages, you are going to have to look at what the top competing page is doing when it comes to main content text. Chances are – they have some unique text on the page. If they rank with duplicate text, either their SUPPLEMENTARY CONTENT is better, or the competitor domain has more RANKING ABILITY because of either GOOD BACKLINKS or BETTER USER EXPERIENCE.
  7. Updating content on a site should be a priority as Google rewards fresher content for certain searches.

 

Help Google Help You Index More Pages

This is no longer about repeating keywords. ANYTHING you do to IMPROVE the page is going to be a potential SEO benefit. That could be:

  • creating fresh content
  • removing doorway-type pages
  • cleaning up or removing thin-content on a site
  • adding relevant keywords and key phrases to relevant pages
  • constantly improving pages to keep them relevant
  • fixing poor grammar and spelling mistakes
  • adding synonyms and related key phrases to text
  • reducing keyword stuffing
  • reducing the ratio of duplicated text on your page to unique text
  • removing old outdated links or out-of-date content
  • rewording sentences to take out sales or marketing fluff and focusing more on the USER INTENT (e.g. give them the facts first including pros and cons – for instance – through reviews) and purpose of the page.
  • merging many old stale pages into one, fresh page, which is updated periodically to keep it relevant
  • Conciseness, while still maximising relevance and keyword coverage
  • Improving important keyword phrase prominence throughout your page copy (you can have too much, or too little, and it is going to take testing to find out what is the optimal presentation will be)
  • Topic modelling

A great writer can get away with fluff but the rest of us probably should focus on being concise.

Low-quality fluff is easily discounted by Google these days – and can leave a toxic footprint on a website.

Minimise the production of doorway-type pages you produce on your site

You will need another content strategy. If you are forced to employ these types of pages, you need to do it in a better way.

QUOTE: “if you have a handful of locations and you have unique valuable content to provide for those individual locations I think providing that on your website is perfectly fine if you have hundreds of locations then putting out separate landing pages for every city or every region is almost more like creating a bunch of doorway pages so that’s something I really discourage” John Mueller Google 2017

Are you making ‘doorway pages’ and don’t even know it?

QUOTE: “Google algorithms consistently target sites with doorway pages in quality algorithm updates. The definition of a “doorway page” can change over time.” Shaun Anderson, Hobo 2020

Minimise the production of thin pages you produce on your site

You will need to check how sloppy your CMS is. Make sure it does not inadvertently produce pages with little to no unique content on them (especially if you have ads on them).

QUOTE: “John says to avoid lots of “just automatically generated” pages and if these are pages that are not automatically generated, then I wouldn’t see that as web spam.” Conversely then “automatically generated” content = web spam? It is evident Googlebot expects to see a well formed 404 if no page exists at a url.” Shaun Anderson, Hobo 2020

Create Useful 404 Pages

I mentioned previously that Google gives clear advice on creating useful 404 pages:

  1. Tell visitors clearly that the page they’re looking for can’t be found
  2. Use language that is friendly and inviting
  3. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site.
  4. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page.
  5. Think about providing a way for users to report a broken link.
  6. Make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested

It is incredibly important to create useful and proper 404 pages. This will help prevent Google recording lots of autogenerated thin pages on your site (both a security risk and a rankings risk).

I sometimes use 410 responses for expired content that is never coming back.
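
As a minimal sketch of guideline 6 in the list above (assuming a PHP-based site where a custom error template handles missing URLs, for example via an Apache ErrorDocument directive), the key point is that the template sends a real 404 status – or a 410 for content that is gone for good – before rendering the friendly page:

  <?php
  // 404.php - assumed custom error template
  http_response_code(404); // use 410 instead for content that is never coming back
  ?>
  <!DOCTYPE html>
  <html>
  <head><title>Page not found</title></head>
  <body>
    <h1>Sorry, we can't find that page</h1>
    <p>Try the <a href="/">home page</a>, or use the site search to find what you were looking for.</p>
  </body>
  </html>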

A poor 404 page and user interaction with it, can only lead to a ‘poor user experience’ signal at Google’s end, for a number of reasons. I will highlight a poor 404 page in my audits and actually programmatically look for signs of this issue when I scan a site. I don’t know if Google looks at your site that way to rate it e.g. algorithmically determines if you have a good 404 page – or if it is a UX factor, something to be taken into consideration further down the line – or purely to get you thinking about 404 pages (in general) to help prevent Google wasting resources indexing crud pages and presenting poor results to searchers. I think rather that any rating would be a second order scoring including data from user activity on the SERPs – stuff we as SEO can’t see.

At any rate – I don’t need to know why we need to do something, exactly, if it is in black and white like:

QUOTE: “Create useful 404 pages” Google, 2018

and

QUOTE:  “Tell visitors clearly that the page they’re looking for can’t be found. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page. Think about providing a way for users to report a broken link. No matter how beautiful and useful your custom 404 page, you probably don’t want it to appear in Google search results. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.” Google, 2018

… all that needs doing is to follow the guideline exactly as Google tells you to do it.

Ratings for Pages with Error Messages or No MC

Google doesn’t want to index pages without a specific purpose or sufficient main content. A good 404 page and proper setup prevents a lot of this from happening in the first place.

QUOTE: “Some pages load with content created by the webmaster, but have an error message or are missing MC. Pages may lack MC for various reasons. Sometimes, the page is “broken” and the content does not load properly or at all. Sometimes, the content is no longer available and the page displays an error message with this information. Many websites have a few “broken” or non-functioning pages. This is normal, and those individual non-functioning or broken pages on an otherwise maintained site should be rated Low quality. This is true even if other pages on the website are overall High or Highest quality.” Google

Does Google programmatically look at 404 pages?

We are told no in a recent hangout – but in the Quality Raters Guidelines, “Users probably care a lot”.

Do 404 Errors in Search Console Hurt My Rankings?

QUOTE: “404 errors on invalid URLs do not harm your site’s indexing or ranking in any way.” John Mueller, Google

It appears this isn’t a one-size-fits-all answer. If you properly deal with mishandled 404 errors that have some link equity, you reconnect equity that was once lost – and this ‘backlink reclamation’ evidently has value.

The issue here is that Google includes a lot of noise in that Crawl Errors report, which makes it unwieldy and not very user-friendly.

A lot of broken links Google tells you about can often be totally irrelevant and legacy issues. Google could make it instantly more valuable by telling us which 404s are linked to from only external websites.

Fortunately, you can find your own broken links on-site using the myriad of SEO tools available.

I also prefer to use Analytics to look for broken backlinks on a site with some history of migrations, for instance.

John has clarified some of this before, although he is talking specifically (I think) about errors found by Google Search Console:

  1. In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How do you tell? Double-check the origin of the crawl error. If there’s a broken link on your site, in your page’s static HTML, then that’s always worth fixing
  2. What about the funky URLs that are “clearly broken?” When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those “URLs” and find a 404, that’s great and expected. We just don’t want to miss anything important

If you are making websites and want them to rank, the Quality Raters Guidelines document is a great guide for Webmasters to avoid low-quality ratings and potentially avoid punishment algorithms.

Block your internal search function on your site

QUOTE: “Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date.” Google, 2017

This will prevent the automatic creation of thin pages and could help prevent against negative SEO attacks, too.

QUOTE: “A robots.txt file is a file on your webserver used to control bots like Googlebot, Google’s web crawler. You can use it to block Google and Bing from crawling parts of your site.” Shaun Anderson, Hobo 2020

CAVEAT: Unless the site is actually built around an extremely well-designed architecture. I’ve never come across this myself, but I have seen some very well-organised big brands do it. If it is not broken, do not fix it.
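
A minimal robots.txt sketch for blocking internal search results (the /search/ path and the ?s= parameter are assumptions – use whatever URL pattern your own CMS generates for internal search):

  User-agent: *
  Disallow: /search/
  Disallow: /*?s=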

Use canonicals properly

QUOTE: “If your site contains multiple pages with largely identical content, there are a number of ways you can indicate your preferred URL to Google. (This is called “canonicalization”.)” Google, 2007

This will help consolidate signals in the correct pages.

QUOTE: “The canonical link element is extremely powerful and very important to include on your page. Every page on your site should have a canonical link element, even if it is self referencing. It’s an easy way to consolidate ranking signals from multiple versions of the same information. Note: Google will ignore misused canonicals given time.” Shaun Anderson, Hobo 2020
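
The element itself sits in the head of every page; a self-referencing example (the URL is a placeholder):

  <link rel="canonical" href="https://www.example.com/seo-tutorial/" />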

Use proper pagination control on paginated sets of pages

This will help with duplicate content issues.

QUOTE: “Use rel="next" and rel="prev" links to indicate the relationship between component URLs. This markup provides a strong hint to Google that you would like us to treat these pages as a logical sequence, thus consolidating their linking properties and usually sending searchers to the first page.” Google

What you do to handle paginated content will depend on your circumstances.

QUOTE: “Paginated pages are not duplicate content, but often, it would be more beneficial to the user to land on the first page of the sequence. Folding pages in a sequence and presenting a canonical URL for a group of pages has numerous benefits.” Shaun Anderson, Hobo 2020
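
The rel markup Google describes above goes in the head of each page in the series. As a sketch with placeholder URLs, page 2 of a paginated category might declare:

  <link rel="prev" href="https://www.example.com/category/page/1/" />
  <link rel="next" href="https://www.example.com/category/page/3/" />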

Use proper indexation control on pages

Some pages on your site may require a meta noindex on them.

Identify your primary content assets and improve them instead of optimising low-quality pages (which will get slapped in a future algorithm update).

QUOTE: “Meta tags, when used properly can still be useful in a number of areas outside just ranking pages e.g. to improve click-through rates from the SERP. Abuse them, and you might fall foul of Google’s punitive quality algorithms.” Shaun Anderson, Hobo 2020
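
Where a page genuinely needs to stay out of the index, the meta robots element in the head of that page does the job (the page must remain crawlable, otherwise Google never sees the directive):

  <meta name="robots" content="noindex, follow" />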

How To Deal With Search Console Indexation Report Errors

How to deal with “Submitted URL marked ‘noindex’” and “Excluded by ‘noindex’ tag” notifications in Search console

Why are you creating pages and asking Google to noindex them? There is always a better way than to noindex pages. Review the pages you are making and check they comply with Google guidelines e.g. are not doorway pages. Check if technically there is a better way to handle noindexed pages.

How to handle “Page with redirect” notifications in Search console

Why do you have URLs in your sitemap that are redirects? This is not ideal. Review and remove the redirects from the sitemap.

What does “Indexed, not submitted in sitemap” mean in Search Console?

It means Google has crawled your site and found more pages than you have in your sitemap. Depending on the number of pages indicated, this could be a non-issue or a critical issue. Make sure you know the type of pages you are attempting to get indexed and the page types your CMS produces.

How to deal with “Duplicate page without canonical tag”,  “Alternate page with proper canonical tag” and “Submitted URL not selected as canonical” notifications in Search console

Review how you use canonical link elements throughout the site.

How To Deal with “Crawl anomaly” notifications in search console:

QUOTE: “An unspecified anomaly occurred when fetching this URL. This could mean a 4xx- or 5xx-level response code; try fetching the page using Fetch as Google to see if it encounters any fetch issues. The page was not indexed.” Google, 2018

How To Deal With Crawled – currently not indexed:

QUOTE: “The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling.” Google, 2018

These could be problematic. You should check to see if pages you want indexed are included in this list of URLs. If they are, this could be indicative of a page quality issue.

Read this official article for a full list of new features in the Google Search Console Indexation Report.

On-Site; Do I Need A Google XML Sitemap For My Website?

QUOTE: “(The XML Sitemap protocol) has wide adoption, including support from Google, Yahoo!, and Microsoft.” Sitemaps.org

No. You do NOT, technically, need an XML Sitemap to optimise a site for Google if you have a sensible navigation system that Google can crawl and index easily.

QUOTE: “Some pages we crawl every day. Other pages, every couple of months.” John Mueller, Google

Some pages are more important than others to Googlebot.

HOWEVER you should have a Content Management System that produces one as a best practice – and you should submit that sitemap to Google in Google Search Console. Again – best practice.

Google has said very recently XML and RSS are still a very useful discovery method for them to pick out recently updated content on your site.

QUOTE: “All formats limit a single sitemap to 50MB (uncompressed) and 50,000 URLs. If you have a larger file or more URLs, you will have to break your list into multiple sitemaps.” Google Webmaster Guidelines 2020

An XML Sitemap is a file on your server with which you can help Google easily crawl & index all the pages on your site. This is evidently useful for very large sites that publish lots of new content or updates content regularly.

Your web pages will still get into search results without an XML sitemap if Google can find them by crawling your website. To help with that:

  1. Make sure all your pages link to at least one other page on your site
  2. Link to your important pages often, with varying anchor text, in the navigation and in page text content (if you want best results)

Remember – Google needs links to find all the pages on your site, and links spread Pagerank, that help pages rank – so an XML sitemap is never a substitute for a great website architecture.

QUOTE: “Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.” Sitemaps.org, 2020
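
A minimal sitemap along those lines might look like this (the URLs and dates are placeholders); as covered further below, the last modification date is the attribute Google actually pays attention to:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2020-01-15</lastmod>
    </url>
    <url>
      <loc>https://www.example.com/seo-tutorial/</loc>
      <lastmod>2020-01-10</lastmod>
    </url>
  </urlset>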

Most modern CMSs auto-generate XML sitemaps, and Google does ask you to submit a sitemap in Google Search Console – I do these days.

I prefer to manually define my important pages by links and depth of content, but an XML sitemap is a best practice for most sites.

QUOTE: “We support 50 megabytes for a sitemap file, but not everyone else supports 50 megabytes. Therefore, we currently just recommend sticking to the 10 megabyte limit,” John Mueller, Google

Google wants to know when primary page content is updated, not when supplementary page content is modified – if the content significantly changes, that’s relevant. If the primary content doesn’t change, then I wouldn’t update the last modification date.

Does Google crawl an XML sitemap and does it crawl the entire sitemap once it starts?

A question was asked in a recent Google Hangout by someone with a website indexation problem:

QUOTE: “How often does Google crawl an XML sitemap and does it crawl the entire sitemap once it starts?”

An XML sitemap is inclusive, not exclusive.

QUOTE: “sitemap files do help us to better understand a website and to better figure out which parts of website need to be recrawled so specifically if you have information in like the last modification date that really helps us to figure out which of these pages are new or have changed that need to be recrawled.” John Mueller Google, 2017

There will be URLs on your site that are not in the XML sitemap that Google will crawl and index. There are URLs in your XML sitemap that Google will probably crawl and not index.

QUOTE: “if you’re looking at the sitemap files in search console you have information on how many URLs are indexed from those sitemap files the important part there is that we look at exactly the URL that you list in the sitemap file so if we index the URL with a different parameter or with different upper or lower case or a slash at the end or not then all of that matters for for that segment file so that that might be an issue to kind of look out there” John Mueller 2017

and

QUOTE: “in the sitemap file we primarily focus on the last modification date so that’s that’s what we’re looking for there that’s where we see that we’ve crawled this page two days ago and today it has changed therefore we should recrawl it today we don’t use priority and we don’t use change frequency in the sitemap file at least at the moment with regards to crawling so I wouldn’t focus too much on priority and change frequency but really on the more factual last modification date information an RSS feed is also a good idea with RSS you can use pubsubhubbub which is a way of getting your updates even faster to Google so using pubsubhubbub is probably the fastest way to to get content where you’re regularly changing things on your site and you want to get that into Google as quickly as possible an RSS feed with pubsubhubbub is is a really fantastic way to get that done.” John Mueller Google 2017

and

QUOTE: “so a [XML] sitemap file helps us to understand which URLs on your website are new or have recently changed so in the second file you can specify a last modification date and with that we can kind of judge as we need to crawl next to make sure that we’re not like behind in keeping up with your website’s indexing so if you have an existing website and you submit a sitemap file and the sitemap file has realistic change dates on there then in an ideal case we would look at that and say oh we know about most of these URLs and here are a handful of URLs that we don’t know about so we’ll go off and crawl those URLs it’s not the case that submitting a sitemap file will replace our normal crawling it essentially just adds to the existing crawling that we do“. John Mueller 2017

Can I split my sitemap file into separate smaller files? Yes.

QUOTE: “Another thing that sometimes helps is to split the sitemap files up into separate chunks of logical chunks for your website so that you understand more where pages are not being indexed and then you can see are the products not being indexed or the categories not being indexed and then you can drill down more and more and figure out where where there might be problems that said we don’t guarantee indexing so just because a sitemap file has a bunch of URLs and it doesn’t mean that we will index all of them that’s still kind of something to keep in mind but obviously you can try to narrow things down a little bit and see where where you could kind of improve that situation.” John Mueller, 2017

The URL is naturally important in an XML sitemap. The only other XML sitemap attribute you should really be concerned about is the DATE LAST MODIFIED (lastmod). You can ignore the CHANGE FREQUENCY attribute:

QUOTE: “we don’t use that at all… no we only use the date in the [XML] sitemap file” John Mueller, Google 2017

How many times a week is the index status data in search console updated?

It is updated 2-3 times a week.

Should you use sitemaps with last modified for expired content?

Expired pages can be picked up quickly if you use a last modified date

Why Doesn’t Google Crawl and Index My Website XML Sitemap Fully?

QUOTE: “So we don’t guarantee indexing. So just because something is in a sitemap file isn’t a guarantee that we will actually index it. It might be completely normal that we don’t actually index all of those pages… that even if you do everything technically correct there’s no guarantee that we will actually index everything.” John Mueller, 2018

I have looked at a lot of sites with such indexation problems. In my experience the most common reasons for poor indexation levels of a sitemap on a site with thousands or millions of pages are:

  1. doorway pages
  2. thin pages

Pages that are almost guaranteed to get into Google’s index have one common feature: They have unique content on them.

In short, if you are building doorway type pages without unique content on them, Google won’t index them all properly. If you are sloppy, and also produce thin pages on the site, Google won’t exactly reward that behaviour either.

QUOTE: “with regards to product pages not being indexed in Google again that’s something where maybe that’s essentially just working as intended where we just don’t index everything from them from any website. I think for most websites if you go into the sitemap section or the indexing section you’ll see that we index just a fraction of all of the content on the website. I think for any non-trivial sized website indexing all of the content would be a very big exception and I would be very surprised to see that happen.” John Mueller, Google 2018

Google rewards a smaller site with fat, in-depth pages a lot more than a larger site with millions of thinner pages.

Perhaps Google works out how much unique text a particular site has and weighs that against the number of pages the site produces. Who knows.

The important takeaway is: ‘Why would Google let millions of your auto-generated pages rank, anyway?’

QUOTE: “really create something useful for users in individual locations maybe you do have some something unique that you can add there that makes it more than just a doorway page“. John Mueller, Google 2017

Google Not Indexing URLs In Your Sitemap? Creating New Sitemaps Won’t Help

It is unlikely that modifying your XML sitemaps alone will result in more pages on your site being indexed if the reason the URLs are not indexed in the first place is quality-related:

QUESTION: “I’ve 100 URLs in a xml sitemap. 20 indexed and 80 non-indexed. Then I uploaded another xml sitemap having non-indexed 80 URLs. So same URL’s in multiple sitemap. . . Is it a good practice? Can it be harmful or useful for my site?”

and from Google:

QUOTE: “That wouldn’t change anything. If we’re not indexing 80 URLs from a normal 100 URL site, that sounds like you need to work on the site instead of on sitemap submissions. Make it awesome! Make every page important!” John Mueller, 2018

Most URLs in your XML Sitemap should be canonical URLs, not redirects

Google wants final destination URLs and not links that redirect to some other location.

QUOTE: “in general especially, for landing pages…. we do recommend to use the final destination URL in the sitemap file a part of the reason for that is so that we can report explicitly on those URLs in search console …. and you can look at the indexing information just for that sitemap file and that’s based on the exact URLs that you have there. The other reason we recommend doing that is that we use a sitemaps URLs as a part of trying to understand which URL should be the canonical for a piece of content so that is the URL that we should show in the search results and if the sitemap file says one URL and it redirects to a different URL then you’re giving us kind of conflicting information.” John Mueller, Google, 2018

and

QUOTE: “actually the date the last modification date of some of these URLs because with that date we can figure out do we need to recall these URLS to figure out what is new or what is different on these URLs or are these old URLs that basically we might already know about we decided not to index so what I would recommend doing there is creating an XML sitemap file with the dates with the last modification dates just to make sure that Google has all of the information that it can get.” John Mueller, Google, 2018

Read my article on managing redirects properly on a site.
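If you want to spot-check this yourself, here is a rough sketch (assuming a local copy of your sitemap file and the Python requests library) that fetches each listed URL without following redirects and flags anything that is not a plain 200, so the sitemap can be updated to list final destination URLs only:

```python
# Sketch: flag sitemap URLs that redirect instead of resolving directly.
import xml.etree.ElementTree as ET
import requests

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

tree = ET.parse("sitemap.xml")  # hypothetical local copy of your sitemap
for loc in tree.getroot().iter(f"{NS}loc"):
    url = loc.text.strip()
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if 300 <= resp.status_code < 400:
        print(f"REDIRECT {resp.status_code}: {url} -> {resp.headers.get('Location')}")
    elif resp.status_code != 200:
        print(f"CHECK {resp.status_code}: {url}")
```

Any URL flagged as a redirect should be swapped in the sitemap for the URL it ultimately resolves to.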

Sometimes non-canonical versions of your URLs are indexed instead

QUOTE: “I would recommend doing there is double-checking the URLs themselves and double-checking how they’re actually indexed in Google so it might be that we don’t actually index the URL as you listed in the sitemap file but rather a slightly different version that is perhaps linked in within your website so like I mentioned before the trailing slash is very common or ducked up the non WWW(version) – all of those are technically different URLs and we wouldn’t count that for the sitemap as being indexed if we index it with a slightly different URL.” John Mueller, Google 2018
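A quick way to catch the trailing-slash and www-versus-non-www problem John describes is to compare each sitemap URL against the rel="canonical" the page itself declares. The sketch below (again illustrative Python, with a regex that a proper HTML parser would improve on) does an exact string comparison deliberately, so slash, www and case differences are all surfaced:

```python
# Sketch: compare sitemap URLs against each page's declared canonical URL.
import re
import xml.etree.ElementTree as ET
import requests

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def declared_canonical(html):
    """Return the href of the first <link rel="canonical"> found, if any."""
    for tag in re.finditer(r"<link\b[^>]*>", html, re.IGNORECASE):
        attrs = tag.group(0)
        if re.search(r'rel=["\']canonical["\']', attrs, re.IGNORECASE):
            href = re.search(r'href=["\']([^"\']+)["\']', attrs, re.IGNORECASE)
            return href.group(1) if href else None
    return None

tree = ET.parse("sitemap.xml")  # hypothetical local copy of your sitemap
for loc in tree.getroot().iter(f"{NS}loc"):
    sitemap_url = loc.text.strip()
    html = requests.get(sitemap_url, timeout=10).text
    canonical = declared_canonical(html)
    # Exact comparison so trailing-slash, www and case variants are all flagged.
    if canonical and canonical != sitemap_url:
        print(f"MISMATCH: sitemap lists {sitemap_url} but the page declares {canonical}")
```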

It is ‘normal’ for Google NOT to index all the pages on your site.

What Is the maximum size limit of an XML Sitemap?

QUOTE: “We support 50 megabytes for a sitemap file, but not everyone else supports 50 megabytes. Therefore, we currently just recommend sticking to the 10 megabyte limit,“ John Mueller, Google 2014
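The sitemap protocol also caps each file at 50,000 URLs. A simple sanity check like the sketch below (assuming a local copy of the file) will tell you whether a sitemap is approaching the protocol limits or the more conservative 10 megabyte figure in the quote above:

```python
# Sketch: check a sitemap file against size and URL-count limits.
import os
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
path = "sitemap.xml"  # hypothetical local copy of your sitemap

size_mb = os.path.getsize(path) / (1024 * 1024)
url_count = sum(1 for _ in ET.parse(path).getroot().iter(f"{NS}url"))

print(f"{path}: {url_count} URLs, {size_mb:.1f} MB")
if url_count > 50000 or size_mb > 50:
    print("Over the sitemap protocol limits - split this file up.")
elif size_mb > 10:
    print("Over the conservative 10 MB figure - consider splitting.")
```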

Google wants to know when primary page content is updated, not when supplementary page content is modified. If the primary content significantly changes, that is relevant; if the primary content doesn’t change, then I wouldn’t update the last modified date.

Why Is The Number Of Indexed URLs in Search Console Dropping?

Google has probably worked out you are creating doorway-type pages with no added value.

QUOTE: “The Panda algorithm may continue to show such a site for more specific and highly-relevant queries, but its visibility will be reduced for queries where the site owner’s benefit is disproportionate to the user’s benefit.” Google

Page Quality & Site Quality

Google measures quality on a per-page basis and also looks at the site overall (with site-wide quality being affected by the quality of individual pages).

Do noindexed pages have an impact on site quality?

No – only indexable pages have an impact on site quality. You can use a noindex directive on low-quality pages if page quality cannot be improved.

QUOTE: “If you if you have a website and you realize you have low-quality content on this website somewhere then primarily of course we’d recommend increasing the quality of the content if you really can’t do that if there’s just so much content there that you can’t really adjust yourself if it’s user-generated content all of these things then there there might be reasons where you’d say okay I’ll use a no index for the moment to make sure that this doesn’t affect the bigger picture of my website.” John Mueller, Google 2017

You should only be applying noindex to pages as a temporary measure at best.

Google wants you to improve the content that is indexed to improve your quality scores.

Identifying Which Pages On Your Own Site Hurt Or Help Your Rankings


Separating the wheat from the chaff.

Being ‘indexed’ is important. If a page isn’t indexed, the page can’t be returned by Google in Search Engine Results Pages.

While getting as many pages indexed in Google was historically a priority for an SEO, Google is now rating the quality of pages on your site and the type of pages it is indexing.

So bulk indexation is no guarantee of success – in fact, it’s a risk to index all URLs on your site, especially if you have a large, sprawling site.

If you have a lot of low-quality pages (URLs) indexed on your site compared to high-quality pages (URLs)… Google has told us it is marking certain sites down for that.

Some URLs are just not welcome to be indexed as part of your website content anymore.

Do I need to know which pages are indexed?

No. Knowing is useful, of course, but largely unnecessary. Indexation is never a guarantee of traffic.

Some SEOs tend to scrape Google to get indexation data on a website. I’ve never bothered with that. Most sites I work with have XML sitemap files, so an obvious place to start looking at such issues is Google Search Console.

Google will tell you how many pages you have submitted in a sitemap, and how many pages are indexed. It will not tell you which pages are indexed, but if there is a LARGE discrepancy between SUBMITTED and INDEXED, it’s very much worth digging deeper.
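One way to start digging is to compare the URLs you submitted against a Search Console Performance export for the same property. Impressions are not the same thing as indexation, but a submitted URL that has shown zero impressions over a long window is a good candidate for a closer look with the URL Inspection tool. The file name and the “page” column below are assumptions – adjust them to match your own export:

```python
# Sketch: compare submitted sitemap URLs with a Search Console "Pages" export.
import xml.etree.ElementTree as ET
import pandas as pd

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

submitted = {loc.text.strip()
             for loc in ET.parse("sitemap.xml").getroot().iter(f"{NS}loc")}
performing = set(pd.read_csv("gsc-pages-export.csv")["page"].str.strip())

no_impressions = submitted - performing
print(f"{len(no_impressions)} of {len(submitted)} submitted URLs show no "
      f"impressions in the export window - start your quality review there.")
```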


If Google is de-indexing large swaths of your content that you have actually submitted as part of an XML sitemap, then a problem is often afoot.

Unfortunately, with this method, you don’t get to see the pages produced by the CMS outwith the XML sitemap – so this is not a full picture of the ‘health’ of your website.

QUOTE: “Make sure Google can crawl your website, index and rank all your primary pages by only serving Googlebot high-quality, user friendly and fast loading pages to index.” Shaun Anderson, Hobo 2020

Identifying Dead Pages

I usually start with a performance analysis that involves merging data from a physical crawl of a website with analytics data and Google Search Console data. A content type analysis will identify the type of pages the CMS generates. A content performance analysis will gauge how well each section of the site performs.

If you have 100,000 pages on a site, and only 1,000 pages get organic traffic from Google over a 3-6 month period – you can make the argument 99% of the site is rated as ‘crap’ (at least as far as Google rates pages these days).

I group pages like these together as ‘dead pages‘ for further analysis. Deadweight, ‘dead’ for short.

The thinking is if the pages were high-quality, they would be getting some kind of organic traffic.

Identifying which pages receive no organic visitors over a sensible timeframe is a quick, if noisy, way to separate pages that obviously WORK from pages that DON’T – and will help you clean up a large portion of redundant URLs on the site.

It helps to see page performance in the context of longer timeframes as some types of content can be seasonal, for instance, and produce false positives over a shorter timescale. It is important to trim content pages carefully – and there are nuances.
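Here is a minimal sketch of that merge, assuming two CSV exports you would produce yourself: a full crawl of the site (for example from a desktop crawler) and a landing-page report from analytics covering six months of organic sessions. The file names and column names are hypothetical – rename them to match your own exports:

```python
# Sketch: merge a crawl export with organic landing-page data to find "dead" pages.
import pandas as pd

crawl = pd.read_csv("crawl-export.csv")             # one row per crawled URL
organic = pd.read_csv("organic-landing-pages.csv")  # url, sessions over 6 months

merged = crawl.merge(organic, on="url", how="left")
merged["sessions"] = merged["sessions"].fillna(0)   # no analytics row = no traffic

dead = merged[merged["sessions"] == 0]
print(f"{len(dead)} of {len(merged)} crawled URLs had zero organic sessions "
      f"in the period - review these as candidates for improvement, "
      f"consolidation or removal.")
dead[["url"]].to_csv("dead-pages.csv", index=False)
```

The left merge is the important detail: analytics alone will never show you the pages it sent no traffic to, so the crawl has to be the master list.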

False Positives

Experience can educate you when a page is high-quality and yet receives no traffic. If the page is thin, but is not manipulative, is indeed ‘unique’ and delivers on a purpose with little obvious detectable reason to mark it down, then you can say it is a high-quality page – just with very little search demand for it. Ignored content is not the same as ‘toxic’ content.

False positives aside, once you identify the pages receiving no traffic, you very largely isolate the type of pages on your site that Google doesn’t rate – for whatever reason. A strategy for these pages can then be developed.

Identifying Content That Can Potentially Hurt Your Rankings

As you review the pages, you’re probably going to find pages that include:

  • out of date, overlapping or irrelevant content
  • collections of pages not paginated properly
  • indexable pages that shouldn’t be indexed
  • stub pages
  • indexed search pages
  • pages with malformed HTML and broken images
  • auto-generated pages with little value

You will probably find ‘dead’ pages you didn’t even know your CMS produced (hence why an actual crawl of your site is required, rather than just working from a list of URLs from an XML sitemap, for instance).

Those pages need to be cleaned up, Google has said. And remaining pages should:

QUOTE: “stand on their own” John Mueller, Google

Google doesn’t approve of most types of auto-generated pages so you don’t want Google indexing these pages in a normal fashion.

Judicious use of the ‘noindex,follow’ directive in robots meta tags, and sensible use of the canonical link element, are required on most sites I see these days.
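One of several ways to apply ‘noindex, follow’ as a temporary measure is the X-Robots-Tag HTTP header, which Google treats like a robots meta tag. The sketch below uses Flask purely for illustration – a meta robots tag in your page template, or a rule in your web server configuration, achieves the same thing. The list of low-value paths is hypothetical and would come from your own content audit:

```python
# Sketch: serve an X-Robots-Tag header for known low-value paths.
from flask import Flask, request

app = Flask(__name__)

# Hypothetical paths identified as low quality in a content audit.
LOW_VALUE_PATHS = {"/tag/widgets", "/internal-search", "/archive/2009"}

@app.after_request
def add_robots_header(response):
    # Google honours this header in the same way as a robots meta tag.
    if request.path in LOW_VALUE_PATHS:
        response.headers["X-Robots-Tag"] = "noindex, follow"
    return response

if __name__ == "__main__":
    app.run()
```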

The pages that remain after a URL clear-out can be reworked and improved.

In fact – they MUST BE improved if you are to win more rankings and get more Google organic traffic in future.

This is time-consuming – just like Google wants it to be. You need to review DEAD pages with a forensic eye and ask:

  • Are these pages high-quality and very relevant to a search term?
  • Do these pages duplicate content found on other pages of the site?
  • Are these pages automatically generated, with little or no unique text content on them?
  • Is the purpose of this page met WITHOUT sending visitors to another page e.g. doorway pages?
  • Will these pages ever pick up natural links?
  • Is the intent of these pages to inform first? ( or profit from organic traffic through advertising?)
  • Are these pages FAR superior to the competition in Google presently for the search term you want to rank? This is actually very important.

If the answer to any of the above is NO – then it is imperative you take action to minimise the amount of these types of pages on your site.

What about DEAD pages with incoming backlinks or a lot of text content?

Bingo! Use 301 redirects (or use canonical link elements) to redirect any asset you have with some value to Googlebot to equivalent, up-to-date sections on your site. Do NOT just redirect these pages to your homepage.
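One practical way to manage this at scale is a hand-curated mapping of old URL to best equivalent URL, turned into redirect rules. The sketch below emits Apache-style Redirect 301 lines and deliberately refuses to map anything to the homepage; the file names and two-column CSV format are assumptions for illustration (an nginx map or your CMS’s redirect manager would do the same job):

```python
# Sketch: turn a curated old->new URL mapping into Apache Redirect 301 rules.
import csv
from urllib.parse import urlparse

with open("redirect-map.csv", newline="", encoding="utf-8") as src, \
     open("redirects.conf", "w", encoding="utf-8") as out:
    # redirect-map.csv: two columns per row, old URL then new URL, no header.
    for old_url, new_url in csv.reader(src):
        old_path = urlparse(old_url).path
        new_path = urlparse(new_url).path
        if new_path in ("", "/"):
            print(f"SKIPPED {old_url}: do not dump old pages on the homepage")
            continue
        out.write(f"Redirect 301 {old_path} {new_url}\n")
```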

Rework available content before you bin it


High-quality content is expensive – so rework content when it is available. Medium quality content can always be made higher quality – in fact – a page is hardly ever finished. EXPECT to come back to your articles every six months to improve them to keep them moving in the right direction.

Sensible grouping of content types across the site can often leave you with substantial text content that can be reused and repackaged. The same content, originally spread over multiple pages but now consolidated into one page reworked and shaped around a topic, has a considerably more successful time of it in Google SERPs.

Well, it does if the page you make is useful and has a purpose other than just to make money.

REMEMBER – DEAD PAGES are only one aspect of a site review. There’s going to be a large percentage of any site that gets a little organic traffic but still severely underperforms, too – tomorrow’s DEAD pages. I call these POOR pages in my reviews.

Minimise Low-Quality Content & Overlapping Text Content

Specific Advice From Google on Pruning Content From Your Site

If you have a very low-quality site from a content point of view, just deleting the content (or noindexing it) is probably not going to have a massive positive impact on your rankings.

Ultimately the recommendation is to focus on “improving content” as “you have the potential to go further down if you remove that content“.

QUOTE: “Ultimately, you just want to have a really great site people love. I know it sounds like a cliché, but almost [all of] what we are looking for is surely what users are looking for. A site with content that users love – let’s say they interact with content in some way – that will help you in ranking in general, not with Panda. Pruning is not a good idea because with Panda, I don’t think it will ever help mainly because you are very likely to get Panda penalized – Pandalized – because of low-quality content…content that’s actually ranking shouldn’t perhaps rank that well. Let’s say you figure out if you put 10,000 times the word “pony” on your page, you rank better for all queries. What Panda does is disregard the advantage you figure out, so you fall back where you started. I don’t think you are removing content from the site with potential to rank – you have the potential to go further down if you remove that content. I would spend resources on improving content, or, if you don’t have the means to save that content, just leave it there. Ultimately people want good sites. They don’t want empty pages and crappy content. Ultimately that’s your goal – it’s created for your users.” Gary Illyes, Google 2017

Specific Advice From Google On Low-Quality Content On Your Site

And remember the following, specific advice from Google on removing low-quality content from a domain:

QUOTE: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low-quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low-quality pages to a different domain could eventually help the rankings of your higher-quality content.” Google

Google may well be able to recognise ‘low-quality’ a lot better than it does ‘high-quality’ – so having a lot of ‘low-quality’ pages on your site is potentially what you are actually going to be rated on (if it makes up most of your site) – now, or in the future. NOT your high-quality content.

This is more or less explained by Google spokespeople like John Mueller. He is constantly on about ‘folding’ thin pages together, these days (and I can say that certainly has a positive impact on many sites).

While his advice in this instance might be specifically about UGC (user-generated content like forums) – I am more interested in what he has to say when he talks about the algorithm looking at the site “overall” and how it ‘thinks’ when it finds a mixture of high-quality pages and low-quality pages:

QUOTE: “For instance, we would see a lot of low-quality posts in a forum. We would index those low-quality pages. And we’d also see a lot of really high-quality posts, with good discussions, good information on those pages. And our algorithms would be kind of stuck in a situation with, well, there’s a lot of low-quality content here, but there’s also a lot of high-quality content here. So how should we evaluate the site overall? And usually, what happens is, our algorithms kind of find some middle ground……. what you’d need to do to, kind of, move a step forward, is really try to find a way to analyze the quality of your content, and to make sure that the high-quality content is indexed and that the lower-quality content doesn’t get indexed by default.” John Mueller, Google 2014

And Google has clearly said in print:

QUOTE: “low-quality content on part of a site can impact a site’s ranking as a whole.” Amit Singhal, Google, 2011

Avoid Google’s punitive algorithms

Fortunately, we don’t actually need to know and fully understand the ins-and-outs of Google’s algorithms to know what the best course of action is.

The sensible thing in light of Google’s punitive algorithms is just to not let Google index (or more accurately, rate) low-quality pages on your site. And certainly – stop publishing new ‘thin’ pages. Don’t put your site at risk.

If pages get no organic traffic anyway, are out-of-date, and improving them would take a lot of effort and expense, why let Google index them normally if, by rating them, it impacts your overall score? Clearing away the low-quality stuff lets you focus on building better pages that Google will rank.

Ideally, you would have a giant site and every page would be high-quality – but that’s not practical.

A myth is that pages need a lot of text to rank. They don’t, but a lot of people still try to make text bulkier and unique page-to-page.

While that theory is sound when focused on a single page (where the intent is to deliver useful content to a Google user), using old-school SEO techniques across many pages of a large site seems to amplify site quality problems after recent algorithm changes. This type of optimisation, without keeping an eye on overall site quality, is self-defeating in the long run.

Every site is impacted by how highly Google rates it.

There are many reasons a website loses traffic from Google. Server changes, website problems, content changes, downtimes, redesigns, migrations… the list is extensive.

Sometimes, Google turns up the dial on demands on ‘quality’, and if your site falls short, a website traffic crunch is assured. Some sites invite problems ignoring Google’s ‘rules’ and some sites inadvertently introduce technical problems to their site after the date of a major algorithm update and are then impacted negatively by later refreshes of the algorithm.

Comparing your Google Analytics data side by side with the dates of official algorithm updates is useful in diagnosing a site health issue or traffic drop. In one real-world example, a new client thought it was a switch to HTTPS and server downtime that caused a drop, when it was actually the May 6, 2015, Google Quality Algorithm (originally called Phantom 2 in some circles) that caused the sudden drop in organic traffic – and the problem was probably compounded by unnatural linking practices. (This client did eventually receive a penalty for unnatural links when they ignored our advice to clean up.)
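A simple sketch of that side-by-side comparison: given a daily organic-sessions export and a list of known update dates, compare average traffic in the 28 days before and after each date. The file name, column names and the update list are illustrative only – the May 2015 date is the one discussed above, and you would add the others relevant to your own timeline:

```python
# Sketch: compare organic traffic before and after known algorithm update dates.
import pandas as pd

updates = {
    "Google Quality update (Phantom 2)": "2015-05-06",
    # Add further update dates relevant to your own traffic history here.
}

daily = pd.read_csv("organic-sessions-daily.csv", parse_dates=["date"])
daily = daily.set_index("date").sort_index()

for name, day in updates.items():
    day = pd.Timestamp(day)
    before = daily.loc[day - pd.Timedelta(days=28): day - pd.Timedelta(days=1), "sessions"].mean()
    after = daily.loc[day + pd.Timedelta(days=1): day + pd.Timedelta(days=28), "sessions"].mean()
    print(f"{name}: {before:.0f} -> {after:.0f} average daily organic sessions")
```

A clear step change around an update date is not proof on its own, but combined with a content and link review it usually points you at the right culprit.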

Thin Content

A quick check of how the site was laid out soon uncovered a lot of unnecessary pages, or what Google calls thin, overlapping content. This observation would go a long way to confirming that the traffic drop was indeed caused by the May algorithm change.

Another obvious way to gauge the health of a site is to see which pages on the site get zero traffic from Google over a certain period of time. I do this by merging analytics data with crawl data – as analytics doesn’t give you data on pages it sends no traffic to.

Often, this process can highlight low-quality pages on a site.

Google calls a lot of pages ‘thin’ or ‘overlapping’ content these days.

Algorithm Changes

Algorithm changes seem to center on reducing the effectiveness of old-school SEO techniques, with the May 2015 Google ‘Quality’ algorithm update a bruisingly familiar example. An algorithm change is usually akin to ‘community service’ for the business impacted negatively.

If your pages were designed to get the most out of Google with commonly known and now outdated SEO techniques, chances are Google has identified this and is throttling your rankings in some way. Google will continue to throttle rankings until you clean your pages up.

If Google thinks your links are manipulative, they want them cleaned up, too.

Actually – looking at the backlink profile of this customer, they are going to need a disavow file prepared too.

That is unsurprising in today’s SEO climate.

What could be argued was ‘highly relevant’ or ‘optimised’ on-site SEO for Google just a few years ago is now being treated more like ‘web spam’ by punitive algorithms, rather than just ‘over-optimisation’.

Google went through the SEO playbook, identified old techniques and uses them against you today – meaning every SEO job you take on always has a clean-up aspect now.

Google has left a very narrow band of opportunity when it comes to SEO – and punishments are designed to take you out of the game for some time while you clean up the infractions.

Technical Issues

Google has a LONG list of technical requirements it advises you meet, on top of all the things it tells you NOT to do to optimise your website. Meeting Google’s technical guidelines is no magic bullet to success – but failing to meet them can impact your rankings in the long run – and the odd technical issue can actually severely impact your entire site if rolled out across multiple pages.

The benefit of adhering to technical guidelines is often a second-order benefit.

You won’t get penalised or filtered when others do. When others fall, you will rise.

Mostly – individual technical issues will not be the reason you have ranking problems, but they still need to be addressed for any second-order benefit they provide.

Google spokespeople say ‘user-experience’ is NOT A RANKING FACTOR, but this might be splitting hairs, as lots of the rules are designed to guarantee as good a ‘user experience’ as possible for Google’s users.

Most of Google’s technical guidelines can be interpreted in this way. And most need to be followed, whether addressing these issues has an immediate positive impact on the site or not.

Whether or not your site has been impacted in a noticeable way by these algorithms, every SEO project must start with a historical analysis of site performance. Every site has things to clean up and to optimise in a modern way.

The sooner you understand why Google is sending you less traffic than it did last year, the sooner you can clean it up and focus on proactive SEO that starts to impact your rankings in a positive way.

 

What To Avoid

Google has a VERY basic organic search engine optimisation starter guide pdf for webmasters, which they use internally:

QUOTE: “Although this guide won’t tell you any secrets that’ll automatically rank your site first for queries in Google (sorry!), following the best practices outlined below will make it easier for search engines to both crawl and index your content.” Google, 2008

It is still worth a read, even if it covers only VERY basic best practices for your site.

No search engine will EVER tell you what actual keywords to put on your site to improve individual rankings or get more converting organic traffic – and in Google – that’s the SINGLE MOST IMPORTANT thing you want to know!

Google specifically mentions something very important in the guide:

QUOTE: “Creating compelling and useful content will likely influence your website more than any of the other factors discussed here.” Google SEO Starter Guide, 2017

This starter guide is still very useful for beginners.

I do not think there is anything in the 2017 guide that is really useful if you have been working online since the last starter guide was first published (2008) and its first update was announced (2010). It still leaves out some of the more complicated technical recommendations for larger sites.

I usually find it useful to keep an eye on what Google tells you to avoid in such documents, which are:

  • AVOID: “Don’t let your internal search result pages be crawled by Google. Users dislike clicking a search engine result only to land on another search result page on your site.”

  • AVOID: “Allowing URLs created as a result of proxy…”

  • AVOID: “Choosing a title that has no relation to the content on the page.”

  • AVOID: “Using default or vague titles like “Untitled” or “New Page 1″.”

  • AVOID: “Using a single title across all of your site’s pages or a large group of pages.”

  • AVOID: “Using extremely lengthy titles that are unhelpful to users.”

  • AVOID: “Stuffing unneeded keywords in your title tags.”

  • AVOID: “Writing a description meta tag that has no relation to the content on the page.”

  • AVOID: “Using generic descriptions like “This is a web page” or “Page about baseball cards”.”

  • AVOID: “Filling the description with only keywords.”

  • AVOID: “Copying and pasting the entire content of the document into the description meta tag.”

  • AVOID: “Using a single description meta tag across all of your site’s pages or a large group of pages.”

  • AVOID: “Placing text in heading tags that wouldn’t be helpful in defining the structure of the page.”

  • AVOID: “Using heading tags where other tags like <em> and <strong> may be more appropriate.”

  • AVOID: “Erratically moving from one heading tag size to another.”

  • AVOID: “Excessive use of heading tags on a page.”

  • AVOID: “Very long headings.”

  • AVOID: “Using heading tags only for styling text and not presenting structure.”

  • AVOID: “Using invalid ‘Structured Data’ markup.”

  • AVOID: “Changing the source code of your site when you are unsure about implementing markup.”

  • AVOID: “Adding markup data which is not visible to users.”

  • AVOID: “Creating fake reviews or adding irrelevant markups.”

  • AVOID: “Creating complex webs of navigation links, for example, linking every page on your site to every other page.”

  • AVOID: “Going overboard with slicing and dicing your content (so that it takes twenty clicks to reach from the homepage).”

  • AVOID: “Having a navigation based entirely on images, or animations.”

  • AVOID: “Requiring script or plugin-based event-handling for navigation”

  • AVOID: “Letting your navigational page become out of date with broken links.”

  • AVOID: “Creating a navigational page that simply lists pages without organizing them, for example by subject.”

  • AVOID: “Allowing your 404 pages to be indexed in search engines (make sure that your web server is configured to give a 404 HTTP status code or – in the case of JavaScript-based sites – include a noindex robots meta-tag when non-existent pages are requested).”

  • AVOID: “Blocking 404 pages from being crawled through the robots.txt file.”

  • AVOID: “Providing only a vague message like “Not found”, “404”, or no 404 page at all.”

  • AVOID: “Using a design for your 404 pages that isn’t consistent with the rest of your site.”

  • AVOID: “Using lengthy URLs with unnecessary parameters and session IDs.”

  • AVOID: “Choosing generic page names like “page1.html”.”

  • AVOID: “Using excessive keywords like “baseball-cards-baseball-cards-baseballcards.htm”.”

  • AVOID: “Having deep nesting of subdirectories like “…/dir1/dir2/dir3/dir4/dir5/dir6/page.html”.”

  • AVOID: “Using directory names that have no relation to the content in them.”

  • AVOID: “Having pages from subdomains and the root directory access the same content, for example, “domain.com/page.html” and “sub.domain.com/page.html”.”

  • AVOID: “Writing sloppy text with many spelling and grammatical mistakes.”

  • AVOID: “Awkward or poorly written content.”

  • AVOID: “Embedding text in images and videos for textual content: users may want to copy and paste the text and search engines can’t read it.”

  • AVOID: “Dumping large amounts of text on varying topics onto a page without paragraph, subheading, or layout separation.”

  • AVOID: “Rehashing (or even copying) existing content that will bring little extra value to users.”

  • AVOID: “Having duplicate or near-duplicate versions of your content across your site.”

  • AVOID: “Inserting numerous unnecessary keywords aimed at search engines but are annoying or nonsensical to users.”

  • AVOID: “Having blocks of text like “frequent misspellings used to reach this page” that add little value for users.”

  • AVOID: “Deceptively hiding text from users, but displaying it to search engines.”

  • AVOID: “Writing generic anchor text like “page”, “article”, or “click here”.”

  • AVOID: “Using text that is off-topic or has no relation to the content of the page linked to.”

  • AVOID: “Using the page’s URL as the anchor text in most cases, although there are certainly legitimate uses of this, such as promoting or referencing a new website’s address.”

  • AVOID: “Writing long anchor text, such as a lengthy sentence or short paragraph of text.”

  • AVOID: “Using CSS or text styling that make links look just like regular text.”

  • AVOID: “Using excessively keyword-filled or lengthy anchor text just for search engines.”

  • AVOID: “Creating unnecessary links that don’t help with the user’s navigation of the site.”

  • AVOID: “Using generic filenames like “image1.jpg”, “pic.gif”, “1.jpg” when possible—if your site has thousands of images you might want to consider automating the naming of the images.”

  • AVOID: “Writing extremely lengthy filenames.”

  • AVOID: “Stuffing keywords into alt text or copying and pasting entire sentences.”

  • AVOID: “Writing excessively long alt text that would be considered spammy.”

  • AVOID: “Using only image links for your site’s navigation”.

  • AVOID: “Attempting to promote each new, small piece of content you create; go for big, interesting items.”

  • AVOID: “Involving your site in schemes where your content is artificially promoted to the top….”

  • AVOID: “Spamming link requests out to all sites related to your topic area.”

  • AVOID: “Purchasing links from another site with the aim of getting PageRank”

This is straightforward stuff, but it’s the simple stuff that often gets overlooked. Of course, you combine the above with the technical recommendations in Google’s guidelines for webmasters.
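Several of the 404-related items in the list above are easy to verify programmatically. A quick sketch (the domain is a placeholder): request a URL that definitely should not exist and confirm the server answers with a real 404 or 410 status rather than a “soft 404” that returns 200.

```python
# Sketch: check that non-existent URLs return a real 404, not a soft 404.
import uuid
import requests

probe = f"https://www.example.com/{uuid.uuid4()}-does-not-exist"
resp = requests.get(probe, timeout=10)

if resp.status_code in (404, 410):
    print(f"OK: non-existent URLs return {resp.status_code}")
else:
    print(f"WARNING: got {resp.status_code} - Google may treat these pages as soft 404s")
```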

Don’t make these simple but dangerous mistakes…..

  • Avoid duplicating content on your site found on other sites. Yes, Google likes content, but it *usually* needs to be well linked to, unique and original to get you to the top!
  • Don’t hide text on your website. Google may eventually remove you from the SERPs.
  • Don’t buy 1000 links and think “that will get me to the top!”. Google likes natural link growth and often frowns on mass link buying.
  • Don’t get everybody to link to you using the same “anchor text” or link phrase. This could flag you as a ‘rank modifier’. You don’t want that.
  • Don’t chase Google PR by chasing 100s of links. Think quality of links…. not quantity.
  • Don’t buy many keyword-rich domains, fill them with similar content and link them to your site. This is lazy and dangerous and could see you ignored or, worse, banned from Google. It might have worked yesterday but it sure does not work today without some grief from Google.
  • Do not constantly change your site pages names or site navigation without remembering to employ redirects. This just screws you up in any search engine.
  • Do not build a site with a JavaScript navigation that Google, Yahoo and Bing cannot crawl.
  • Do not link to everybody who asks you for reciprocal links. Only link out to quality sites you feel can be trusted.
  • Do not spam visitors with ads and pop-ups.

Don’t Flag Your Site With Spam

A primary goal of any ‘rank modification’ is not to flag your site as ‘suspicious’ to Google’s algorithms or their webspam team.

I would recommend you forget about tricks like links in H1 tags etc. or linking to the same page 3 times with different anchor text on one page.

Forget about ‘which is best’ when considering things you shouldn’t be wasting your time with.

Every element on a page is a benefit to you until you spam it.

Put a keyword in every tag and you may flag your site as ‘trying too hard’ if you haven’t got the link trust to cut it – and Google’s algorithms will go to work.

Spamming Google is often counter-productive over the long term.

So:

  • Don’t spam your anchor text link titles with the same keyword.
  • Don’t spam your ALT Tags or any other tags either.
  • Add your keywords intelligently.
  • Try and make the site mostly for humans, not just search engines.

On-page optimisation is no longer as simple as a checklist of keyword here, keyword there. We are up against lots of smart folk at the Googleplex – and they purposely make this practice difficult.

For those who need a checklist, this is the sort of one that gets me results;

  • Do keyword research
  • Identify valuable searcher intent opportunities
  • Identify the audience & the reason for your page
  • Write utilitarian copy – be useful. Use related terms in your content. Use plurals and synonyms. Use words with searcher intent like “buy”, “compare”, “hire” etc. I prefer to get a keyword or related term in almost every paragraph.
  • Use emphasis sparingly to emphasise the important points in the page, whether they are your keywords or not
  • Pick an intelligent Page Title with your keyword in it
  • Write an intelligent meta description, repeating it on the page
  • Add an image with user-centric ALT attribute text
  • Link to related pages on your site within the text
  • Link to related pages on other sites
  • Your page should have a simple search engine friendly URL
  • Keep it simple
  • Don’t let ads get in the way
  • Share it and pimp it

You can forget about just about everything else.

How Google Treats Subdomains: “We… treat that more as a single website”

John Mueller was asked if it is ok to interlink sub-domains, and the answer is yes, usually, because Google will treat subdomains on a website (primarily I think we can presume) as “a single website“:

QUOTE: “So if you have different parts of your website and they’re on different subdomains that’s that’s perfectly fine that’s totally up to you and the way people link across these different subdomains is really up to you I guess one of the tricky aspects there is that we try to figure out what belongs to a website and to treat that more as a single website and sometimes things on separate subdomains are like a single website and sometimes they’re more like separate websites for example on on blogger all of the subdomains are essentially completely separate websites they’re not related to each other on the other hand other websites might have different subdomains and they just use them for different parts of the same thing so maybe for different country versions maybe for different language versions all of that is completely normal.” John Mueller 2017

That is important if Google has a website quality rating score (or multiple scores) for websites.

Personally I prefer subdirectories rather than subdomains if given the choice, unless it really makes sense to house particular content on a subdomain, rather than the main site (as in the examples John mentions).

Back in the day when Google Panda was released, many tried to run from Google’s content quality algorithms by moving ‘penalised’ pages and content to subdomains. I thought that was a temporary solution.

Over the long term, you can, I think, expect Google to treat subdomains on most common use websites as one entity – if it is – and be ranked accordingly in terms of content quality and user satisfaction.

Should I Choose a Subfolder or Subdomain?

If you have the choice, I would choose to house content on a subfolder on the main domain. Recent research would still indicate this is the best way to go:

QUOTE: “When you move from Subdomain to Subdirectory, you rank much better, and you get more organic traffic from Google.” Sistrix, 2018

Subfolders or subdomains for Google SEO? Observations indicate subfolders.

A Continual Evolution

QUOTE: “One way to think of how a core update operates is to imagine you made a list of the top 100 movies in 2015. A few years later in 2019, you refresh the list. It’s going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before. The list will change, and films previously higher on the list that move down aren’t bad. There are simply more deserving films that are coming before them.” Danny Sullivan, Google 2020

The ‘Keyword Not Provided‘ incident (2011), the Google Penguin Update (2012), and the Google Panda Update (2011) are just some examples of Google making ranking in organic listings HARDER – a change for ‘users’ that seems to have the most impact on ‘marketers’ outside of Google’s ecosystem.

A more recent example of ever-changing search would be the mobile-first index and the Core Web Vitals announcement:

QUOTE: “We will introduce a new signal that combines Core Web Vitals with our existing signals for page experience to provide a holistic picture of the quality of a user’s experience on a web page.” Sowmya Subramanian, Google 2020

Now, we need to be topic-focused (abstract, I know), instead of just keyword focused when optimising a web page for Google. There are now plenty of third-parties that help when researching keywords but most of us miss the kind of keyword intelligence we used to have access to.

The first step:

QUOTE: “is really to determine what it is you’re actually optimising for. This means identifying the terms people are searching for (also known as “keywords”) that you want your website to rank for in search engines like Google.” Wordstream, 2015

Proper keyword research is important because getting a site to the top of Google eventually comes down to your text content on a page and keywords in external & internal links. Altogether, Google uses these signals to determine where you rank if you rank at all.

There’s no magic bullet to this.

QUOTE: “I don’t think there’s one magic trick that that I can offer you that will make sure that your website stays relevant in the ever-changing world of the web so that’s something where you’ll kind of have to monitor that on your side and work out what makes sense for your site or your users or your business.” John Mueller, Google 2019

At any one time, your site is probably feeling the influence of some algorithmic filter (for example, Google Panda or Google Penguin) designed to keep spam sites under control and deliver relevant, high-quality results to human visitors.

QUOTE: “Here’s how it works: Google (or any search engine you’re using) has a crawler that goes out and gathers information about all the content they can find on the Internet. The crawlers bring all those 1s and 0s back to the search engine to build an index. That index is then fed through an algorithm that tries to match all that data with your query.” Moz, 2020

One filter may be kicking in keeping a page down in the SERPs while another filter is pushing another page up. You might have poor content but excellent incoming links, or vice versa. You might have very good content, but a very poor technical organisation of it. You might have too many ads in the wrong places.

You must identify the reasons Google doesn’t ‘rate’ a particular page higher than competing pages.

The answer is either on the page, on the site or in backlinks pointing to the page.

Ask yourself:

  • Do you have too few quality inbound links?
  • Do you have too many low-quality backlinks?
  • Does your page lack descriptive keyword rich text?
  • Are you keyword stuffing your text?
  • Do you link out to unrelated sites?
  • Do you have too many advertisements above the fold?
  • Do you have affiliate links on every page of your site and text found on other websites?
  • Do you have broken links and missing images on the page?
  • Does your page meet quality guidelines, legal and advertising standards?
  • Do ads interrupt the enjoyment of your main content?

Whatever they are, identify issues and fix them.

Get on the wrong side of Google and your site might well be selected for MANUAL review – so optimise your site as if, one day, you will get that website review from a Google Web Spam reviewer.

QUOTE: “Put useful content on your page and keep it up to date.” Google Webmaster Guidelines, 2020

The key to a successful campaign, I think, is persuading Google that your page is most relevant to any given search query. You do this by good unique keyword rich text content and getting “quality” links to that page.

The latter is far easier to say these days than actually do!

Next time you are developing a page, consider what looks spammy to you is probably spammy to Google. Ask yourself which pages on your site are really necessary. Which links are necessary? Which pages on the site are emphasised in the site architecture? Which pages would you ignore?

You can help a site along in any number of ways (including making sure your page titles and meta tags are unique) but be careful. Obvious evidence of ‘rank modifying’ is dangerous.

I prefer simple techniques and ones that can be measured in some way. I have never just wanted to rank for competitive terms; I have always wanted to understand at least some of the reasons why a page ranked for these key phrases. I try to create a good user experience for humans AND search engines. If you make high-quality text content relevant and suitable for both these audiences, you’ll more than likely find success in organic listings and you might not ever need to get into the technical side of things, like redirects and search engine friendly URLs.

To beat the competition in an industry where it’s difficult to attract quality links, you have to get more “technical” sometimes – and in some industries – you’ve traditionally needed to be 100% black hat to even get in the top 100 results of competitive, transactional searches.

There are no hard and fast rules to long-term ranking success, other than developing quality websites with quality content and quality links pointing to it. The less domain authority you have, the more text you’re going to need. The aim is to build a satisfying website and build real authority!

You need to mix it up and learn from experience. Make mistakes and learn from them by observation. I’ve found getting penalised is a very good way to learn what not to do.

Remember there are exceptions to nearly every rule, and in an ever-fluctuating landscape you probably have little chance of determining exactly why you rank in search engines these days. I’ve been doing it for over 15 years and every day I’m trying to better understand Google, to learn more and learn from others’ experiences.

It’s important not to obsess about granular ranking specifics that have little return on your investment unless you really have the time to do so! THERE IS USUALLY SOMETHING MORE VALUABLE TO SPEND THAT TIME ON.

That’s usually either good backlinks or great content.

QUOTE: “So there’s three things that you really want to do well if you want to be the world’s best search engine you want to crawl the web comprehensively and deeply you want to index those pages and then you want to rank or serve those pages and return the most relevant ones first….. we basically take PageRank as the primary determinant and the more PageRank you have that is the more people who link to you and the more reputable those people are the more likely it is we’re going to discover your page…. we use page rank as well as over 200 other factors in our rankings to try to say okay maybe this document is really authoritative it has a lot of reputation because it has a lot of PageRank … and that’s kind of the secret sauce trying to figure out a way to combine those 200 different ranking signals in order to find the most relevant document.” Matt Cutts, Google 2012

The fundamentals have not changed much over the years.

Google isn’t lying about rewarding legitimate effort – despite what some claim. If they were, I would be a black hat full time. So would everybody else trying to rank in Google.

It is much more complicated in some niches today than it was ten years ago.

The majority of small to medium businesses do not need advanced strategies because their direct competition has not employed these tactics either.

You are better off doing simple stuff better and faster than worrying about some of the more ‘advanced’ techniques you read on some blogs I think – it’s more productive, cost-effective for businesses and safer, for most.

Beware Pseudoscience In The Industry 

QUOTE: “Pseudoscience consists of statements, beliefs, or practices that are claimed to be both scientific and factual but are incompatible with the scientific method….” Wikipedia, 2020

Beware folk trying to bamboozle you with science.

This isn’t a science when Google controls the ‘laws’ and changes them at will.

QUOTE: “They follow the forms you gather data you do so and so and so forth but they don’t get any laws they don’t haven’t found out anything they haven’t got anywhere yet maybe someday they will but it’s not very well developed but what happens is an even more mundane level we get experts on everything that sound like this sort of scientific expert they they’re not scientist is a typewriter and they make up something.”  Richard Feynman, Physicist 1981

I get results by:

  • improving page experience signals
  • analysing Google rankings
  • performing Keyword research
  • making observations about ranking performance of your pages and that of others (though not in a controlled environment)
  • placing relevant, co-occurring words you want to rank for on pages
  • using synonyms
  • using words in anchor text in links on relevant pages and pointing them at relevant pages you want to rank high for the keyword in the anchor text
  • understanding that what you feature in your title tag is what that page is going to rank best for
  • getting high-quality links from other trustworthy websites
  • publishing lots and lots of new, higher-quality content
  • focusing on the long tail of keyword searches
  • understanding it will take time to beat all this competition

I always expected to get a site demoted, by:

  • getting too many links with the same anchor text pointing to a page
  • keyword stuffing a page
  • trying to manipulate Google too much on a site
  • creating a “frustrating user experience.”
  • chasing the algorithm too much
  • getting links I shouldn’t have
  • buying links

Not that any of the above is automatically penalised all the time.

I was always of the mind I don’t need to understand the maths or science of Google, that much, to understand what Google engineers want.

The biggest challenge these days is to get trusted sites to link to you, but the rewards are worth it.

To do it, you probably should be investing in some marketable content, or compelling benefits for the linking party (that’s not just paying for links somebody else can pay more for). Buying links to improve rankings WORKS but it is probably THE most hated link building technique as far as the Google web spam team is concerned.

I was very curious about the science and studied what I could, but it left me a little unsatisfied. I learned that building links, creating lots of decent content and learning how to monetise that content better (while not breaking any major TOS of Google) would have been a more worthwhile use of my time.

Getting better and faster at doing all that would be nice too.

There are many problems with blogs, too, including mine.

Misinformation is an obvious one. Rarely are your results conclusive or observations 100% accurate, even if you think a theory holds water on some level. I try to update old posts with new information if I think the page is only valuable with accurate data.

Just remember most of what you read about how Google works from a third party is OPINION and just like in every other sphere of knowledge, ‘facts’ can change with a greater understanding over time or with a different perspective.

QUOTE: “Myths to be careful about..: 1. The NLP used by Google is not Neuro-linguistic Processing. 2. Semantic Search at Google is not powered by Latent Semantic Indexing 3. You cannot optimise pages for Hummingbird, RankBrain, or BERT.” Bill Slawski, 2019

Don’t Chase The Algorithm

QUOTE: “Google’s algorithm is constantly being improved; rather than trying to guess the algorithm and design your page for that, work on creating good, fresh content that users want, and following our guidelines.” Google Webmaster Guidelines, 2020

There is no magic bullet and there are no secret formulas to achieve fast number 1 ranking in Google in any competitive niche WITHOUT spamming Google.

A legitimately earned high position in search engines takes a lot of hard work.

There are a few less talked about tricks and tactics that are deployed by some (better than others) to combat algorithm changes, for instance, but there are no big secrets.

There are clever strategies, though, and creative solutions to be found to exploit opportunities uncovered by researching the niche.

Note that when Google recognises a new strategy that gets results, the strategy itself usually becomes ‘webspam’ and something you can be penalised for, so I would beware of jumping on the latest fad.

The biggest advantage any one provider has over another is experience and resource. The knowledge of what doesn’t work and what will hurt your site is often more valuable than knowing what will give you a short-lived boost. Getting to the top of Google is a relatively simple process – one that is constantly in flux. It is more a collection of skills, methods and techniques, a way of doing things, than a one-size-fits-all magic trick.

After over a decade practising and deploying real campaigns, I’m still trying to get it down to its simplest, most cost-effective processes.

I think it’s about doing simple stuff right.

Good text, simple navigation structure, quality links. To be relevant and reputable takes time, effort and luck, just like anything else in the real world, and that is the way Google wants it.

If a company is promising you guaranteed rankings and has a magic bullet strategy, watch out.

I’d check it didn’t contravene Google’s guidelines.

Keep It Simple, Stupid

Don’t Build Your Site With Flash, HTML Frames or any other deprecated technology. Open web standards are the way forward.

QUOTE: “Don’t use frames to design your website” Shaun Anderson, Hobo 2020

Flash is a proprietary plug-in created by Macromedia/Adobe to deliver (admittedly) fantastically rich media on websites. The W3C advises you avoid the use of such proprietary technology to construct an entire site. Instead, build your site with CSS and HTML, ensuring everyone, including search engine robots, can sample your website content. Then, if required, you can embed media files such as Flash in the HTML of your website.

QUOTE: “Chrome will continue phasing out Flash over the next few years, first by asking for your permission to run Flash in more situations, and eventually disabling it by default. We will remove Flash completely from Chrome toward the end of 2020.” Google Chrome, 2017

Flash, in the hands of an inexperienced designer, can cause all types of problems at the moment, especially with:

  • Accessibility
  • Search Engines
  • Users not having the Plug-In
  • Large Download Times

Flash doesn’t even work at all on some devices, like the Apple iPhone. Note that Google sometimes highlights if your site is not mobile friendly on some devices. And on the subject of mobile-friendly websites – note that Google has alerted the webmaster community that mobile friendliness will be a search engine ranking factor.

HTML5 is the preferred option over Flash these days, for most designers. A site built entirely in Flash will cause an unsatisfactory user experience and will affect your rankings, especially in mobile search results. For similar accessibility and user satisfaction reasons, I would also say don’t build a site with website frames.

As in any form of design, don’t try and re-invent the wheel when simple solutions suffice. The KISS philosophy has been around since the dawn of design.

KISS does not mean boring web pages. You can create stunning sites with smashing graphics – but you should build these sites using simple techniques – HTML & CSS, for instance. If you are new to web design, avoid things like Flash and JavaScript, especially for elements like scrolling news tickers, etc. These elements work fine for TV – but only cause problems for website visitors.

Keep layouts and navigation arrays consistent and simple too. Don’t spend time, effort and money (especially if you work in a professional environment) designing fancy navigation menus if, for example, your new website is an information site.

Same with website optimisation – keep your documents well structured and keep your page Title Elements and text content relevant, use Headings tags sensibly and try and avoid leaving too much of a footprint – whatever you are up to.

How Long Does It Take To See Results from SEO?

We have been told it:

QUOTE: “will need four months to a year to help your business first implement improvements and then see potential benefit.” Maile Ohye, Google 2017

Some results can be gained within weeks and you need to expect some strategies to take months to see the benefit. Google WANTS these efforts to take time. Critics of the search engine giant would point to Google wanting fast, effective rankings to be a feature of Google’s own AdWords sponsored listings.

If you are recovering from previous low-quality activity, it is going to take much longer to see the benefits.

QUOTE: “Even if you make big changes with the design and the functionality, and you add new features and things, I would definitely expect that to take multiple months, maybe half a year, maybe longer, for that to be reflected in search because it is something that really needs to be re-evaluated by the systems overall.. Low-quality pages tend to be recrawled much less frequently by Googlebot – it is not unusual for six months to go by between Googlebot crawls on low quality pages or sections of a site. So it can take an incredibly long time for Google to recrawl and then reevaluate those pages for ranking purposes.” John Mueller, Google, 2020

SEO is not a quick process, and a successful campaign can be judged on months if not years. Most techniques that inflate rankings successfully end up finding their way into Google Webmaster Guidelines – so be wary.

It takes time to build quality, and it’s this quality that Google aims to reward, especially, we are told, during Core Updates:

QUOTE: “There’s nothing wrong with pages that may perform less well in a core update. They haven’t violated our webmaster guidelines nor been subjected to a manual or algorithmic action, as can happen to pages that do violate those guidelines. In fact, there’s nothing in a core update that targets specific pages or sites. Instead, the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.”  Danny Sullivan, Google 2020

It takes time to generate the data needed to begin to formulate a campaign, and time to deploy that campaign. Progress also depends on many factors:

  • How old is your site compared to the top 10 sites?
  • How many backlinks do you have compared to them?
  • How does the quality of their backlinks compare to yours?
  • What is the history of people linking to you (what words have people been using to link to your site)?
  • How good a resource is your site?
  • Can your site attract natural backlinks (e.g. you have good content or are known for what you do), or are you 100% reliant on a third party for backlinks (which is very risky)?
  • How much unique content do you have?
  • Do you have to pay everyone to link to you (which is risky), or do you have a “natural” reason people might link to you?

Google wants to return quality pages in its organic listings, and it takes time to build this quality and for that quality to be recognised.

It also takes time to balance your content, generate quality backlinks and manage your disavowed links.
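
On the subject of disavowed links – the disavow file itself is just a plain-text list uploaded through Google’s Disavow Links tool. Below is a minimal sketch in Python that writes one; the domains and URLs in it are hypothetical placeholders, not real recommendations.

    # A minimal sketch: write a disavow.txt in the format Google’s Disavow Links
    # tool accepts (a "domain:" entry or a full URL per line, "#" for comments).
    # The domains and URLs below are hypothetical examples.
    low_quality_domains = ["spammy-directory.example", "paid-links.example"]
    low_quality_urls = ["http://blog.example/comment-spam-page.html"]

    lines = ["# Disavow file for example.com"]
    lines += [f"domain:{d}" for d in low_quality_domains]  # disavow entire domains
    lines += low_quality_urls                               # disavow individual URLs

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

Only disavow links you are confident are harming your site.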

Google knows how valuable organic traffic is – and they want webmasters investing a LOT of effort in ranking pages.

Critics will point out that the higher the cost of SEO, the more cost-effective Adwords becomes, but Adwords will only get more expensive, too. At some point, if you want to compete online, you’re going to HAVE to build a quality website, with a unique offering to satisfy returning visitors – the sooner you start, the sooner you’ll start to see results.

If you start NOW and are determined to build an online brand, a website rich in content with a satisfying user experience – Google will reward you in organic listings.

The truth is, it’s bound to take maybe a year or two to achieve a dominant position in a very competitive niche. That’s also assuming you are fixing quality issues, improving content quality and improving page experience from the get-go.

ROI

SEO is a marketing activity just like any other and there are no guarantees of success in any, for what should be obvious reasons. There are no guarantees in Google Adwords either, except that costs to compete will go up, of course.

That’s why it is so attractive – but like all marketing – it is still a gamble.

At the moment, I don’t know you, your business, your website, your resources, your competition or your product. Even with all that knowledge, calculating ROI is extremely difficult because ultimately Google decides on who ranks where in its results – sometimes that’s ranking better sites, and sometimes (often) it is ranking sites breaking the rules above yours.

Nothing is absolute in search marketing.

There are no guarantees – despite claims from some companies. What you make from this investment is dependent on many things, not least, how suited your website is to convert visitors into sales.

Every site is different.

Big Brand campaigns are far, far different from small business campaigns that don’t have any links to begin with, to give you but one example.

It’s certainly easier if the brand in question has a lot of domain authority just waiting to be unlocked – but of course, that’s a generalisation, as big brands have big brand competition too.

It depends entirely on the quality of the site in question and the level and quality of the competition, but smaller businesses should probably look to own their niche, even if limited to their location, at first.

Local is always a good place to start for small businesses.

A Real Google Friendly Website

At one time a Google-friendly website meant a website built so Googlebot could scrape it correctly and rank it accordingly.

When I think ‘Google-friendly’ these days – I think a website Google will rank top, if popular and accessible enough, and won’t drop like a f*&^ing stone for no apparent reason one day, even though I followed the Google starter guide to the letter….. just because Google has found something it doesn’t like – or has classified my site as undesirable one day.

It is not JUST about original content anymore – it’s about the function your site provides to Google’s visitors – and it’s about your commercial intent.

I am building sites at the moment with the following in mind…..

  • Don’t be a website Google won’t rank – what Google classifies your site as is perhaps the NUMBER 1 Google ranking factor not often talked about – whether Google determines this algorithmically or, eventually, manually. That is – whether it is a MERCHANT, an AFFILIATE, a RESOURCE or DOORWAY PAGE, SPAM, or VITAL to a particular search – what do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now? Or just the same? Ask why Google should bother ranking your website if it is just the same, rather than why it would not because it is just the same…. How can you make yours different? Better.
  • Think that, one day, your website will have to pass a manual review by ‘Google’ – the better rankings you get, or the more traffic you get, the more likely you are to be reviewed. Know that Google, according to leaked documents at least, classes even some useful sites as spammy. If you want a site to rank high in Google, it had better ‘do’ something other than exist only to link to another site for a paid commission. Know that to succeed, your website needs to be USEFUL to a visitor that Google will send you – and a useful website is not just a website with the sole commercial intent of sending a visitor from Google to another site – a ‘thin affiliate’ as Google CLASSIFIES it.
  • Think about how Google can algorithmically and manually determine the commercial intent of your website – think about the signals that differentiate a real small business website from a website created JUST to send visitors to another website, with affiliate links on every page, for instance. Adverts on your site above the fold, for example, can be a clear indicator of a webmaster’s commercial intent – hence why Google has a Top Heavy Algorithm.
  • Google is NOT going to thank you for publishing lots of similar articles and near-duplicate content on your site – so EXPECT to have to create original content for every page you want to perform in Google, or at least, not publish content found on other sites.
  • Ensure Google knows your website is the origin of any content you produce (typically by simply pinging Google via XML or RSS)
  • Understand and accept why Google ranks your competition above you – they are either more relevant and more popular, more relevant and more reputable, offering a better user experience, or manipulating backlinks better than you. Understand that everyone at the top of Google falls into those categories and formulate your own strategy to compete – relying on Google to take action on your behalf is VERY probably not going to happen.
  • Being ‘relevant’ comes down to keywords & key phrases – in domain names, URLs, Title Elements, the number of times they are repeated in text on the page, text in image alt tags, rich markup and importantly in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you’ll probably trigger spam filters. If it is ‘hidden’ in on-page elements – beware relying on it too much to improve your rankings.
  • The basics haven’t changed for years – though the effectiveness of particular elements has certainly narrowed or changed in usefulness – you should still be focusing on building a simple site using VERY simple best practices – don’t sweat the small stuff, while all the time paying attention to the important stuff – add plenty of unique PAGE TITLES and plenty of new ORIGINAL CONTENT. Understand how Google SEES your website. CRAWL it, like Google does, with (for example) the Screaming Frog spider, and fix malformed links or things that result in server errors (500), broken links (400+) and unnecessary redirects (300+). Each page you want in Google should serve a 200 OK header message (see the status-check sketch after this list).
  • Use common sense – Google is a search engine – it is looking for pages to give searchers as results, and 90% of its users are looking for information. Google itself WANTS the organic results full of information. Almost all websites will link to relevant informative content, so content-rich websites get a lot of links – especially quality links. Google ranks websites with a lot of links (especially quality links) at the top of its search results, so the obvious thing you need to do is ADD A LOT of INFORMATIVE CONTENT TO YOUR WEBSITE.
  • I think ranking in organic listings is a lot about trusted links making trusted pages rank, making trusted links making trusted pages rank ad nauseam for various keywords. Some pages can pass trust to another site; some pages cannot. Some links can. Some cannot. Some links are trusted enough to pass ranking signals to another page. Some are not. YOU NEED LINKS FROM TRUSTED PAGES IF YOU WANT TO RANK AND AVOID PENALTIES & FILTERS.
  • Google engineers are building an AI – but it’s all based on simple human desires to make something happen or indeed to prevent something. You can work with Google engineers or against them. Engineers need to make money for Google but, unfortunately for them, they need to make the best search engine in the world for us humans as part of the deal. Build a site that takes advantage of this. What is a Google engineer trying to do with an algorithm? I always remember it was an idea first before it was an algorithm. What was that idea? Think “like” a Google search engineer when making a website and give Google what it wants. What is Google trying to give its users? Align with that. What does Google not want to give its users? Don’t look anything like that. THINK LIKE A GOOGLE ENGINEER & BUILD A SITE THEY WANT TO GIVE TOP RANKINGS.
  • Google is a link-based search engine. Google doesn’t need content to rank pages, but it needs content to give to users. Google needs to find content, and it finds content by following links, just like you do when clicking on a link. So you first need to make sure you tell the world about your site so that other sites link to yours. Don’t worry about reciprocating to more powerful sites or even real sites – I think this adds to your domain authority – which is better to have than ranking for just a few narrow key terms.
  • Google has a long memory when it comes to links and pages and associations for your site. It can forgive but won’t forget. WHAT RELATIONSHIP DO YOU WANT TO HAVE WITH GOOGLE? Onsite, don’t try to fool Google. Be squeaky clean on-site and have Google think twice about demoting you for the odd discrepancies in your link profile.
  • Earn Google’s trust.
  • Be fast.
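
To make the earlier point about crawling and status codes concrete, here is a minimal sketch in Python (standard library only) that reports the HTTP status code each URL returns. It is not a replacement for a full crawler like Screaming Frog, and the URLs in it are hypothetical placeholders for your own pages.

    # A minimal sketch: report the HTTP status code each URL returns, so you can
    # spot 3xx redirects, 4xx broken pages and 5xx server errors.
    import urllib.error
    import urllib.request

    urls = [
        "https://www.example.com/",
        "https://www.example.com/old-page/",
    ]

    for url in urls:
        request = urllib.request.Request(url, method="HEAD")
        try:
            # Redirects are followed automatically, so a 200 here can hide a 301
            # hop; a full crawler would report the whole redirect chain.
            with urllib.request.urlopen(request, timeout=10) as response:
                print(response.status, url)
        except urllib.error.HTTPError as error:
            print(error.code, url)  # e.g. 404 broken link, 500 server error
        except urllib.error.URLError as error:
            print("ERROR", url, error.reason)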

This is a complex topic, as I said at the beginning of this in-depth article.

Important Google Webmaster Guidelines To Know!

  1. Guidance on building high-quality websites
  2. Main webmaster guidelines
  3. Quality Rater’s Guide (and previous years!)
  4. Link Schemes Warning
  5. Disavow Backlinks Warning
  6. Auto-Generated Content Warning
  7. Affiliate Programs Advice
  8. Report spam, paid links, or malware
  9. Reconsideration requests
  10. List of common manual actions
  11. Use rel=”nofollow” for specific links
  12. Adding A Site To Google
  13. Browser Compatibility Advice
  14. URL Structure Advice
  15. Learn about sitemaps
  16. Duplicate Content
  17. Use canonical URLs
  18. Indicate paginated content
  19. Change page URLs with 301 redirects
  20. How Google Deals With AJAX
  21. Review your page titles and snippets
  22. Meta tags that Google understands
  23. Image Publishing Guidelines
  24. Video best practices
  25. Flash and other rich media files
  26. Learn about robots.txt files
  27. Create useful 404 pages
  28. Introduction to Structured Data
  29. Mark Up Your Content Items
  30. Schema Guidelines
  31. Keyword Stuffing Warnings
  32. Cloaking Warning
  33. Sneaky Redirects Warning
  34. Hidden Text & Links Warnings
  35. Doorway Pages Warnings
  36. Scraped Content Warnings
  37. Malicious Behaviour Warnings
  38. Hacking Warnings
  39. Switching to HTTPS
  40. User Generated Spam Warnings
  41. Social Engineering
  42. Malware and unwanted software
  43. Developing Mobile Sites
  44. Sneaky mobile redirects
  45. Developing mobile-friendly pages
  46. Use HTTP “Accept” header for mobile
  47. Feature phone sitemaps
  48. Multi-regional and multilingual sites
  49. Use hreflang for language and regional URLs
  50. Use a sitemap to indicate alternate language
  51. Locale-aware crawling by Googlebot
  52. Remove information from Google
  53. Move your site (no URL changes)
  54. Move your site (URL changes)
  55. How Google crawls and serves results
  56. Ranking In Google
  57. Search Engine Optimisation
  58. Steps to a Google-friendly site
  59. Webmaster FAQ
  60. Check your site’s search performance

Did I miss any? Let me know.

Submit Your Website To Google Search Console

You won’t be able to operate a legitimate business website effectively without verifying it with Google Search Console.

Fortunately, Google has now made this simple:

QUOTE: “You can submit your website and verify it in Google Search Console. The procedure to connect your website is very simple with a little technical knowledge.” Shaun Anderson, Hobo 2020
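
As a quick sanity check before verifying, here is a minimal sketch in Python, assuming you chose the HTML meta tag verification method (one of several Search Console offers): it fetches your homepage and confirms a google-site-verification meta tag is present. The URL is a placeholder.

    # A minimal sketch: fetch the homepage and check that a google-site-verification
    # meta tag is present before attempting verification in Search Console.
    # Assumes the HTML meta tag method; the URL below is a placeholder.
    import urllib.request

    url = "https://www.example.com/"
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")

    if 'name="google-site-verification"' in html:
        print("Verification meta tag found – try verifying in Search Console.")
    else:
        print("No verification meta tag found on", url)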

Free Ebook

I publish all my tips in a free ebook for my subscribers:

QUOTE: “You can download my free Ebook….” Shaun Anderson, Hobo 2020

Disclaimer

Disclaimer: “Whilst I have made every effort to ensure that the information I have provided is correct, it is not advice; I cannot accept any responsibility or liability for any errors or omissions. The author does not vouch for third party sites. Visit third party sites at your own risk. I am not directly partnered with Google or any other third party. This website uses cookies only for analytics and basic website functions. This article does not constitute legal advice. The author does not accept any liability that might arise from accessing the data presented on this site. Links to internal pages promote my own content.” Shaun Anderson, Hobo