Professional Search Engine Optimisation (SEO) Tutorial – A Beginner’s Guide

Disclosure: “This article is a personal opinion, based on research and my 20 years of experience. There is no third-party advertising on this page or monetised links of any kind. Any external links to third-party sites are moderated by me. Disclaimer.” Shaun Anderson, Hobo

Learn how to SEO a website:

  1. Only add high-quality 100% unique content to your site
  2. Create pages where the main purpose of the page is given priority
  3. Avoid annoying ads and pop-ups (especially on mobile)
  4. Register your website with Google Search Console
  5. Optimise your website’s Core Web Vitals
  6. Have a responsive design
  7. Have a mobile theme that loads in under 3 seconds
  8. Don’t block search engines from crawling your site in robots.txt or meta tags (see the robots.txt example after this list)
  9. Don’t confuse or annoy a website visitor
  10. Don’t block Google from crawling resources on your site or rendering specific elements on your page
  11. Optimise for customers local to your business, if that is important
  12. Geotarget your site in Search Console (you do not need to do this if you have a country-specific domain)
  13. Do not use copied content for pages (or even part of a page) – pages should stand on their own
  14. Use a simple navigation system on your site
  15. Develop websites that meet Google technical recommendations on (for example) canonicalisation, internationalisation and pagination best practices
  16. Ensure on average all ‘Main Content’ blocks of all pages on the site are high-quality
  17. Ensure old-school practices are cleaned up and removed from your site
  18. Avoid implementing old-school practices in new campaigns (Google is better at detecting sites with little value-add)
  19. Consider disavowing any obvious low-quality links from previous link building activity
  20. Provide clear website domain ownership, copyright and contact details on the site
  21. Share your content on the major social networks when it is good enough
  22. Get backlinks from real websites with real domain trust and authority
  23. Do not build unnatural links to your site
  24. Convert visitors (whatever that ‘conversion’ is)
  25. Monitor VERY CAREFULLY any user-generated content on your site, because it is rated as part of your own site content
  26. Pay attention to site security issues (implement secure https, for example)
  27. Beware evaporating efforts (AVOID old-school practices designed to manipulate rankings and REMOVE them when identified)
  28. Ensure erroneous conversion rate optimisation (CRO) practices are not negatively impacting organic traffic levels (don’t screw this part up!)
  29. Aim for a good ratio of ‘useful’ user-centered text to affiliate links. Be careful using affiliate links at all!
  30. Make visitors from Google happy
  31. On-Page, include your keyword phrase at least once in the Page Title Element (see the HTML example after this list)
  32. On-Page, keep page titles succinct and avoid adding irrelevant keywords
  33. On-Page, include your keyword phrase at least once in the Main Content of the page (in paragraph tags)
  34. On-Page, avoid keyword stuffing main content or any specific html element or attribute
  35. On-Page, optimise your meta description to have a clickable useful SERP snippet
  36. On-Page, ensure the Main Content of the page is high-quality and written by a professional (MOST OF YOUR EFFORT GOES HERE – If your content is not being shared organically, you may have a content quality problem)
  37. On-Page, Ensure the keywords you want to rank for are present on your page. The quality of competition for these rankings will determine how much effort you need to put in
  38. On-Page, use synonyms and common co-occurring words throughout your page copy but write naturally
  39. On-Page, add value to pages with ordered lists, images, videos and tables
  40. On-Page, optimise for increased ‘user intent’ satisfaction (e.g. increased dwell times on a page or site)
  41. On-Page, keep important content on the site updated a few times a year
  42. On-Page, trim outdated content
  43. On-Page, avoid publishing and indexing content-poor pages (especially affiliate sites)
  44. On-Page, disclose page modification dates in a visible format
  45. On-Page, do not push the main content down a page unnecessarily with ads or obtrusive CTA for even your own business
  46. On-Page, link to related content on your site with useful and very relevant anchor text
  47. On-Page, avoid keyword stuffing internal anchor text
  48. On-Page, create pages to meet basic W3C recommendations on accessible HTML (H1, ALT text etc)
  49. On-Page, create pages to meet basic usability best practices (Nielsen) – Pay attention to what ‘annoys’ website visitors
  50. On-Page, ensure fast delivery of web pages on mobile and desktop
  51. On-Page, provide clear disclosure of affiliate ads and non-intrusive advertising. Clear disclosure of everything, in fact, if you are focused on quality in all areas.
  52. On-Page, add high-quality and relevant external links (depending on whether the query is informational)
  53. On-Page, if you can, include the Keyword phrase in a short URL
  54. On-Page, use the Keyword phrase in internal anchor text pointing to this page (at least once)
  55. On-Page, use Headings, Lists and HTML Tables on pages to display appropriate data
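
To make a few of the items above concrete (points 8, 15, 31, 33, 35, 48 and 53), here are two short examples. First, the safest robots.txt is one that blocks nothing; a minimal sketch (the sitemap URL is a placeholder):

# robots.txt – allow all search engines to crawl everything (the safe default)
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml

Check, too, that your page templates do not accidentally ship a <meta name="robots" content="noindex"> tag, which removes pages from Google’s index. Second, a minimal HTML page showing where the main on-page elements live (the keyword, copy and URLs are invented placeholders):

<!doctype html>
<html lang="en">
<head>
  <title>Blue Widgets | Acme Widget Co</title> <!-- keyword phrase once, succinct -->
  <meta name="description" content="Hand-made blue widgets, delivered across the UK. Prices, sizes and reviews."> <!-- clickable, useful SERP snippet -->
  <link rel="canonical" href="https://www.example.com/blue-widgets/"> <!-- short keyword URL, canonicalised -->
</head>
<body>
  <h1>Blue Widgets</h1> <!-- one clear heading describing the main content -->
  <p>Our hand-made blue widgets are…</p> <!-- keyword phrase at least once in paragraph copy -->
  <img src="blue-widget.jpg" alt="Hand-made blue widget, 10cm"> <!-- descriptive ALT text -->
</body>
</html>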

What is SEO?

Search engine optimisation (SEO) is a technical, analytical and creative process to improve the visibility of a website in search engines. In simple terms, this is all about getting free traffic from Google, the most popular search engine in the world.

The tips I present on this page will help you create a successful search engine-friendly website yourself.

I have 20 years of experience as a professional marketer. This SEO tutorial is a collection of tips and best practices I use (and have used for years) to rank websites in Google (and Bing).

If you find it of use, please share it.

How To Learn SEO

The best way to learn is to practise on a real website. You keep up with the latest industry news and follow search engine webmaster guidelines. You edit a website that ranks in search engines and watch how search engines respond to your changes. You monitor organic search engine traffic. You track rankings for individual keywords and pages. You do lots of tests.

QUOTE: “There aren’t any quick magical tricks…. so that your site ranks number one. It’s important to note that any …. potential is only as high as the quality of your business or website so successful SEO helps your website put your best foot forward.” Maile Ohye, Google 2017

What really matters is what you prioritise today so that in 3-6 months you can see improvements in the quality of your organic traffic, as we:

QUOTE: “will need four months to a year to help your business first implement improvements and then see potential benefit.” Maile Ohye, Google 2017

You will need to meet Google’s guidelines and recommendations in every area (and, if you are like me with this site, you eventually avoid bending any rule and just focus on serving the user useful and up-to-date content).

QUOTE: “I don’t think there’s one magic trick that that I can offer you that will make sure that your website stays relevant in the ever-changing world of the web so that’s something where you’ll kind of have to monitor that on your side and work out what makes sense for your site or your users or your business.” John Mueller, Google 2019

Introduction

Search engine optimisation is:

QUOTE: “often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but…. they could have a noticeable impact on your site’s user experience and performance in organic search results.” Google Starter Guide, 2020

This article is a beginner’s guide.

I deliberately steer clear of techniques that might be ‘grey hat’, as what is grey today is often ‘black hat’ tomorrow, or ‘shady practices’, as far as Google is concerned.

QUOTE: “Shady practices on your website […] result in a reduction in search rankings” Maile Ohye, Google 2017

No one-page guide can explore this complex topic in full. What you’ll read here are answers to questions I had when I was starting out in this field 20 years ago, now corroborated with confirmations from Google.

QUOTE: “My strongest advice ….. is to request if they corroborate their recommendation with a documented statement from Google” Maile Ohye, Google 2017

You will find I do.

The ‘Rules.’

Google insists webmasters adhere to their ‘rules’ and aims to reward sites with high-quality content and remarkable ‘white hat’ web marketing techniques with high rankings.

QUOTE: “Creating compelling and useful content will likely influence your website more than any of the other factors discussed here.” Google SEO Starter Guide, 2020

Conversely, it also needs to filter or penalise websites that manage to rank in Google by breaking these rules.

QUOTE: “Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.” Gary Illyes, Google 2016

These rules are not ‘laws’, but ‘guidelines’ for ranking in Google, laid down by Google. You should note, however, that some methods of ranking in Google are, in fact, illegal. Hacking, for instance, is illegal in many countries.

You can choose to follow and abide by these rules, bend them or ignore them – all with different levels of success (and levels of retribution, from Google’s web spam team).

White hats do it by the ‘rules’; black hats ignore the ‘rules’.

What you read in this article is perfectly within the laws and also within the guidelines and will help you increase the traffic to your website through organic, or natural search engine results pages (SERPs).

Search Engine Market Share Worldwide

Opportunity

QUOTE: “It is the process of getting traffic from the “free,” “organic,” “editorial” or “natural” search results on search engines.” Search Engine Land, 2020

A professional, whether they practise in India, Asia, the Middle East or Europe, has an understanding of how search engine users search for things, and of what type of results Google wants to (or will) display to its users and under which conditions.

An SEO:

QUOTE: “is someone trained to improve your visibility on search engines.” Google Webmaster Guidelines, 2020

A professional has an understanding of how search engines like Google generate their natural SERPs to satisfy users’ navigational, informational and transactional keyword queries.

QUOTE: “One piece of advice I tend to give people is to aim for a niche within your niche where you can be the best by a long stretch. Find something where people explicitly seek YOU out, not just “cheap X” (where even if you rank, chances are they’ll click around to other sites anyway).” John Mueller, Google 2018

Risk Management

QUOTE: “Google doesn’t accept payment to crawl a site more frequently, or rank it higher. If anyone tells you otherwise, they’re wrong.” Google Webmaster Guidelines, 2020

A good search engine marketer has a good understanding of the short term and long term risks involved in optimising rankings in search engines, and an understanding of the type of content and sites Google (especially) WANTS to return in its natural SERPs.

The aim of any campaign is more visibility in search engines and this would be a simple process if it were not for the many pitfalls.

There are rules to be followed or ignored, risks to take, gains to make, and battles to be won or lost.

Free Traffic

QUOTE: “Google is “the biggest kingmaker on this Earth.” Amit Singhal, Google, 2010

A Mountain View spokesman once called the search engine a ‘kingmaker’, and that’s no lie.

Ranking high in Google is VERY VALUABLE – it’s effectively ‘free advertising’ on the best advertising space in the world.

Traffic from Google natural listings is STILL the most valuable organic traffic to a website in the world, and it can make or break an online business.

The state of play is that you can STILL generate highly targeted leads, for FREE, just by improving your website and optimising your content to be as relevant as possible for a buyer looking for your company, product or service.

As you can imagine, there’s a LOT of competition now for that free traffic – even from Google (!) in some niches.

You shouldn’t compete with Google.

You should focus on competing with your competitors.

The Process

Google supports a technical approach and supports us when we:

QUOTE: “provide useful services for website owners, including: Review of your site content or structure – Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript – Content development – Management of online business development campaigns – Keyword research – …training – Expertise in specific markets and geographies.” Google Webmaster Guidelines, 2020

The process can be practised, successfully, in a bedroom or a workplace, but it has traditionally always involved mastering many skills as they arose, including diverse marketing technologies such as (but not limited to):

  • Website design
  • Accessibility
  • Usability
  • User experience
  • Website development
  • PHP, HTML, CSS, etc.
  • Server management
  • Domain management
  • Copywriting
  • Spreadsheets
  • Backlink analysis
  • Keyword research
  • Social media promotion
  • Software development
  • Analytics and data analysis
  • Information architecture
  • Research
  • Log Analysis
  • Looking at Google for hours on end

It takes a lot to rank a page on merit in Google in competitive niches, due to the amount of competition for those top spots.

Pagerank

Pagerank is only one of hundreds of signals Google uses to rank web pages:

QUOTE: “Google probably uses an updated version of Pagerank like the Reasonable Surfer and Trusted Seeds approach. Pagerank is a measure of the popularity of a resource on the web. The simple, most practical answer to increasing Google Pagerank is to get prominent, natural links on authoritative websites in your niche.” Shaun Anderson, Hobo 2020
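
Pagerank, at its core, is a simple iterative calculation over the link graph. Below is a minimal sketch of the classic public formulation (power iteration with a damping factor) – purely illustrative, as Google’s production systems have long since evolved beyond it:

# Minimal sketch of classic PageRank (power iteration with damping).
# Illustrative only - not Google's production algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if outlinks:
                # distribute this page's rank across its outgoing links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # dangling page: spread its rank evenly across all pages
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# Page C scores highest because both A and B link to it.
print(pagerank({"A": ["C"], "B": ["A", "C"], "C": ["A"]}))

The practical upshot matches the quote above: the way to increase Pagerank is to earn links from pages that themselves have rank.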

Evaporating SEO Efforts

Google ignores lots of your lower-quality efforts, and as Google evolves and makes very slight changes to its algorithms to further protect its systems, everybody’s rankings fluctuate.

When you spam a page or element of a page, you can expect the benefit you received from it to be at some point negated (if not penalised).

There are many examples of this… for instance, with Keyword stuffing:

QUOTE: “Let’s say you figure out if you put 10,000 times the word “pony” on your page, you rank better for all queries. What Panda does is disregard the advantage you figure out, so you fall back where you started.” Gary Illyes, Google 2017

and low-quality link building:

QUOTE: “(Penguin) doesn’t demote it will just discard the incoming spam toward the side and it will just ignore the spam and that’s it no penalty no demotion and it works in real time” Gary Illyes, Google 2016

and when we started manipulating pagerank sculpting, Google changed the way Pagerank flowed:

QUOTE: “What happens when you have a page with “ten PageRank points” and ten outgoing links, and five of those links are nofollowed? Let’s leave aside the decay factor to focus on the core part of the question. Originally, the five links without nofollow would have flowed two points of PageRank each (in essence, the nofollowed links didn’t count toward the denominator when dividing PageRank by the outdegree of the page). More than a year ago, Google changed how the PageRank flows so that the five links without nofollow would flow one point of PageRank each.” Matt Cutts, Google 2009
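
To make the arithmetic in that quote concrete, here is a toy calculation of both behaviours (decay factor ignored, as in the quote):

# Toy illustration of the PageRank-sculpting change Cutts describes.
page_rank_points = 10
total_links = 10
nofollowed = 5
followed = total_links - nofollowed

# Originally: nofollowed links were excluded from the denominator,
# so the 5 followed links split all 10 points between them.
old_flow_per_link = page_rank_points / followed      # 2.0 points each

# After the change: every link counts in the denominator, and the
# nofollowed links' share simply evaporates.
new_flow_per_link = page_rank_points / total_links   # 1.0 point each

print(old_flow_per_link, new_flow_per_link)  # 2.0 1.0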

I also know of a few evaporating black spots in on-page elements, too.

For instance, in one recent test, if a page title element was longer than 12 words, all the keywords beyond the 12th carried less weight than when they appeared in the visible page text.

Image ALT text has a black spot, too. Add more than 16 words to ALT text and the remaining words… evaporate. Those keywords don’t count, at all.

Anchor text, too, has a limit of 16 words and a black spot for keywords contained beyond the 16th word.
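
A quick way to audit for these black spots is a simple word count. A rough sketch (the limits come from my tests above, not from anything Google has published):

# Flag text that runs past the reported word-count 'black spots':
# titles over 12 words, ALT text and anchor text over 16 words.
def flag_overlong(text, limit, label):
    words = text.split()
    if len(words) > limit:
        print(f"{label}: {len(words)} words; these may carry no weight:",
              " ".join(words[limit:]))

flag_overlong("An example page title that runs on and on far past the reported limit", 12, "Title")
flag_overlong("an example of very long image alt text " * 3, 16, "ALT text")
flag_overlong("an example of very long internal anchor text " * 3, 16, "Anchor text")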

Google has limits. Expect them to change, or be modified.

The more you spam elements of your page or link profile, the more Google’s algorithms go to work to “disregard the advantage you figure out, so you fall back where you started“. The more you take optimisation to the nth degree, the more susceptible you are to algorithm changes when things change.

If you are consistently and successfully abusing the system, and are caught, Google algorithms will flag your site for manual review, and a potential penalty called a “manual action”, where you will need to clean up the offending issues before you can rank high again.

But mostly, Google prefers to just ignore and devalue your low-quality efforts, as long as they are not too malicious.

You do not want to “chase the Google algorithm”.

Focus on making something useful that attracts high quality links to it instead.

This is pure speculation, but:

QUOTE: “The manual actions team… can look at the labels on the links a site gets. Basically, we have tons of link labels; for example, it’s a footer link, basically, that has a lot lower value than an in-content link. Then another label would be a Penguin real-time label. If they see that most of the links are Penguin real-time labelled, then they might actually take a deeper look and see what the content owner is trying to do.” Gary Illyes, Google 2016

This quote reminded me of a discussion on a blackhat forum back in 2011, when the GSA (Google Search Appliance) was discussed: someone got their hands on one, and some attempted to hack Google’s code.

What they reported back was…

QUOTE: “tons of link labels”

e.g.

_LinkTags_NAMES = {
    0: "PAGE_GUESTBOOK",
    16: "PAGE_FORUM",
    2: "PAGE_MAILING_LIST",
    3: "PAGE_BLOG_COMMENTS",
    4: "PAGE_BLOG",
    5: "PAGE_PPC",
    19: "PAGE_SPAM_SIGNATURE_KILL",
    21: "PAGE_SPAM_SIGNATURE_NOPROP",
    7: "PAGE_AFFILIATE",

It makes sense that a labelling process like this is how you create a search engine results page out of the pages Pagerank (or something akin to it) identifies: you label spam, identify monetisation trends and promote content-first, user-friendly pages above others. You can also imagine that, over time, Google should get a lot better at working out quality SERPs for its users as it identifies more and more NEGATIVE ranking signals, floating higher-quality pages to the top as a second-order effect. The end result could be that Google gets an amazing SERP for its users.
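
Purely to illustrate that speculation, here is a toy model of how per-link labels might translate into link value. Every weight below is invented for the example; nothing here is a known Google number:

# Invented weights, purely to illustrate the 'link labels' speculation above.
LABEL_WEIGHTS = {
    "IN_CONTENT": 1.0,
    "FOOTER": 0.2,            # "a lot lower value than an in-content link"
    "PENGUIN_REALTIME": 0.0,  # spam-labelled links simply ignored
}

# (label, number of links) pairs for an imaginary site
incoming_links = [("IN_CONTENT", 5), ("FOOTER", 40), ("PENGUIN_REALTIME", 200)]

score = sum(LABEL_WEIGHTS[label] * count for label, count in incoming_links)
print(score)  # 13.0 - the 200 spam-labelled links add nothing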

Google might have labels for EVERYTHING – low quality, spam, intent, site speed, ad placement, YMYL page, information page etc. – and build out scoring principles and sub-labels for each label.

In one example, they might decide not to show any affiliate sites on page 1 – or to show only one.

All the above is speculation, of course.

What can be agreed upon is that we do not want a LOW QUALITY label on our pages OR links, whether applied by an algorithm OR a manual rater.

At least one former spam fighter has confirmed that at least some people are employed at Google to demote sites that fail to meet policy:

QUOTE: “I didn’t SEO at all, when I was at Google. I wasn’t trying to make a site much better but i was trying to find sites that were not ‘implementing Google policies'(?*) and not giving the best user experience.” Murat Yatağan, Former Google Webspam team, 2016

I think I see more of Google pulling pages and sites down the rankings (because of policy violation) than promoting them because of discovered ‘quality’. Perhaps the human quality raters are there to highlight quality. We don’t know what Google does with all this information.

I proceed thinking that in Google’s world, a site that avoids punishment algorithms, has verified independent links and has content favoured by users over time (which they are tracking) is a ‘quality page’ Google will rank highly.

So, for the long term, on primary sites, once you have cleaned all infractions up, the aim is to satisfy users by:

  • getting the click
  • keeping people on your site by delivering on purpose and long enough for them to terminate their search (without any techniques that would be frowned upon or contravene Google recommendations)
  • convert visitors to at least search terminators, returning visitors or actual sales

What Is A Successful Strategy?

Get relevant. Get trusted. Get popular. Help a visitor complete their task. Do not annoy users. Do not put CRO before a user’s enjoyment of content, e.g. do not interrupt the MC (Main Content) of a page with Ads (adverts) or CTAs (calls to action – in effect, adverts for your own business).

It is no longer just about manipulation.

Success comes from adding high-quality, useful content to your website that together meets a PURPOSE and delivers USER SATISFACTION over the longer term.

If you are serious about getting more free traffic from search engines, get ready to invest time and effort in your website and online marketing.

Be ready to put Google’s users, and yours, FIRST, before Conversion, especially on information-type pages (articles, blog posts etc).

I call this a Content First Strategy.

In short, it was too easy (for some) to manipulate Google’s rankings at the beginning of the decade. If you had enough ‘domain authority‘ you could use ‘thin content‘ to rank for anything. This is the definition of a ‘content farm‘.

Web spammers often used ‘unnatural‘ backlinks to build fake ‘domain authority‘ to rank this ‘thin‘ content.

I know I did, in the past.

Google Continually Raises The Bar For Us All To Meet

QUOTE: “Another problem we were having was an issue with quality and this was particularly bad (we think of it as around 2008 2009 to 2011) we were getting lots of complaints about low-quality content and they were right. We were seeing the same low-quality thing but our relevance metrics kept going up and that’s because the low-quality pages can be very relevant. This is basically the definition of a content farm in our in our vision of the world so we thought we were doing great our numbers were saying we were doing great and we were delivering a terrible user experience and turned out we weren’t measuring what we needed to so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality it’s not the same as relevance …. and it enabled us to develop quality related signals separate from relevant signals and really improve them independently so when the metrics missed something what ranking engineers need to do is fix the rating guidelines… or develop new metrics.” Paul Haahr, Google 2016

Google decided to rank HIGH-QUALITY documents in its results and force those who wish to rank high to invest in higher-quality content or a great customer experience that creates buzz and attracts editorial links from reputable websites.

These high-quality signals are in some way based on Google being able to detect a certain amount of attention and effort put into your site and Google monitoring over time how users interact with your site.

These types of quality signals are much harder to game than they were in 2011, for instance.

Essentially, the ‘agreement’ with Google is that if you’re willing to add a lot of great content to your website and create a buzz about your company, Google will rank you high above others who do not invest in this endeavour.

QUOTE: “high quality content is something I’d focus on. I see lots and lots of ….blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2015

If you try to manipulate Google, it will penalise you for a period, and often until you fix the offending issue – which we know can LAST YEARS.

If you are a real business that intends to build a brand online and rely on organic traffic – you can’t use black hat methods. Full stop. It can take a LONG time for a site to recover from using black hat tactics, and fixing the problems will not necessarily bring organic traffic back to what it was before a penalty.

QUOTE: “Cleaning up these kinds of link issue can take considerable time to be reflected by our algorithms (we don’t have a specific time in mind, but the mentioned 6-12 months is probably on the safe side). In general, you won’t see a jump up in rankings afterwards because our algorithms attempt to ignore the links already, but it makes it easier for us to trust the site later on.” John Mueller, Google, 2018

Recovery from a Google penalty is a ‘new growth’ process as much as it is a ‘clean-up’ process.

Google Rankings Always Change

Google Rankings Are In Constant Ever-Flux

QUOTE: “Things can always change in search.” John Mueller, Google 2017

Graph: Typical Google rankings 'ever flux'

Put simply – it’s Google’s job to MAKE MANIPULATING SERPs HARD. It’s HARD to get to number 1 in Google for competitive keyword phrases.

So – the people behind the algorithms keep ‘moving the goalposts’, modifying the ‘rules’ and raising ‘quality standards’ for pages that compete for top ten rankings.

We have constant ever-flux in the SERPs – and that seems to suit Google and keep everybody guessing.

Fluctuating Rankings

QUOTE: “nobody will always see ranking number three for your website for those queries. It’s always fluctuating.” John Mueller, Google, 2015

This flux is not necessarily a sign of a problem per se, and:

QUOTE: “fluctuations in search are normal and a sign that our algorithms & engineers are working hard.” John Mueller, Google 2016

Fluctuating upwards could be a good sign, as he mentioned: “maybe this is really relevant for the first page, or maybe not.” Then again, one would expect the converse to be true as well.

He says “a little bit of a push to make your website a little bit better, so that the algorithms can clearly say, yes, this really belongs on the first page”, which I thought was an interesting turn of phrase: ‘first page’, rather than ‘number 1’.

Google is very secretive about its ‘secret sauce’ and offers sometimes helpful and sometimes vague advice – and some say offers misdirection – about how to get more valuable traffic from Google.

Google is on record as saying the engine is intent on ‘frustrating’ attempts to improve the amount of high-quality traffic to a website – at least (but not limited to) – using low-quality strategies classed as web spam.

QUOTE: “If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.” Matt Cutts, Google 2013

At its core, Google search engine optimisation is still about keywords and links. It’s about relevance, reputation and trust. It is about the quality and intent of pages, visitor satisfaction & increased user engagement. It is about users seeking your website out and completing the task they have to complete.

Good overall user experience is a key to winning – and keeping – the highest rankings in many verticals.

Relevance, Authority & Trust

QUOTE: “Know that ‘content’ and relevance’ are still primary.” Maile Ohye, Google 2010

Web page optimisation is STILL about making a web page relevant and trusted enough to rank for any given search query.

It’s about ranking for valuable keywords for the long term, on merit. You can play by the ‘white hat’ rules laid down by Google, and aim to build this authority and trust naturally, over time, or you can choose to ignore the rules and go full-time ‘black hat’.

MOST tactics still work, for some time, on some level, depending on who’s doing them, and how the campaign is deployed.

Whichever route you take, know that if Google catches you trying to modify your rank using overtly obvious and manipulative methods, then it will class you as a web spammer, and your site will be penalised (you will not rank high for relevant keywords).

QUOTE: “Those practices, referred to in the patent as “rank-modifying spamming techniques,” may involve techniques such as: Keyword stuffing, Invisible text, Tiny text, Page redirects, Meta tags stuffing, and Link-based manipulation.” Bill Slawski, 2012

These penalties can last years if not addressed, as some penalties expire and some do not – and Google wants you to clean up any violations – even historic violations.

QUOTE: “If parts of your website don’t comply with our webmaster guidelines, and you want to comply with our webmaster guidelines, then it doesn’t matter how old those non-compliant parts are.” John Mueller, Google 2020

Google does not want it to be easy for you to modify where you rank. Critics would say Google would prefer you paid them to do that, using Google Adwords.

The problem for Google is – ranking high in Google organic listings is a real social proof for a business, a way to avoid PPC costs and still, simply, the BEST WAY to drive VALUABLE traffic to a site.

It’s FREE, too, once you’ve met the always-increasing criteria it takes to rank top.

UX; ‘User Experience‘ Does Matter

Is User Experience A Ranking Factor?

User experience is mentioned many times in the Quality Raters Guidelines, but we have been told by Google it is not, per se, a classifiable ‘ranking factor’ – on desktop search, at least.

QUOTE: “On mobile, sure, since UX is the base of the mobile friendly update. On desktop currently no.” Gary Illyes, Google, May 2015

While UX (User Experience), we are told, is not literally a ‘ranking factor’, it is useful to understand exactly what Google calls a ‘poor user experience‘ because if any poor UX signals are identified on your website, that is not going to be a healthy thing for your rankings anytime soon.

Matt Cutts’ (no longer with Google) consistent advice was to focus on a very satisfying user experience.

What is Bad UX?

For Google – rating UX, at least from a quality rater’s perspective, revolves around marking the page down for:

  • misleading or potentially deceptive design
  • sneaky redirects
  • malicious downloads
  • spammy user-generated content (unmoderated comments and posts)
  • low-quality MC (main content of the page)
  • low-quality or distracting SC (supplementary content)

In short, nobody is going to advise you to create a poor UX on purpose, in light of Google’s algorithms and human quality raters, who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although on certain levels what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way that Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.

Google is still, evidently, more interested in rating the main content of the webpage in question and the reputation of the domain the page is on – relative to your site, and competing pages on other domains.

A satisfying UX can help your rankings, with second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google’s punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria, e.g. lack of reputation, or old-school stuff like keyword stuffing a site.

If you are improving user experience by focusing primarily on the quality of the MC of your pages, and avoiding – even removing – old-school techniques, those are certainly positive steps to getting more traffic from Google – and the type of content performance Google rewards is, in the end, largely about a satisfying user experience.

When I hear Google talking about user experience, I often wonder…. is Google talking about AD EXPERIENCE?

In some cases, I think so!

QUOTE: “At Google we are aiming to provide a great user experience on any device, we’re making a big push to ensure the search results we deliver reflect this principle.” Google 2014

There is no single ‘user experience‘ ranking factor, we have been told, however poor user experience clearly does not lead to high rankings in Google.

QUOTE: “I don’t think we even see what people are doing on your website if they’re filling out forms or not if they’re converting to actually buying something so if we can’t really see that then that’s not something that we’d be able to take into account anyway. So from my point of view that’s not something I’d really treat as a ranking factor. Of course if people are going to your website and they’re filling out forms or signing up for your service or for a newsletter then generally that’s a sign that you’re doing the right things.”. John Mueller, Google 2015

Note what Google labels ‘user experience‘ may differ from how others define it.

Take for instance a slow page load time, which is a poor user experience:

QUOTE: “We do say we have a small factor in there for pages that are really slow to load where we take that into account.” John Mueller, Google, 2015

or sites that don’t have much content “above-the-fold”:

QUOTE: “So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.” Google 2012

These are user experience issues Google penalises for, but as another Google spokesperson pointed out:

QUOTE: “Rankings is a nuanced process and there is over 200 signals.” Maile Ohye, Google 2010

If you expect to rank in Google organic listings you’d better have a quality offering, not based entirely on manipulation, or old school tactics.

Is a visit to your site a good user experience? Is it performing its task better than the competition?

If not – beware the manual ‘Quality Raters’ and beware the Google Panda/Site Quality algorithms that are looking to demote sites based on what we can easily point to as poor user experience signals and unsatisfying content when it is presented to Google’s users.

Google raising the ‘quality bar’, year on year, ensures a higher level of quality in online marketing in general (above the very low quality we’ve seen over recent years).

Success in organic marketing involves investment in higher-quality on-page content, better website architecture, improved usability, an intelligent conversion optimisation balance, and ‘legitimate’ internet marketing techniques.

QUOTE: “If you make a good website that works well for users then indirectly you can certainly see an effect in ranking.” John Mueller,  Google 2019

If you don’t take that route, you’ll find yourself chased down by Google’s algorithms at some point in the coming year.

This guide (and this entire website) is not about the churn-and-burn type of webspam, as that is too risky to deploy on a real business website – and because deploying webspam is:

QUOTE: “like walking into a dark alley, packed with used car salesmen, who won’t show you their cars.” Matt Cutts, Google 2014

UX; User Experience Across Multiple Devices & Screen Resolutions MATTERS

User Experience is important.

When designing a web page, you are going to have to consider where certain elements appear on that page, especially advertisements.

Google will (very infrequently, in my experience) tell you if the ads on your website are annoying users, which may impact the organic traffic Google sends you.

Annoying ads on your web pages have long been a problem for users (probably) and for Google, too. Even if they do make you money.

What ads are annoying?

  • ‘ the kind that blares music unexpectedly ‘ or
  • ‘ a pop-up on top of the one thing we’re trying to find ‘

Apparently ‘frustrating experiences can lead people to install ad blockers and when ads are blocked publishers can’t make money’.

The video goes on to say:

QUOTE: “a survey of hundreds of ad experiences by the Coalition for Better Ads has shown that people don’t hate all ads just annoying ones eliminating these ads from your site can make a huge difference” Google, 2017

The New Ad Experience Report In Google Search Console

The Ad Experience Report is part of Google Search Console, but I would NOT wait for any notification from Google before you review your own pages and determine if you have too many ads on your page.

The report:

QUOTE: “makes it easy to find annoying ads on your site and replace them with user-friendly ones.” Google, 2017

I think there are ranking algorithms out there that deal with too many ads on the page.

Avoid them by NOT having too many ads on the page.

UX; Which type of Adverts Annoys Users?

Google states:

QUOTE: “The Ad Experience Report is designed to identify ad experiences that violate the Better Ads Standards, a set of ad experiences the industry has identified as being highly annoying to users. If your site presents violations, the Ad Experience Report may identify the issues to fix.” Google Webmaster Guidelines 2020

The Better Ads Standards people are focused on the following annoying ads:

Desktop Web Experiences

Which type of ads on desktop devices annoy users the most?

  • Pop-up Ads
  • Auto-playing Video Ads with Sound
  • Prestitial Ads with Countdown
  • Large Sticky Ads

Mobile Web Experiences

Which type of ads on mobile devices annoy users the most?

  • Pop-up Ads
  • Auto-playing Video Ads with Sound
  • Prestitial Ads
  • Postitial Ads with Countdown
  • Ad Density Higher Than 30% (see the sketch after this list)
  • Full-screen Scroll through Ads
  • Flashing Animated Ads
  • Large Sticky Ads
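
On the ad density point above: a toy calculation based on my simplified reading of the Better Ads Standards threshold (ads should not take up more than 30% of the vertical height of the content a user scrolls through; all figures are invented examples):

# Toy ad-density arithmetic (example figures only).
ad_heights_px = [250, 250, 600]   # heights of the ads on the page
main_content_height_px = 3200     # total scrollable content height

ad_density = sum(ad_heights_px) / main_content_height_px * 100
print(f"Ad density: {ad_density:.0f}%")  # 34% - over the 30% threshold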

Google says in the video:

QUOTE: “Fixing the problem depends on the issue you have. For example, if it’s a pop-up, you’ll need to remove all the pop-up ads from your site. But if the issue is high ad density on a page, you’ll need to reduce the number of ads. Once you fix the issues, you can submit your site for a re-review. We’ll look at a new sample of pages and may find ad experiences that were missed previously. We’ll email you when the results are in.” Google, 2017

Google offers some solutions to using pop-ups, if you are interested:

QUOTE: “In place of a pop-up try a full-screen inline ad. It offers the same amount of screen real estate as pop-ups without covering up any content. Fixing the problem depends on the issue you have for example if it’s a pop-up you’ll need to remove all the pop-up ads from your site but if the issue is high ad density on a page you’ll need to reduce the number of ads” Google, 2017

Your Website Will Receive A LOW RATING If It Has Annoying Or Distracting Ads Or Annoying Supplementary Content (SC)

Google has long warned about web page advertisements and distractions on a web page that result in a poor user experience.

The following specific examples are taken from the Google Search Quality Evaluator Guidelines 2017.

6.3 Distracting/Disruptive/Misleading Titles, Ads, and Supplementary Content

QUOTE: “Some Low-quality pages have adequate MC (main content on the page) present, but it is difficult to use the MC due to disruptive, highly distracting, or misleading Ads/SC. Misleading titles can result in a very poor user experience when users click a link only to find that the page does not match their expectations.” Google Search Quality Evaluator Guidelines 2015

6.3.1 Ads or SC that disrupt the usage of MC

QUOTE: “We expect Ads and SC to be visible. However, some Ads, SC, or interstitial pages (i.e., pages displayed before or after the content you are expecting) make it difficult to use the MC. Pages with Ads, SC, or other features that distract from or interrupt the use of the MC should be given a Low rating.” Google Search Quality Evaluator Guidelines 2019

Google gave some examples:

  • QUOTE: “Ads that actively float over the MC as you scroll down the page and are difficult to close. It can be very hard to use MC when it is actively covered by moving, difficult-to-close Ads.”
  • QUOTE: “An interstitial page that redirects the user away from the MC without offering a path back to the MC.”

6.3.2 Prominent presence of distracting SC or Ads

Google said:

QUOTE: “Users come to web pages to use the MC. Helpful SC and Ads can be part of a positive user experience, but distracting SC and Ads make it difficult for users to focus on and use the MC.

Some webpages are designed to encourage users to click on SC that is not helpful for the purpose of the page. This type of SC is often distracting or prominently placed in order to lure users to highly monetized pages.

Either porn SC or Ads containing porn on non-Porn pages can be very distracting or even upsetting to users. Please refresh the page a few times to see the range of Ads that appear, and use your knowledge of the locale and cultural sensitivities to make your rating. For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells bathing suits. However, an extremely graphic porn ad may warrant a Low (or even Lowest) rating.” Google Search Quality Evaluator Guidelines 2017

6.3.3 Misleading Titles, Ads, or SC

Google said:

QUOTE: “It should be clear what parts of the page are MC, SC, and Ads. It should also be clear what will happen when users interact with content and links on the webpage. If users are misled into clicking on Ads or SC, or if clicks on Ads or SC leave users feeling surprised, tricked or confused, a Low rating is justified.

  • At first glance, the Ads or SC appear to be MC. Some users may interact with Ads or SC, believing that the Ads or SC is the MC.
  • Ads appear to be SC (links) where the user would expect that clicking the link will take them to another page within the same website, but actually take them to a different website. Some users may feel surprised or confused when clicking SC or links that go to a page on a completely different website.
  • Ads or SC that entice users to click with shocking or exaggerated titles, images, and/or text. These can leave users feeling disappointed or annoyed when they click and see the actual and far less interesting content.
  • Titles of pages or links/text in the SC that are misleading or exaggerated compared to the actual content of the page. This can result in a very poor user experience when users read the title or click a link only to find that the page does not match their expectations. “ Google Search Quality Evaluator Guidelines 2017

The important thing to know here is:

QUOTE: “Summary: The Low rating should be used for disruptive or highly distracting Ads and SC. Misleading Titles, Ads, or SC may also justify a Low rating. Use your judgment when evaluating pages. User expectations will differ based on the purpose of the page and cultural norms.” Google Search Quality Evaluator Guidelines 2017

… and that Google does not send free traffic to sites it rates as low quality.

QUOTE: “Important: The Low rating should be used if the page has Ads, SC, or other features that interrupt or distract from using the MC.” Google Search Quality Evaluator Guidelines 2019

Recommendation: Remove annoying ADS/SC/CTA from your site. Be extremely vigilant that your own CTA for your own service doesn’t get in the way of a user consuming the main content either.

I believe your own CTAs (calls-to-action) on your own site can be treated much like Ads and SC are, and are open to the same abuse and the same punishments that Ads are, depending on the page type we are talking about.

Balancing Conversions With Usability & User Satisfaction

Take pop-up windows or window pop-unders as an example:

According to usability expert Jakob Nielsen, 95% of website visitors hated unexpected or unwanted pop-up windows, especially those that contain unsolicited advertising.

In fact, Pop-Ups have been consistently voted the Number 1 Most Hated Advertising Technique since they first appeared many years ago.

QUOTE: “Some things don’t change — users’ expectations, in particular. The popups of the early 2000s have reincarnated as modal windows, and are hated just as viscerally today as they were over a decade ago. Automatically playing audio is received just as negatively today. The following ad characteristics remained just as annoying for participants as they were in the early 2000s: Pops up – Slow loading time – Covers what you are trying to see – Moves content around – Occupies most of the page – Automatically plays sound.” Therese Fessenden, Nielsen Norman Group 2017

Website accessibility aficionados will point out:

  • creating a new browser window should be the authority of the user
  • pop-up new windows should not clutter the user’s screen
  • all links should open in the same window by default (an exception, however, may be made for pages containing a links list – it is convenient in such cases to open links in another window so that the user can come back to the links page easily; even then, it is advisable to give the user prior notice that links will open in a new window)
  • tell visitors they are about to invoke a pop-up window (using the link <title> attribute)
  • pop-up windows do not work in all browsers
  • they are disorienting for users
  • provide the user with an alternative

It is, however, an inconvenient truth for accessibility and usability aficionados to hear that pop-ups can be used successfully to vastly increase subscription sign-up conversions.

QUOTE: “While, as a whole, web usability has improved over these past several years, history repeats and designers make the same mistakes over and over again. Designers and marketers continuously need to walk a line between providing a good user experience and increasing advertising revenue. There is no “correct” answer or golden format for designers to use in order to flawlessly reach audiences; there will inevitably always be resistance to change and a desire for convention and predictability. That said, if, over the course of over ten years, users are still lamenting about the same problems, it’s time we start to take them seriously.”  Therese Fessenden, Nielsen Norman Group 2017

EXAMPLE: A TEST Using A Pop-Up Window

Pop-ups suck, everybody seems to agree. Here’s the little test I carried out on a subset of pages, an experiment to see if pop-ups work on this site to convert more visitors to subscribers.

I tested it out when I didn’t blog for a few months and traffic was very stable.

Results:

Testing Pop Up Windows Results

Pop Up Window    WK1 On    Wk2 Off    Total % Change
Mon              46        20         173%
Tue              48        23         109%
Wed              41        15         173%
Thu              48        23         109%
Fri              52        17         206%

That’s a fair increase in email subscribers across the board in this small experiment on this site. Using a pop-up does seem to have an immediate impact.

I have since tested it on and off for a few months and the results from the small test above have been repeated over and over.

I’ve tested different layouts and different calls to actions without pop-ups, and they work too, to some degree, but they typically take a bit longer to deploy than activating a plugin.

I don’t really like pop-ups, as they have been an impediment to web accessibility, but it’s stupid to dismiss out of hand any technique that works.

In my tests, using pop-ups really seemed to kill how many people shared a post in social media circles.

With Google now showing an interest in interstitials (especially on mobile versions of your site), I would be very nervous about employing a pop-up window that obscures the primary reason for visiting the page.

If Google detects any user dissatisfaction, this can be very bad news for your rankings.

QUOTE: “Interstitials that hide a significant amount of content provide a bad search experience” Google, 2015

I am, at the moment, using an exit pop-up window, as hopefully by the time a user sees this device they are FIRST satisfied with the content they came to read. I can recommend this as a way to increase your subscribers, with a similar conversion rate to standard pop-ups – if NOT BETTER.

I think it is sensible to convert customers without using techniques that potentially negatively impact Google rankings.

Do NOT let conversion get in the way of the PRIMARY reason a Google visitor is on ANY PARTICULAR PAGE, or you risk Google detecting relative dissatisfaction with your site – and that is not going to help you as Google gets better at working out what ‘quality’ actually means.

How To Fix Issues Found In the Google Ad Experience Report

Chances are, you will NOT receive a message in Search Console for this. I’ve happened across very few, even when I knew there were too many ads on pages and there probably should have been a message. I think the algorithm sorts this stuff out.

If you have a message from Google:

  • you will need to sign up for Google Search Console
  • review the Ad Experience report
  • if your site hasn’t been reviewed, or has passed review, the report won’t show anything
  • if your review status is ‘warning’ or ‘failing’, violations will be listed in the “what we found” column of the ad reviews report, based on a sample of pages from both desktop and mobile versions of your site
  • ‘if negative ad experiences are found they are listed separately in the report since a bad experience on mobile may not be as annoying on desktop’
  • Google will highlight ‘site design issues such as pop-ups or large sticky ads’ and rather cleverly will show you ‘a video of the ad that was flagged’
  • ‘creative issues are shown on your site through ad tags like flashing animated ads or autoplay videos with sound’
  • remove annoying ads from your site
  • submit your site for a review of your ad experience in Search Console

UX; Where You Place Ads On Your Page Can Impact Rankings

QUOTE: “For years, the user experience has been tarnished by irritating and intrusive ads. Thanks to extensive research by the Coalition for Better Ads, we now know which ad formats and experiences users find the most annoying. Working from this data, the Coalition has developed the Better Ads Standards, offering publishers and advertisers a road map for the formats and ad experiences to avoid.”  Kelsey LeBeau, Google 2019

I proceed thinking that conversion points on a page (Ads or CTAs) must be thought about as ads and placed in the most appropriate place on the page, following the principles Google has told us about.

I recommend ANY FUTURE CONVERSION PRINCIPLES added to your site should be checked against these guidelines:

ONLY place conversion points (CTAs – CALLS TO ACTION) in the GREEN places and BEWARE placing any CTA in any bad ad placement area, on mobile or desktop.

UX; Google has a ‘Page-Heavy’ Penalty Algorithm

Present your most compelling material above the fold at any resolution – Google also has a ‘Page Heavy Algorithm’ – In short, if you have too many ads on your page, or if paid advertising obfuscates copy or causes an otherwise frustrating user experience for Google’s visitors, your page can be demoted in SERPs:

QUOTE: ‘sites that don’t have much content “above-the-fold” can be affected.’ Matt Cutts, Google 2012

Keep an eye on where you put your ads, sponsored content or CTAs – get in the way of the main content of the page you are designing and you could see organic traffic decline.

UX; Google has an ‘Interstitial and Pop-Up‘ Penalty Algorithm

Bear in mind also that Google now (since January 2017) has an ‘Interstitial and Pop-Up’ penalty, so AVOID creating a marketing strategy that relies on this.

QUOTE: “Here are some examples of techniques that make content less accessible to a user:

(1) Showing a popup that covers the main content, either immediately after the user navigates to a page from the search results, or while they are looking through the page.

(2) Displaying a standalone interstitial that the user has to dismiss before accessing the main content.

(3) Using a layout where the above-the-fold portion of the page appears similar to a standalone interstitial, but the original content has been inlined underneath the fold.” Doantam Phan, Google 2016

EXIT POP-UPS (like the one I use on this site) do not seem to interfere with a reader’s enjoyment of, and access to, the primary content on a page. This type of pop-up seems to be OK for now (and does increase subscriber sign-ups, too). I have a feeling an exit pop-up might have a negative impact on some rankings, though; the jury is out.

UX; “Page Experience” Is The New “User Experience”

I think this is a good move to call this “page experience” and not confuse it with “user experience”:

QUOTE: “We will introduce a new signal that combines Core Web Vitals with our existing signals for page experience to provide a holistic picture of the quality of a user’s experience on a web page.” Sowmya Subramanian, Google 2020

This is an incredibly important move by Google. Optimise your core web vitals now.
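
One practical way to check Core Web Vitals is Google’s free PageSpeed Insights v5 API. A hedged sketch (the URL under test is a placeholder, and for regular use you would add an API key):

# Fetch lab Core Web Vitals metrics for a page via the PageSpeed Insights v5 API.
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
page = "https://www.example.com/"  # swap in your own URL

query = urllib.parse.urlencode({"url": page, "strategy": "mobile"})
with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as response:
    report = json.load(response)

audits = report["lighthouseResult"]["audits"]
for metric in ("largest-contentful-paint", "cumulative-layout-shift", "total-blocking-time"):
    print(metric, audits[metric]["displayValue"])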

Google has switched to a ‘Mobile First‘ Index

Now that Google is determined to focus on ranking sites based on their mobile experience, the time is upon businesses to REALLY focus on delivering the fastest and most accessible DESKTOP and MOBILE-friendly experiences they can achieve.

Because if you DO NOT, your competition will, and Google may rank those pages above your own, in time.

QUOTE: “To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices. If you have a responsive site or a dynamic serving site where the primary content and markup is equivalent across mobile and desktop, you shouldn’t have to change anything.” Doantam Phan, Google 2017

Do not think you can just ignore the desktop version of your site, even if you are now mobile first, according to Google Search Console.

QUOTE: “Google uses two different crawlers for crawling websites: a mobile crawler and a desktop crawler. Each crawler type simulates a user visiting your page with a device of that type. Google uses one crawler type (mobile or desktop) as the primary crawler for your site. All pages on your site that are crawled by Google are crawled using the primary crawler. The primary crawler for all new websites is the mobile crawler. In addition, Google recrawls a few pages on your site with the other crawler type (mobile or desktop). This is called the secondary crawl, and is done to see how well your site works with the other device type.” Google Webmaster Guidelines, 2020
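
You can get a rough feel for which Googlebot visits you most from your server logs. A sketch (the log path is a placeholder, and for a real audit you would also verify hits are genuine Googlebot, e.g. via reverse DNS):

# Count mobile vs desktop Googlebot hits in a standard access log.
# Googlebot smartphone user-agents contain "Mobile"; desktop ones do not.
from collections import Counter

counts = Counter()
with open("access.log") as log:  # placeholder path - use your own log file
    for line in log:
        if "Googlebot" in line:
            counts["mobile" if "Mobile" in line else "desktop"] += 1

print(counts)  # e.g. Counter({'mobile': 4310, 'desktop': 270})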

I recently vastly improved a site’s rankings by dealing with poor Ad/CTA placement issues across the DESKTOP version of a website.

What Are YMYL Pages?

QUOTE: “Some types of pages or topics could potentially impact a person’s future happiness, health, financial stability, or safety. We call such pages “Your Money or Your Life” pages, or YMYL.” Google Search Quality Evaluator Guidelines 2019

YMYL pages are, I think, what a lot of Google algorithms ultimately focus on dealing with.

Google classifies web pages that “potentially impact a person’s future happiness, health, financial stability, or safety” as “Your Money or Your Life” pages (YMYL) and holds these types of pages to higher standards than, for instance, hobby and informational sites.

Essentially, if you are selling something to visitors or advising on important matters like finance, law or medical advice – your page will be held to this higher standard.

YMYL is a classification of certain pages by Google where Google explains:

QUOTE: “News and current events: news about important topics such as international events, business, politics, science, technology, etc. Keep in mind that not all news articles are necessarily considered YMYL (e.g., sports, entertainment, and everyday lifestyle topics are generally not YMYL). Please use your judgment and knowledge of your locale. ● Civics, government, and law: information important to maintaining an informed citizenry, such as information about voting, government agencies, public institutions, social services, and legal issues (e.g., divorce, child custody, adoption, creating a will, etc.). ● Finance: financial advice or information regarding investments, taxes, retirement planning, loans, banking, or insurance, particularly webpages that allow people to make purchases or transfer money online. ● Shopping: information about or services related to research or purchase of goods/services, particularly webpages that allow people to make purchases online. ● Health and safety: advice or information about medical issues, drugs, hospitals, emergency preparedness, how dangerous an activity is, etc. ● Groups of people: information about or claims related to groups of people, including but not limited to those grouped on the basis of race or ethnic origin, religion, disability, age, nationality, veteran status, sexual orientation, gender or gender identity. ● Other: there are many other topics related to big decisions or important aspects of people’s lives which thus may be considered YMYL, such as fitness and nutrition, housing information, choosing a college, finding a job, etc.” Google Search Quality Evaluator Guidelines 2019

As soon as you have YMYL-intent pages on your site, Google will detect this, will treat the site differently and will treat all pages linking to those pages differently.

You have been warned!

What Is E.A.T.?

Google aims to rank pages where the author has some demonstrable expertise or experience in the subject-matter they are writing about. These ‘quality ratings’ (performed by human evaluators) are based on E.A.T. (or EAT or E-A-T), which is simply the ‘Expertise, Authoritativeness, Trustworthiness’ of the ‘Main Content’ of a page.

QUOTE: “Expertise, Authoritativeness, Trustworthiness: This is an important quality characteristic. …. Remember that the first step of PQ rating is to understand the true purpose of the page. Websites or pages without some sort of beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating. For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important..” Google Search Quality Evaluator Guidelines 2019

and

QUOTE: “Keep in mind that there are high E-A-T pages and websites of all types, even gossip websites, fashion websites, humor websites, forum and Q&A pages, etc. In fact, some types of information are found almost exclusively on forums and discussions, where a community of experts can provide valuable perspectives on specific topics. ● High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis. ● High E-A-T news articles should be produced with journalistic professionalism—they should contain factually accurate content presented in a way that helps users achieve a better understanding of events. High E-A-T news sources typically have published established editorial policies and robust review processes (example 1, example 2). ● High E-A-T information pages on scientific topics should be produced by people or organizations with appropriate scientific expertise and represent well-established scientific consensus on issues where such consensus exists. ● High E-A-T financial advice, legal advice, tax advice, etc., should come from trustworthy sources and be maintained and updated regularly. ● High E-A-T advice pages on topics such as home remodeling (which can cost thousands of dollars and impact your living situation) or advice on parenting issues (which can impact the future happiness of a family) should also come from “expert” or experienced sources that users can trust. ● High E-A-T pages on hobbies, such as photography or learning to play a guitar, also require expertise. Some topics require less formal expertise. Many people write extremely detailed, helpful reviews of products or restaurants. Many people share tips and life experiences on forums, blogs, etc. These ordinary people may be considered experts in topics where they have life experience. If it seems as if the person creating the content has the type and amount of life experience to make him or her an “expert” on the topic, we will value this “everyday expertise” and not penalize the person/webpage/website for not having “formal” education or training in the field.” Google Search Quality Evaluator Guidelines 2019

Who links to you can also inform the E-A-T of your website.

Consider this:

QUOTE: “I asked Gary (Illyes from Google) about E-A-T. He said it’s largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that’s good. He recommended reading the sections in the QRG on E-A-T as it outlines things well.” Marie Haynes, Pubcon 2018

Google is still a ‘link-based’ search engine ‘under-the-hood’ but it takes so much more to stick a website at the top of search engine results pages (SERPs) than it used to.

What is the PURPOSE of your page?

This is incredibly important:

QUOTE: “The purpose of a page is the reason or reasons why the page was created. Every page on the Internet is created for a purpose, or for multiple purposes. Most pages are created to be helpful for users, thus having a beneficial purpose. Some pages are created merely to make money, with little or no effort to help users. Some pages are even created to cause harm to users. The first step in understanding a page is figuring out its purpose.” Google Search Quality Evaluator Guidelines 2019

Is it to “sell products or services”, “to entertain” or “to share information about a topic”?

MAKE THE PURPOSE OF YOUR PAGE SINGULAR and OBVIOUS to help quality raters and algorithms.

The name of the game (if you’re not faking everything) is VISITOR SATISFACTION.

If a visitor lands on your page – are they satisfied and can they successfully complete WHY they are there? Are you trying to hard-sell visitors as soon as they land on your information-type page?

TIP: On information-type pages, test out different layouts where you place the content front and centre.

If A Page Exists Only To Make Money Online, The Page Is Spam, to Google

QUOTE: “If A Page Exists Only To Make Money, The Page Is Spam” Google Quality Rater Guide 2011

That statement above, in the original quality rater guidelines, is a standout and should be a heads-up to any webmaster out there who thinks they are going to make a “fast buck” from Google organic listings.

It should, at least, make you think about the types of pages you are going to spend your valuable time making.

Without VALUE ADD for Google’s users – don’t expect to rank high for commercial keywords.

If you are making a page today with the sole purpose of making money from it – and especially with free traffic from Google – you obviously didn’t get the memo.

Consider this statement from a manual reviewer:

QUOTE: “…when they DO get to the top, they have to be reviewed with a human eye in order to make sure the site has quality.” Google Human Rater, 2011

It’s worth remembering:

  • If A Page Exists Only To Make Money, The Page Is Spam
  • If A Site Exists Only To Make Money, The Site Is Spam

This is how what you make will be judged – whether it is fair to you or not.

QUOTE: “What makes a page spammy?: “Hidden text or links – may be exposed by selecting all page text and scrolling to the bottom (all text is highlighted), disabling CSS/Javascript, or viewing source code. Sneaky redirects – redirecting through several URLs, rotating destination domains cloaking with JavaScript redirects and 100% frame. Keyword stuffing – no percentage or keyword density given; this is up to the rater. PPC ads that only serve to make money, not help users. Copied/scraped content and PPC ads. Feeds with PPC ads. Doorway pages – multiple landing pages that all direct user to the same destination. Templates and other computer-generated pages mass-produced, marked by copied content and/or slight keyword variations. Copied message boards with no other page content. Fake search pages with PPC ads. Fake blogs with PPC ads, identified by copied/scraped or nonsensical spun content. Thin affiliate sites that only exist to make money, identified by checkout on a different domain, image properties showing origination at another URL, lack of original content, different WhoIs registrants of the two domains in question. Pure PPC pages with little to no content. Parked domains” Miranda Miller, SEW, 2011

It isn’t all bad news…

QUOTE: “An infinite number of niches are waiting for someone to claim them. I’d ask yourself where you want to be, and see if you can find a path from a tiny specific niche to a slightly bigger niche and so on, all the way to your desired goal. Sometimes it’s easier to take a series of smaller steps instead of jumping to your final goal in one leap.” Matt Cutts, Google 2006

Google does level the playing field in some areas, especially if you are willing to:

  • Differentiate yourself
  • Be Remarkable
  • Be accessible
  • Add unique content to your site
  • Help users in an original way
  • Not over-emphasise conversion-rate practices on information type content

Google doesn’t care about the vast majority of websites but the search engine giant DOES care about HELPING ITS OWN USERS.

So, if you are helping visitors that come from Google – and not by just directing them to another website – you are probably doing one thing right at least.

With this in mind – I am already building affiliate pages differently, for instance.

In my experience, INFORMATION-TYPE pages are treated differently from (even) INFORMATION-RICH YMYL pages, simply because of the obvious intent to bait-and-switch or monetise free traffic from Google.

I proceed thinking monetising a page effectively turns it into a YMYL-Light page (which means you had better have a lot of E.A.T. to make it fly).

Doorway Pages Are Spam

Google algorithms consistently target sites with doorway pages in quality algorithm updates. The definition of a “doorway page” can change over time.

For example, in the image below (from 2011), all pages on the site seemed to be hit with a -50+ ranking penalty for every keyword phrase the website ranked for.

At first Google rankings for commercial keyword phrases collapsed which led to somewhat of a “traffic apocalypse“:

Graph: Google ‘detected doorway pages’ penalty – site-wide collapse in rankings

The webmaster then received an email from Google via Google Webmaster Tools (now called Google Search Console):

QUOTE: “Google Webmaster Tools notice of detected doorway pages on xxxxxxxx – Dear site owner or webmaster of xxxxxxxx, We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines. Specifically, your site may have what we consider to be doorway pages – groups of “cookie cutter” or low-quality pages…. We believe that doorway pages typically create a frustrating user experience, and we encourage you to correct or remove any pages that violate our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.” Google Search Quality Team, 2011

At the time, I didn’t immediately class the pages on the affected sites in question as doorway pages. It’s evident Google’s definition of a doorway page changes over time.

And it has! Google has been very clear in recent years.

A lot of people do not realise they are building what Google classes as doorway pages, and it was indicative to me that what you intend to do with the traffic Google sends you may, in itself, be a ranking factor not too often talked about.

What Does Google Classify As Doorway Pages?

Google classes many types of pages as doorway pages.

Doorway pages can be thought of as lots of pages on a website designed to rank for very specific keywords using minimal original text content, e.g. location pages often end up looking like doorway pages.

In the recent past, location-based SERPs were often lower-quality, and so Google historically ranked location-based doorway pages in many instances.

There is some confusion for real businesses who THINK they SHOULD rank for specific locations where they are not geographically based and end up using doorway-type pages to rank for these locations.

What Google Says About Doorway Pages

Google said a few years ago:

QUOTE: “For example, searchers might get a list of results that all go to the same site. So if a user clicks on one result, doesn’t like it, and then tries the next result in the search results page and is taken to that same site that they didn’t like, that’s a really frustrating experience.” Brian White, Google 2015

A question about using content spread across multiple pages and targeting different geographic locations on the same site was asked in a recent Hangout with Google’s John Mueller:

QUOTE: “We are a health services comparison website…… so you can imagine that for the majority of those pages the content that will be presented in terms of the clinics that will be listed looking fairly similar right and the same I think holds true if you look at it from the location …… we’re conscious that this causes some kind of content duplication so the question is is this type … to worry about? ” Webmaster Question, 2017

Bearing in mind that (while it is not the optimal use of pages) Google does not ‘penalise’ a website for duplicating content across internal pages in a non-malicious way, John’s clarification of location-based pages on a site targeting different regions is worth noting:

QUOTE: “For the most part it should be fine I think the the tricky part that you need to be careful about is more around doorway pages in the sense that if all of these pages end up with the same business then that can look a lot like a doorway page but like just focusing on the content duplication part that’s something that for the most part is fine what will happen there is will index all of these pages separately because from  from a kind of holistic point of view these pages are unique they have unique content on them they might have like chunks of text on them which are duplicated but on their own these pages are unique so we’ll index them separately and in the search results when someone is searching for something generic and we don’t know which of these pages are the best ones we’ll pick one of these pages and show that to the user and filter out the other variations of that that page so for example if someone in Ireland is just looking for dental bridges and you have a bunch of different pages for different kind of clinics that offer the service and probably will pick one of those pages and show those in the search results and filter out the other ones.

But essentially the idea there is that this is a good representative of the the content from your website and that’s all that we would show to users on the other hand if someone is specifically looking for let’s say dental bridges in Dublin then we’d be able to show the appropriate clinic that you have on your website that matches that a little bit better so we’d know dental bridges is something that you have a lot on your website and Dublin is something that’s unique to this specific page so we’d be able to pull that out and to show that to the user like that so from a pure content duplication point of view that’s not really something I totally worry about.

I think it makes sense to have unique content as much as possible on these pages but it’s not not going to like sync the whole website if you don’t do that we don’t penalize a website for having this kind of deep duplicate content and kind of going back to the first thing though with regards to doorway pages that is something I definitely look into to make sure that you’re not running into that so in particular if this is like all going to the same clinic and you’re creating all of these different landing pages that are essentially just funneling everyone to the same clinic then that could be seen as a doorway page or a set of doorway pages on our side and it could happen that the web spam team looks at that and says this is this is not okay you’re just trying to rank for all of these different variations of the keywords and the pages themselves are essentially all the same and they might go there and say we need to take a manual action and remove all these pages from search so that’s kind of one thing to watch out for in the sense that if they are all going to the same clinic then probably it makes sense to create some kind of a summary page instead whereas if these are going to two different businesses then of course that’s kind of a different situation it’s not it’s not a doorway page situation.” John Mueller, Google 2017

The takeaway here is that if you have LOTS of location pages serving ONE SINGLE business in one location, then those are very probably classed as some sort of doorway pages, and old-school techniques for these types of pages will probably see them classed as lower-quality – or even – spammy pages.

Google has long warned webmasters about using doorway pages but many sites still employ them because either:

  • their business model depends on it for lead generation
  • the alternative is either a lot of work or
  • they are not creative enough or
  • they are not experienced enough to avoid the pitfalls of having lower-quality doorway pages on a site or
  • they are not experienced enough to understand what impact they might be having on any kind of site quality score

Google has a doorway page algorithm which no doubt they constantly improve upon. Google warned:

QUOTE: “Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.” Google, 2015

If you have location pages that serve multiple locations or businesses, then those are not doorway pages and should be improved uniquely to rank better, according to John’s advice.

Are You Making Doorway Pages?

Search Engine Land offered this clarification from Google:

QUOTE: “How do you know if your web pages are classified as a “doorway page?” Google said to ask yourself these questions:

  • Is the purpose to optimise for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site’s user experience?
  • Are the pages intended to rank on generic terms yet the content presented on the page is very specific?
  • Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
  • Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
  • Do these pages exist as an “island?” Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?” Barry Schwartz, 2015

Also:

QUOTE: “Well, a doorway page would be if you have a large collection of pages where you’re just like tweaking the keywords on those pages for that.

I think if you focus on like a clear purpose for the page that’s outside of just I want to rank for this specific variation of the keyword then that’s that’s usually something that leads to a reasonable result.

Whereas if you’re just taking a list of keywords and saying I need to make pages for each of these keywords and each of the permutations that might be for like two or three of those keywords then that’s just creating pages for the sake of keywords which is essentially what we look at as a doorway.” Barry Schwartz, 2015

Note I highlighted the following statement:

QUOTE: “focus on like a clear purpose for the page that’s outside of just I want to rank for this specific variation of the keyword.”

That is because sometimes – often, in fact – there is an alternative to doorway pages for locations that achieves essentially the same thing for webmasters.

Naturally, business owners want to rank for lots of keywords in organic listings with their website. The challenge for webmasters is that Google doesn’t want business owners to rank for lots of keywords using auto-generated content especially when that produces A LOT of pages on a website using (for instance) a list of keyword variations page-to-page.

QUOTE: “7.4.3 Automatically-Generated Main Content: Entire websites may be created by designing a basic template from which hundreds or thousands of pages are created, sometimes using content from freely available sources (such as an RSS feed or API). These pages are created with no or very little time, effort, or expertise, and also have no editing or manual curation. Pages and websites made up of auto-generated content with no editing or manual curation, and no original content or value added for users, should be rated Lowest.” Google Search Quality Evaluator Guidelines, 2017

The end result is webmasters create doorway pages without even properly understanding what they represent to Google, and without realising Google will not index all these auto-generated pages:

QUOTE: “Doorway pages (bridge pages, portal pages, jump pages, gateway pages or entry pages) are web pages that are created for the deliberate manipulation of search engine indexes (spamdexing).” Wikipedia, 2020

Also:

QUOTE: “In digital marketing and online advertising, spamdexing (web spam)…. is the deliberate manipulation of search engine indexes” Wikipedia, 2020

It is interesting to note that Wikipedia might make clear distinctions between what a doorway page is, what spamdexing is and what a landing page is…

QUOTE: “Landing pages are regularly misconstrued to equate to Doorway pages within the literature. The former are content rich pages to which traffic is directed within the context of pay-per-click campaigns…” Wikipedia, 2020

For me, Google blurs that line here.

Google has declared:

QUOTE: “Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.

Here are some examples of doorways:

  • Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
  • Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
  • Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy”
    Google Webmaster Guidelines, 2020

I’ve made bold:

QUOTE: “Doorways are sites or pages created to rank highly for specific search queries” Google Webmaster Guidelines, 2020

Take note: It is not just location pages that are classed as doorway pages:

QUOTE: “For Google, that’s probably overdoing it and ends up in a situation you basically create a doorway site …. with pages of low value…. that target one specific query.” John Mueller 2018

If your website is made up of lower-quality doorway-type pages using old techniques (which are more and more labelled as spam) then Google will not index all of the pages, and your website ‘quality score’ is probably going to be negatively impacted.

Google’s John Mueller advised someone that:

QUOTE: “he should not go ahead and build out 1,300 city based landing pages, with the strategy of trying to rank for your keyword phrase + city name. He said that would be a doorway page and against Google’s guidelines.” Barry Schwartz, 2019

Summary:

If you are making keyword-rich location pages for a single business website, there’s a risk these pages will be classed as doorway pages.

If you know you have VERY low-quality doorway pages on your site, you should remove them or rethink your entire strategy if you want to rank high in Google for the long term.

Location-based pages are suitable for some kinds of websites, and not others.

What Does Google Mean By “Low-Quality“?

Google has a history of classifying your site as some type of entity, and whatever that is, you don’t want a low-quality label on it – put there by algorithm or human. Manual evaluators might not directly impact your rankings, but any signal associated with Google marking your site as low-quality should probably be avoided.

If you are making websites to rank in Google without unnatural practices, you are going to have to meet Google’s expectations in the Quality Raters Guidelines.

Google says:

QUOTE: “Low-quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well.” Google Search Quality Evaluator Guidelines 2019

‘Sufficient Reason’

There is ‘sufficient reason’ in some cases to immediately mark the page down in some areas, and Google directs quality raters to do so:

  • QUOTE: “An unsatisfying amount of MC is a sufficient reason to give a page a Low-quality rating.
  • QUOTE: “Low-quality MC is a sufficient reason to give a page a Low-quality rating.
  • QUOTE: “Lacking appropriate E-A-T is sufficient reason to give a page a Low-quality rating.
  • QUOTE: “Negative reputation is sufficient reason to give a page a Low-quality rating.

What are low-quality pages?

When it comes to defining what a low-quality page is, Google is evidently VERY interested in the quality of the Main Content (MC) of a page:

Main Content (MC)

Google says MC should be the ‘main reason a page exists’.

  • QUOTE: “The quality of the MC is low.
  • QUOTE: “There is an unsatisfying amount of MC for the purpose of the page.
  • QUOTE: “There is an unsatisfying amount of website information.

POOR MC & POOR USER EXPERIENCE

  • QUOTE: “This content has many problems: poor spelling and grammar, complete lack of editing, inaccurate information. The poor quality of the MC is a reason for the Lowest+ to Low rating. In addition, the popover ads (the words that are double underlined in blue) can make the main content difficult to read, resulting in a poor user experience.
  • QUOTE: “Pages that provide a poor user experience….. should also receive low ratings, even if they have some images appropriate for the query.

DESIGN FOCUS NOT ON MC

  • QUOTE: “If a page seems poorly designed, take a good look. Ask yourself if the page was deliberately designed to draw attention away from the MC. If so, the Low rating is appropriate.
  • QUOTE: “The page design is lacking. For example, the page layout or use of space distracts from the MC, making it difficult to use the MC.

MC LACK OF AUTHOR EXPERTISE

  • QUOTE: “You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.
  • QUOTE: “There is no evidence that the author has medical expertise. Because this is a YMYL medical article, lacking expertise is a reason for a Low rating.
  • QUOTE: “The author of the page or website does not have enough expertise for the topic of the page and/or the website is not trustworthy or authoritative for the topic. In other words, the page/website is lacking E-A-T.

After page content, the following are given the most weight in determining if you have a high-quality page.

POOR SECONDARY CONTENT

  • QUOTE: “Unhelpful or distracting SC that benefits the website rather than helping the user is a reason for a Low rating.
  • QUOTE: “The SC is distracting or unhelpful for the purpose of the page.
  • QUOTE: “The page is lacking helpful SC.
  • QUOTE: “For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.

DISTRACTING ADVERTISEMENTS

  • QUOTE: “For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells bathing suits, however, an extremely distracting and graphic porn ad may warrant a Low rating.

GOOD HOUSEKEEPING

  • QUOTE: “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.
  • QUOTE: “The website is lacking maintenance and updates.

SERP SENTIMENT & NEGATIVE REVIEWS

  • QUOTE: “Credible negative (though not malicious or financially fraudulent) reputation is a reason for a Low rating, especially for a YMYL page.
  • QUOTE: “The website has a negative reputation.

LOWEST RATING

When it comes to Google assigning your page the lowest rating, you are probably going to have to go some way to hit this, but it shows the direction you want to avoid at all costs.

Google says throughout the document that there are certain pages that…

QUOTE: “should always receive the Lowest rating” Google, 2017

…and these are presented below. Note – these statements are spread throughout the raters’ document and not listed the way I have listed them here. I don’t think any context is lost presenting them like this, and it makes it more digestible.

Anyone familiar with Google Webmaster Guidelines will be familiar with most of the following:

  • True lack of purpose pages or websites.
    • Sometimes it is difficult to determine the real purpose of a page.
  • Pages on YMYL websites with completely inadequate or no website information.
  • Pages or websites that are created to make money with little to no attempt to help users.
  • Pages with extremely low or lowest quality MC.
    • If a page is deliberately created with no MC, use the Lowest rating. Why would a page exist without MC? Pages with no MC are usually lack of purpose pages or deceptive pages.
    • Webpages that are deliberately created with a bare minimum of MC, or with MC which is completely unhelpful for the purpose of the page, should be considered to have no MC
    • Pages deliberately created with no MC should be rated Lowest.
    • Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
  • Pages on abandoned, hacked, or defaced websites.
  • Pages or websites created with no expertise or pages that are highly untrustworthy, unreliable, unauthoritative, inaccurate, or misleading.
  • Harmful or malicious pages or websites.
    • Websites that have extremely negative or malicious reputations. Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, Lowest+ may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: Negative reputation is sufficient reason to give a page a Low quality rating. Evidence of truly malicious or fraudulent behavior warrants the Lowest rating.
    • Deceptive pages or websites. Deceptive webpages appear to have a helpful purpose (the stated purpose), but are actually created for some other reason. Use the Lowest rating if a webpage is deliberately created to deceive and potentially harm users in order to benefit the website.
    • Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.
    • Sometimes, pages just don’t “feel” trustworthy. Use the Lowest rating for any of the following: Pages or websites that you strongly suspect are scams
    • Pages that ask for personal information without a legitimate reason (for example, pages which ask for name, birthdate, address, bank account, government ID number, etc.). Websites that “phish” for passwords to Facebook, Gmail, or other popular online services. Pages with suspicious download links, which may be malware.
  • Use the Lowest rating for websites with extremely negative reputations.

Websites ‘Lacking Care and Maintenance’ Are Rated ‘Low Quality’.

QUOTE: “Sometimes a website may seem a little neglected: links may be broken, images may not load, and content may feel stale or out-dated. If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.”

“Broken” or Non-Functioning Pages Classed As Low Quality

Google gives clear advice on creating useful 404 pages (a quick check for the last point is sketched after the list):

  1. QUOTE: “Tell visitors clearly that the page they’re looking for can’t be found
  2. QUOTE: “Use language that is friendly and inviting
  3. QUOTE: “Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site.
  4. QUOTE: “Consider adding links to your most popular articles or posts, as well as a link to your site’s home page.
  5. QUOTE: “Think about providing a way for users to report a broken link.
  6. QUOTE: “Make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested
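
On that last point, a surprising number of websites return a ‘soft 404’ for missing pages – a 200 OK status with a ‘page not found’ message – which Google Search Console reports as an error. Below is a minimal sketch to verify your server returns a real 404 status code (in Python, using the third-party requests library; the domain is a hypothetical placeholder):

    import uuid
    import requests

    def returns_real_404(base_url: str) -> bool:
        """Request a random path that should not exist and check the HTTP status code."""
        probe = base_url.rstrip("/") + "/" + uuid.uuid4().hex
        response = requests.get(probe, timeout=10)
        return response.status_code == 404

    print(returns_real_404("https://www.example.com"))  # hypothetical domain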

What Are The Low-Quality Signals Google Looks For?

QUOTE: “Low quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well. These pages lack expertise or are not very trustworthy/authoritative for the purpose of the page.” Google Quality Evaluator Guidelines, 2017

These include but are not limited to:

  1. Lots of spammy comments
  2. Low-quality content that lacks E-A-T signals (Expertise + Authoritativeness + Trustworthiness)
  3. NO Added Value for users
  4. Poor page design
  5. Malicious harmful or deceptive practices detected
  6. Negative reputation
  7. Auto-generated content
  8. No website contact information
  9. Fakery or INACCURATE information
  10. Untrustworthy
  11. Website not maintained
  12. Pages just created to link to others
  13. Pages lack purpose
  14. Keyword stuffing
  15. Inadequate customer service pages
  16. Sites that use practices Google doesn’t want you to use

Pages can get a neutral rating too.

Pages that have “nothing wrong, but nothing special” about them don’t “display characteristics associated with a High rating”, and that puts you in the middle ground – probably not a sensible place to be a year or so down the line.

Pages Can Be Rated ‘Medium Quality’, Too.

QUOTE: “Medium pages achieve their purpose and have neither high nor low expertise, authoritativeness, and trustworthiness. However, Medium pages lack the characteristics that would support a higher quality rating. Occasionally, you will find a page with a mix of high and low quality characteristics. In those cases, the best page quality rating may be Medium.” Google Quality Evaluator Guidelines, 2017

Quality raters will rate content with a Medium rating when the author or entity responsible for it is unknown.

If you have multiple editors contributing to your site, you had better have a HIGH EDITORIAL STANDARD.

One could take from all this that Google Quality raters are out to get you if you manage to get past the algorithms, but equally, Google quality raters could be friends you just haven’t met yet.

Somebody must be getting rated highly, right?

Impress a Google Quality rater and get a high rating.

If you are a spammer you’ll be pulling out the stops to fake this, naturally, but this is a chance for real businesses to put their best foot forward and HELP quality raters correctly judge the size and relative quality of your business and website.

Real reputation is hard to fake – so if you have it – make sure it’s on your website and is EASY to access from contact and about pages.

The quality raters handbook is a good training guide for looking for links to disavow, too.

It’s pretty clear.

Google organic listings are reserved for ‘remarkable’ and ‘reputable’ content, expertise and trusted businesses.

A high bar to meet – and one that is designed for you to never quite meet unless you are serious about competing, as there is so much work involved.

I think the inferred message is to call your Adwords rep if you are an unremarkable business.

Thin Content Can Still Rank In Google, Sometimes

Ranking top depends on the query and level of competition for the query.

Google’s high-quality recommendations are often for specific niches and specific searches as most of the web would not meet the very highest requirements.

Generally speaking – real quality will stand out, in any niche with a lack of it, at the moment.

The time it takes for this to happen (at Google’s end) leaves a lot to be desired in some niches and time is something Google has an almost infinite supply of compared to 99% of the businesses on the planet.

You do not need a lot of text to rank in Google for most keywords.

Google Is Not Going To Rank Low-Quality Pages When It Has Better Options

QUOTE: “Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly.” Google, 2016

If you have exact match instances of key-phrases on low-quality pages, mostly these pages won’t have all the compound ingredients it takes to rank high in Google.

Google has many quality algorithms these days that demote content and websites that Google deems as providing a lower-quality user experience.

I was working on this long before I understood it well enough to write anything about it.

Here is an example of taking a standard page that did not rank for years and then turning it into a topic-oriented resource page designed to fully meet a user’s intent:

Graph: Example traffic levels to a properly optimised page

Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.

Google has algorithms that target low-quality content and these algorithms (I surmise) are actually trained in some part by human quality raters.

Quality Raters Do Not Directly Impact YOUR site

QUOTE: “Ratings from evaluators do not determine individual site rankings.” GOOGLE

While Google is on record as stating these quality raters do not directly influence where you rank (without more senior analysts making a call on the quality of your website, I presume?) – there are some things in this document, mostly of a user experience nature (UX) that webmasters should note going forward.

From my own experience, an unsatisfying page-experience signal can impact rankings even on a REPUTABLE DOMAIN and even with SOUND, SATISFYING CONTENT.

It is easy to imagine all these quality ratings are getting passed along to the engineers at Google in some form (at some stage) to improve future algorithms – and identify borderline cases. This is the ‘quality’ bar I’ve mentioned a couple of times in past posts. Google is always raising the bar – always adding new signals, sometimes, in time, taking signals away.

It helps Google:

  1. satisfy their users
  2. control the bulk of transactional web traffic.

That positioning has always been a win-win for Google – and a recognisable strategy from them after all these years.

Take unnatural links out of the equation (which have a history of trumping most other signals) and you are left with page level, site level and off-site signals.

All of these quality signals will need to be addressed to insulate against “Google Panda” (if that can ever be fully successful, against an algorithm that is modified to periodically “ask questions” of your site and overall quality score).

Google holds different types of sites to different standards for different kinds of keywords which would suggest not all websites need all signals satisfied to rank well in SERPs – not ALL THE TIME.

OBSERVATION – You can have the content and the links – but if your site falls short on even a single user satisfaction signal (even if it is picked up by the algorithm, and not a human reviewer) then your rankings for particular terms could collapse – OR – rankings can be held back – IF Google thinks your organisation, with its resources or ‘reputation’, should be delivering a better user experience to users.

OBSERVATION: In the past, a site often rose (in terms of traffic numbers) in Google, before a Panda ‘penalty’. It may be the case (and I surmise this) that the introduction of a certain change initially artificially raised your rankings for your pages in a way that Google’s algorithms do not approve of, and once that problem is spread out throughout your site, traffic begins to deteriorate or is slammed in a future algorithm update.

Google says about the Search Quality Evaluator Guidelines:

QUOTE: “Many websites are eager to tell users how great they are. Some webmasters have read these rating guidelines and write “reviews” on various review websites. But for Page Quality rating, you must also look for outside, independent reputation information about the website. When the website says one thing about itself, but reputable external sources disagree with what the website says, trust the external sources. ” Google Search Quality Evaluator Guidelines 2019

Surely that’s NOT a bad thing – to make your site HIGHER QUALITY and correctly MARKET your business to customers – and search quality raters, in the process.

Black Hats will obviously fake all that (which is why it would be self-defeating of me to publish a handy list of signals to manipulate SERPs that’s not just “unnatural links”).

Businesses that care about their performance in Google organic listings should be noting ALL the following points very carefully.

This isn’t about manipulating quality Raters – it is about making it EASY for them to realise you are a REAL business, with a GOOD REPUTATION, and have a LEVEL of EXPERTISE you wish to share with the world.

The aim is to create a good user experience, not fake it – just as the aim with link building is not to make your links look natural, but for them to BE natural.

The Quality Needed To Rank Always Rises

It’s important to note your website quality is often judged on the quality of competing pages for this keyword phrase. This is still a horse race. A lot of this is all RELATIVE to what YOUR COMPETITION are doing.

How relative? Big sites v small sites? Sites with a lot of links v not a lot of links? Big companies with a lot of staff v small companies with a few staff? Do sites at the top of Google get asked more of? Algorithmically and manually? Just…. because they are at the top? Definitely!

Whether it’s algorithmic or manual – based on technical, architectural, reputation or content signals – Google can decide and will decide if your site meets its quality requirements to rank on page one.

The likelihood of you ranking stable at number one is almost non-existent in any competitive niche where you have more than a few players aiming to rank number one.

Not en-masse, not unless you are bending the rules.

My own strategy for visibility over the last few years has been to avoid focusing entirely on ranking for particular keywords and rather improve the search experience of my entire website.

QUOTE: “Building a strong site architecture and providing clear navigation will help search engines index your site quickly and easily. This will also, more importantly, provide visitors with a good experience of using your site and encourage repeat visits. It’s worth considering that Google is increasingly paying attention to user experience.” Search Engine Watch, 2016

The entire budget of my time went on content improvement, content re-organisation, website architecture improvement, and lately, mobile experience improvement and CTA (Call-To-Action) optimisation.

I have technical improvements to core web vitals including speed, usability and accessibility in the pipeline.

In simple terms, I took thin content and made it fat to make old content perform better and defend rankings.

Unsurprisingly, ranking fat content comes with its own challenges as the years go by.

Website Reputation Matters

You can help quality raters EASILY research the reputation of your website especially if you have any positive history.

Make “reputation information about the website” easy to access for a quality rater, as judging the reputation of your website is a large part of what they do.

You will need to monitor, or influence, ‘independent’ reviews about your business – because if reviews are particularly negative – Google will “trust the independent sources”.

Consider a page that highlights your good press, if you have any.

  • Google will consider “positive user reviews as evidence of positive reputation” so come up with a way to get legitimate positive reviews – and Google itself would be a good place to start.
  • Google states, “News articles, Wikipedia articles, blog posts, magazine articles, forum discussions, and ratings from independent organizations can all be sources of reputation information” but they also state that boasts about a lot of internet traffic, for example, should not influence the quality rating of a web page. What should influence the reputation of a page is WHO has shared it on social media etc. rather than just raw numbers of shares. CONSIDER CREATING A PAGE with nofollow links to good reviews on other websites as proof of excellence.
  • Google wants quality raters to examine subpages of your site and often “the URL of its associated homepage” so ensure your homepage is modern, up to date, informative and largely ON TOPIC with your internal pages.
  • Google wants to know a few things about your website, including:
    • Who is moderating the content on the site
    • Who is responsible for the website
    • Who owns copyright of the content
    • Business details (which is important to have synced and accurate across important social media profiles)
    • When was this content updated?
  • Be careful syndicating other people’s content. Algorithmic duplicate problems aside… if there is a problem with that content, Google will hold the site it finds content on as ‘responsible’ for that content.
  • If you take money online, in any way, you NEED to have an accessible and satisfying ‘customer service’ type page. Google says, “Contact information and customer service information are extremely important for websites that handle money, such as stores, banks, credit card companies, etc. Users need a way to ask questions or get help when a problem occurs. For shopping websites, we’ll ask you to do some special checks. Look for contact information—including the store’s policies on payment, exchanges, and returns. “ Google urges quality raters to be a ‘detective’ in finding this information about you – so it must be important to them.
  • Keep web pages updated regularly and let users know when the content was last updated. Google wants raters to “search for evidence that effort is being made to keep the website up to date and running smoothly.
  • Google quality raters are trained to be sceptical of any reviews found. It’s normal for all businesses to have mixed reviews, but “Credible, convincing reports of fraud and financial wrongdoing is evidence of extremely negative reputation“.
  • Google asks quality raters to investigate your reputation by searching for it, giving the example [“ibm.com” reviews –site:ibm.com]: “A search on Google for reviews of “ibm.com” which excludes pages on ibm.com.” So do that search yourself and judge for yourself what your reputation is. Very low ratings on independent websites could play a factor in where you rank in the future, with Google stating clearly it considers “very low ratings on the BBB site to be evidence for a negative reputation”. Other sites mentioned to review your business include YELP and Amazon. Often – using rich snippets containing schema.org information – you can get Google to display user ratings in the actual SERPs (a sketch of this markup follows this list). I noted you can get ‘stars in SERPs’ within two days after I added the code (March 2014).
  • If you can get a Wikipedia page – get one! Keep it updated too. For the rest of us, we’ll just need to work harder to prove we are a real business that has earned its rankings.
  • If you have a lot of NEGATIVE reviews – expect to be treated as a business with an “Extremely negative reputation” – and back in 2013 – Google mentioned they had an algorithm for this, too. Google has said the odd bad review is not what this algorithm looks for, as bad reviews are a natural part of the web.
  • For quality raters, Google has a Page Quality Rating Scale with 5 rating options on a spectrum of “Lowest, Low, Medium, High, and Highest.”
  • Google says “High-quality pages are satisfying and achieve their purpose well” and have lots of “satisfying” content written by an expert or authority in their field – they go on to include “About Us information” pages, and easy to access “Contact or Customer Service information”, etc.
  • Google is looking for a “website that is well cared for and maintained” so you need to keep content management systems updated, check for broken image links and HTML links. If you create a frustrating user experience through sloppy website maintenance – expect that to be reflected in some way with a lower quality rating. Google Panda October 2014 went for e-commerce pages that were optimised ‘the old way’ and are now classed as ‘thin content’.
  • Google wants raters to navigate your site and ‘test’ it out to see if it is working. They tell raters to check your shopping cart function is working properly, for instance.
  • Google expects pages to “be edited, reviewed, and updated on a regular basis” especially if they are for important issues like medical information, and states not all pages are held to such standards, but one can expect that Google wants information updated in a reasonable timescale. How reasonable this is, is dependent on the TOPIC and the PURPOSE of the web page RELATIVE to competing pages on the web.
  • Google wants to rank pages by expert authors, not from content farms.
  • You can’t have a great piece of content on a site with a negative reputation and expect it to perform well. A “High rating cannot be used for any website that has a convincing negative reputation.”
  • A very positive reputation can lift your content from “medium” to “high-quality“.
  • Google doesn’t care about ‘pretty‘ over substance and clearly instructs raters to “not rate based on how “nice” the page looks“.
  • Just about every webpage should have a CLEAR way to contact the site manager to achieve a high rating.
  • Highlighting ads in your design is BAD practice, and Google gives clear advice to rate the page LOW – Google wants you to optimise for A SATISFYING EXPERIENCE FIRST, CONVERSION SECOND!
  • Good news for many industries! Google clearly states, “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted”, although it does stipulate again it’s horses for courses… if everybody else is crap, then you’ll still fly – not many of those SERPs about these days.
  • If your intent is to deceive, be malicious or present pages with no purpose other than to monetise free traffic with no value add – Google is not your friend.
  • Domains that are ‘related’ in Whois can lead to a low-quality score, so be careful how you direct people around multiple sites you own.
  • Keyword stuffing your pages is not recommended, even if you do get past the algorithms.
  • Quality raters are on the lookout for content that is “copied with minimal alteration” and crediting the original source is not a way to get around this. Google rates this type of activity low-quality.
  • How can Google trust a page if it is blocked from it or from reading critical elements that make up that page? Be VERY careful blocking Google from important directories (blocking CSS and .js files is very risky these days). REVIEW your ROBOTS.txt and know exactly what you are blocking and why you are blocking it (a quick sanity check is sketched below).
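
On the robots.txt point, Python’s standard library can sanity-check whether Googlebot is blocked from your CSS and JavaScript before you find out the hard way. A minimal sketch, with a hypothetical domain and resource paths:

    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")  # hypothetical domain
    parser.read()

    for resource in ("/assets/site.css", "/assets/site.js"):  # hypothetical paths
        url = "https://www.example.com" + resource
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
        print(resource, verdict)

And on the ‘stars in SERPs’ point: review stars come from structured data. The sketch below emits schema.org AggregateRating markup as JSON-LD (the product name and figures are placeholders, not a recommendation); remember Google only shows rich results where the markup reflects genuine, visible on-page ratings.

    import json

    markup = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Product",  # placeholder
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.6",  # placeholder figures
            "reviewCount": "120",
        },
    }

    # Paste the printed <script> block into the page's HTML.
    print('<script type="application/ld+json">')
    print(json.dumps(markup, indent=2))
    print("</script>")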

What Are The High-Quality Characteristics of a Web Page?

QUOTE: “High quality pages exist for almost any beneficial purpose, from giving information to making people laugh to expressing oneself artistically to purchasing products or services online. What makes a High quality page? A High quality page should have a beneficial purpose and achieve that purpose well.” Google Search Quality Evaluator Guidelines, 2019

The following are examples of what Google calls ‘high-quality characteristics’ of a page and should be remembered:

  • “A satisfying or comprehensive amount of very high-quality” main content (MC)
  • Copyright notifications up to date
  • Functional page design
  • Page author has Topical Authority
  • High-Quality Main Content
  • Positive Reputation or expertise of website or author (Google yourself)
  • Very helpful SUPPLEMENTARY content “which improves the user experience”
  • Trustworthy
  • Google wants to reward ‘expertise’ and ‘everyday expertise’ or experience so you need to make this clear (perhaps using an Author Box or some other widget)
  • Accurate information
  • Ads can be at the top of your page as long as they do not distract from the main content on the page
  • Highly satisfying website contact information
  • Customised and very helpful 404 error pages
  • Awards
  • Evidence of expertise
  • Attention to detail

If Google can detect investment in time and labour on your site – there are indications that they will reward you for this (or at least – you won’t be affected when others are, meaning you rise in Google SERPs when others fall).

What Characteristics Do The Highest Quality Pages Exhibit?

QUOTE: “The quality of the MC is one of the most important criteria in Page Quality rating, and informs the E-A-T of the page. For all types of webpages, creating high quality MC takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. For news articles and information pages, high quality MC must be factually accurate for the topic and must be supported by expert consensus where such consensus exists.” Google Search Quality Evaluator Guidelines, 2019

You obviously want the highest quality ‘score’ possible but looking at the Search Quality Evaluator Guidelines that is a lot of work to achieve.

Google wants to rate you on the effort you put into your website, and how satisfying a visit is to your pages.

  1. QUOTE: “Very high or highest quality MC, with demonstrated expertise, talent, and/or skill.”
  2. QUOTE: “Very high level of expertise, authoritativeness, and trustworthiness (page and website) on the topic of the page.”
  3. QUOTE: “Very good reputation (website or author) on the topic of the page.”

At least for competitive niches where Google intends to police this quality recommendation, Google wants to reward high-quality pages and “the Highest rating may be justified for pages with a satisfying or comprehensive amount of very high-quality” main content.

If your main content is very poor, with “grammar, spelling, capitalization, and punctuation errors“, or not helpful or trustworthy – ANYTHING that can be interpreted as a bad user experience – you can expect to get a low rating.

QUOTE: “The quality of the MC is an important consideration for PQ rating. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well…. Important: The Low rating should be used if the page has Low quality MC. ” Google Search Quality Evaluator Guidelines, 2019

Note – not ALL “content-light” pages are automatically classed low-quality.

If you can satisfy the user with a page “thin” on text content – you are ok (but probably susceptible to someone building a better page than yours, more easily, I’d say).

Google expects more from big brands than they do from a smaller store (but that does not mean you shouldn’t be aiming to meet ALL these high-quality guidelines above).

Note too, that if you violate Google Webmaster recommendations for performance in their indexes of the web – you automatically get a low-quality rating.

If your page has a sloppy design, low-quality main content and too many distracting ads your rankings are very probably going to take a nose-dive.

If a Search Quality Evaluator is subject to a sneaky redirect they are instructed to rate your site low.

Google Quality Algorithm Updates

QUOTE: “We have 3 updates a day in average. I think it’s pretty safe to assume there was one recently…” Gary Illyes, Google 2017

Google has many algorithm updates during a year.

These ‘quality updates’ are very reminiscent of Google Panda updates and often impact many websites at the same time – and often these focus on demoting similar ‘low-quality’ techniques we have been told Panda focuses on.

Usually, Google has 3 or 4 big updates in a year that focus on various things, but they also make changes daily. See this list for a comprehensive history of Google algorithm updates.

QUOTE: “Yes, we do make small changes each day to search. It’s a continual process of improvement. Our core updates only happen 2-4 times per year.” Danny Sullivan, Google 2020

Note also that Google often releases multiple updates and changes to its own GUI (Graphical User Interface – the actual SERPs) at the same time, to keep us all guessing as to what is going on (and we do).

What Is Google Panda?

QUOTE: “Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly.” Google, 2016

Google Panda aims to rate the quality of your pages and website and is based on things about your site that Google can rate, or algorithmically identify.

QUOTE: “(Google Panda) measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages. So essentially, if you want a blunt answer, it will not devalue, it will actually demote. Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.”  Gary Illyes, Google 2016

We are told the current Panda is an attempt to basically stop low-quality thin content pages ranking for keywords they shouldn’t rank for.

Panda evolves – signals can come and go – and Google can get better at determining quality, as a spokesperson from Google has confirmed:

QUOTE: “So it’s not something where we’d say, if your website was previously affected, then it will always be affected. Or if it wasn’t previously affected, it will never be affected… sometimes we do change the criteria… category pages… (I) wouldn’t see that as something where Panda would say, this looks bad… Ask them the questions from the Panda blog post… usability, you need to work on.” John Mueller, Google.

From my original notes about Google Penguin, I list the original, somewhat abstract and metaphorical Panda ranking ‘factors’ published as a guideline for creating high-quality pages.

I also list these Panda points below (Google employees still publicly refer to this document):

(PS – I have emphasised two of the bullet points below, at the top and bottom, because I think it’s easier to understand these points as a question, how to work that question out, and ultimately, what Google really cares about – what their users think.)

  • “Would you trust the information presented in this article? (YES or NO)
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature? EXPERTISE
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations? LOW-QUALITY CONTENT/THIN CONTENT
  • Would you be comfortable giving your credit card information to this site? (HTTPS? OTHER TRUST SIGNALS (CONTACT / ABOUT / PRIVACY / COPYRIGHT / DISCLOSURES / DISCLAIMERS etc.) – especially relevant if your site is a YMYL page.)
  • Does this article have spelling, stylistic, or factual errors? (SPELLING + GRAMMAR + CONTENT QUALITY – perhaps wrong dates in content, on old articles, for instance)
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines? (OLD TACTICS|DOORWAY PAGES)
  • Does the article provide original content or information, original reporting, original research, or original analysis? (UNIQUE CONTENT, ORIGINAL RESEARCH & SATISFYING CONTENT)
  • Does the page provide substantial value when compared to other pages in search results? (WHAT’S THE RELATIVE QUALITY OF COMPETITION LIKE FOR THIS TERM?)
  • How much is quality control done on content? (WHEN WAS THIS LAST EDITED? Is CONTENT OUTDATED? IS SUPPLEMENTARY CONTENT OUTDATED (External links and images?))
  • Does the article describe both sides of a story? (IS THIS A PRESS RELEASE?)
  • Is the site a recognized authority on its topic? (EXPERTISE, AUTHORITY, TRUST)
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care? (IS THIS CONTENT BOUGHT FROM A $5 per article content factory? Or is written by an EXPERT or someone with a lot of EXPERIENCE of the subject matter?)
  • Was the article edited well, or does it appear sloppy or hastily produced? (QUALITY CONTROL on EDITORIALS)
  • For a health related query, would you trust information from this site? (EXPERTISE NEEDED)
  • Would you recognize this site as an authoritative source when mentioned by name? (EXPERTISE NEEDED)
  • Does this article provide a complete or comprehensive description of the topic? (Is the page text designed to help a visitor or shake them down for their cash?)
  • Does this article contain insightful analysis or interesting information that is beyond obvious? (LOW QUALITY CONTENT – You know it when you see it)
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend? (Would sharing this page make you look smart or dumb to your friends? This should be reflected in social signals)
  • Does this article have an excessive amount of ads that distract from or interfere with the main content? (OPTIMISE FOR SATISFACTION FIRST – CONVERSION SECOND – do not let the conversion get in the way of satisfying the INTENT of the page. For example – if you rank with INFORMATIONAL CONTENT with a purpose to SERVE those visitors – the visitor should land on your destination page and not be deviated from the PURPOSE of the page – and that was informational, in this example – to educate. SO – educate first – beg for social shares on those articles – and leave the conversion to merit and slightly more subtle influences, rather than massive banners or whatever else annoys users). We KNOW ads (OR DISTRACTING CALLS TO ACTION) convert well at the top of articles – but Google says it is sometimes a bad user experience. You run the risk of Google screwing with your rankings as you optimise for conversion, so be careful and keep everything simple and obvious.
  • Would you expect to see this article in a printed magazine, encyclopedia or book? (Is this a HIGH-QUALITY article?)… no? then expect ranking problems.
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics? (Is this a LOW or MEDIUM QUALITY ARTICLE? LOW WORD COUNTS ACROSS PAGES?)
  • Are the pages produced with great care and attention to detail vs. less attention to detail? (Does this page impress?)
  • Would users complain when they see pages from this site? (WILL THIS PAGE MAKE GOOGLE LOOK STUPID IF IT RANKS TOP?)”

All that sits quite nicely with the information you can read in the Google Search Quality Evaluator Guidelines.

If you fail to meet these standards (even some of them), your rankings can fluctuate wildly and often – we are told Google updates Panda every month, and those updates can often be spotted rolling in.

It all probably correlates quite nicely too, with the type of sites you don’t want links from.

QUOTE: “I think there is probably a misunderstanding that there’s this one site-wide number that Google keeps for all websites and that’s not the case.  We look at lots of different factors and there’s not just this one site-wide quality score that we look at. So we try to look at a variety of different signals that come together, some of them are per page, some of them are more per site, but it’s not the case where there’s one number and it comes from these five pages on your website.” John Mueller, Google 2016

Google is raising the quality bar, and forcing content creators to spend HOURS, DAYS or WEEKS longer on websites if they ‘expect’ to rank HIGH in natural results.

If someone is putting the hours in to rank their site through legitimate efforts – Google will want to reward that – because it keeps the barrier to entry HIGH for most other competitors.

Critics will say the higher the barrier to entry is to rank high in Google natural listings the more attractive Google Adwords begins to look to those other businesses.

Google says a quality rater does not affect your site, but if your site gets multiple LOW-QUALITY notices from manual reviewers – that stuff is coming back to get you later, surely.

Example ‘High Quality’ E-commerce Site

Google has the search quality rating guidelines. After numerous ‘leaks’, this previously ‘secretive’ document has now been made available for anyone to download.

This document gives you an idea of the type of quality websites Google wants to display in its search engine results pages.

I use these quality rating documents and the Google Webmaster Guidelines as the foundation of my audits for e-commerce sites.

What are these quality raters doing?

Quality Raters are rating Google’s ‘experiments’ and manually reviewing web pages that are presented to them in Google’s search engine results pages (SERPs). We are told that these ratings don’t impact your site, directly.

QUOTE: “Ratings from evaluators do not determine individual site rankings, but are used help us understand our experiments. The evaluators base their ratings on guidelines we give them; the guidelines reflect what Google thinks search users want.” GOOGLE.

What Does Google class as a high-quality product page on an e-commerce site?

This page and site appear to check all the boxes Google wants to see in a high-quality e-commerce website these days.

This product page is an example of YMYL page exhibiting “A satisfying or comprehensive amount of very high-quality MC (main content)” and “Very high level of expertise, highly authoritative/highly trustworthy for the purpose of the page” with a “Very positive reputation“.

Highest Quality Rating: Shopping (YMYL Page Example)

What Is “Domain Authority”?

QUOTE: “So domain authority….. it’s not really something that we have here at Google.” John Mueller, Google 2017

Domain authority, whether or not it is something Google actually has, is an important concept to take note of. Essentially Google ‘trusts’ some websites more than others, and you will find that it is easier to rank using some websites than it is others.

Google calls it ‘online business authority’:

QUOTE: “Amazon has a lot of “online business authority””…. (Official Google Webmaster Blog)

We conveniently call this effect ‘domain authority’ and it seemed to be related to ‘PageRank’ – the system Google started to rank the web with in 1998.

QUOTE: “PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B.” Google

Domain authority is an important ranking phenomenon in Google. Nobody knows exactly how Google calculates, ranks and rates the popularity, reputation, intent or trust of a website, outside of Google, but when I write about domain authority I am generally thinking of sites that are popular, reputable and trusted e.g. are also cited by popular, reputable and trusted sites.

QUOTE: “Last year, we studied almost one billion web pages and found a clear correlation between referring domains (links from unique websites) and organic search traffic.” Joshua Hardwick, Ahrefs, 2020

Historically it is a useful metaphor and proxy for quality and sometimes you can use it to work out the likelihood of a site ranking for a particular keyword based on its relative score when compared to competing sites and pages.

Historically, sites that had domain authority or online business authority had lots of links to them – hence why link building was so popular a tactic – and counting these links is generally how most third-party tools still calculate a pseudo domain authority score for websites today.

Massive domain authority and ranking ‘trust’ was awarded to very successful sites that had gained a lot of links from credible sources, and other online business authorities too.

We more usually talk about domain trust and domain authority based on the number, type and quality of incoming links to a site.

Examples of trusted, authority domains include Wikipedia, the W3C and Apple – these are very successful brands.

How did you take advantage of being an online business authority 10 years ago? You turned the site into a PageRank black hole to hoard the benefits of “domain authority” and published lots of content, sometimes with little thought to quality.

On any subject.

Because Google would rank it!

I think this ‘quality score’ Google has developed since at least 2010 could be Google’s answer to this sort of historical domain authority abuse.

QUOTE: “And we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons…” Matt Cutts, Google 2011

and

QUOTE: “The score is determined from quantities indicating user actions of seeking out and preferring particular sites and the resources found in particular sites. A site quality score for a particular site can be determined by computing a ratio of a numerator that represents user interest in the site as reflected in user queries directed to the site and a denominator that represents user interest in the resources found in the site as responses to queries of all kinds. The site quality score for a site can be used as a signal to rank resources, or to rank search results that identify resources, that are found in one site relative to resources found in another site.” Navneet Panda, Google, 2015

An effect akin to ‘domain authority’ is still visible, but this new phenomenon is probably based on site quality scores, potential authorship value scores, user interest and other classifiers, as well as PageRank.

QUOTE: “The best thing that you can do here is really make sure that the rest of your website is really as high quality and as fantastic as possible. Because if we see that everything else on your website is really fantastic, then when we find something new, we’ll say, well, probably this is pretty good, too.” John Mueller, Google 2018

Creating, or even mimicking, such a site takes a lot of work and a lot of time.

QUOTE: “Brands are the solution, not the problem, Brands are how you sort out the cesspool.” Eric Schmidt, Google 2008

Google is going to present users with sites that are recognisable to them. If you are a ‘brand’ in your space or well-cited site, Google wants to rank your stuff at the top as it won’t make Google look stupid.

Getting links from ‘Brands’ (or well-cited websites) in niches can also mean getting ‘quality links’.

Easier said than done, for most, of course, but that is the point of link building – to get these types of links – the type of links Google will not ignore:

QUOTE: “Our algorithms will probably say well all of the links are links that the person place themselves so maybe we should kind of ignore those links.” John Mueller, Google 2020

Does Google Prefer Big Brands In Organic SERPs?

Well, yes. It’s hard to imagine that a system like Google’s was not designed, over the last few years, to deliver exactly the listings it does today – listings often filled with content that ranks highly because of the domain it sits on.

Big brands also find it harder to take advantage of ‘domain authority’. It’s harder for most businesses because low-quality content on parts of a domain can negatively impact the rankings of an entire domain.

QUOTE: “I mean it’s kind of like we look at your web site overall. And if there’s this big chunk of content here or this big chunk kind of important wise of your content, there that looks really iffy, then that kind of reflects across the overall picture of your website. ”  John Mueller, Google

Google has introduced (at least) a ‘perceived’ risk to publishing lots of lower-quality pages on your site, in an effort to curb the production of keyword-stuffed content based on manipulating early search engine algorithms.

We are dealing with new algorithms designed to target old-style tactics that revolve around the truism that DOMAIN ‘REPUTATION’ plus LOTS of PAGES equals LOTS of KEYWORDS equals LOTS of Google traffic.

A big site can’t just get away with publishing LOTS of lower-quality content in the cavalier way it used to – not without the ‘fear’ of primary content being impacted and organic search traffic to important pages on the site being throttled.

Google still uses links and something akin to the original PageRank:

QUOTE: “DYK that after 18 years we’re still using* PageRank (and 100s of other signals) in ranking?” Gary Illyes, Google 2017

Google is very probably also using user metrics in some way to determine the ‘quality’ of your site:

QUOTE: “I think that’s always an option. Yeah. That’s something that–I’ve seen sites do that across the board,not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.” John Mueller, Google 

Your online reputation is evidently calculated by Google from many metrics.

Small businesses can still build this ‘domain reputation’ over time if they focus on a content strategy based on depth and quality, rather than breadth, when it comes to how content is published on their website.

I did, on my own site.

Instead of publishing LOTS of pages, focus on fewer pages that are of high quality. You can better predict your success in ranking for the long term for a particular keyword phrase this way.

Then you rinse and repeat.

Failure to meet these standards for quality content may impact rankings noticeably around major Google quality updates.

Is Domain Age An Important Google Ranking Factor?

No, not in isolation.

Having a ten-year-old domain that Google knows nothing about is almost the same as having a brand new domain.

A 10-year-old site that’s continually cited, year on year, by other more authoritative and trusted sites? That’s valuable.

But that’s not the age of your website address ON ITS OWN in-play as a ranking factor.

A one-year-old domain cited by authority sites is just as valuable if not more valuable than a ten-year-old domain with no links and no search-performance history.

Perhaps domain age comes into play when other factors are considered – but I think Google works very much like this on all levels, with all ‘Google ranking factors’ and all ranking ‘conditions’.

I don’t think you can consider discovering ‘ranking factors’ without ‘ranking conditions’.

Other Ranking Factors:

  1. Domain age; (NOT ON ITS OWN)
  2. Length of site domain registration; (I don’t see much benefit ON ITS OWN, even knowing “Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year.” Paying for a domain in advance just tells others you don’t want anyone else using this domain name; it is not much of an indication that you’re going to do something Google cares about.)
  3. Domain registration information was hidden/anonymous; (possibly, under human review if OTHER CONDITIONS are met like looking like a spam site)
  4. Site top level domain (geographical focus); (YES)
  5. Site top level domain (e.g. .com versus .info); (DEPENDS)
  6. Subdomain or root domain? (DEPENDS)
  7. Domain past records (how often it changed IP); (DEPENDS)
  8. Domain past owners (how often the owner was changed) (DEPENDS)
  9. Keywords in the domain; (DEFINITELY – ESPECIALLY EXACT KEYWORD MATCH – although Google has a lot of filters that mute the performance of an exact match domain)
  10. Domain IP; (DEPENDS – for most, no)
  11. Domain IP neighbours; (DEPENDS – for most, no)
  12. Domain external mentions (non-linked citations) (perhaps)
  13. Geo-targeting settings in Google Search Console (YES – of course)

Ranking Factors

QUOTE: “Rankings is a nuanced process and there is over 200 signals.” Maile Ohye, Google 2010

Google has HUNDREDS of ranking factors with signals that can change daily, weekly, monthly or yearly to help it work out where your page ranks in comparison to other competing pages in SERPs.

You will not ever find every ranking factor. Many ranking factors are on-page or on-site and others are off-page or off-site. Some ranking factors are based on where you are, or what you have searched for before.

QUOTE: “Google crawls the web and ranks pages. Where a page ranks in Google is down to how Google rates the page. There are hundreds of ranking signals….” Shaun Anderson, Hobo 2020

I’ve been in online marketing for over 20 years. In that time, a lot has changed. I’ve learned to focus on aspects that offer the greatest return on investment of your labour.

Learn The Basics

QUOTE: “While search engines and technology are always evolving, there are some underlying foundational elements that have remained unchanged from the earliest days….” Search Engine Journal 2020

Here are a few simple tips to begin with:

If you are just starting out, don’t think you can fool Google about everything all the time. Google has VERY probably seen your tactics before. So, it’s best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a healthy, satisfying visitor experience. If you are just starting out – you may as well learn how to do it within Google’s Webmaster Guidelines first. Make a decision, early, if you are going to follow Google’s guidelines, or not, and stick to it. Don’t be caught in the middle with an important project. Do not always follow the herd.

If your aim is to deceive visitors from Google, in any way, Google is not your friend. Google is hardly your friend at any rate – but you don’t want it as your enemy. Google will send you lots of free traffic though if you manage to get to the top of search results, so perhaps they are not all that bad.

A lot of techniques that are in the short term effective at boosting a site’s position in Google are against Google’s guidelines. For example, many links that may have once promoted you to the top of Google, may, in fact, today be hurting your site and its ability to rank high in Google. Keyword stuffing might be holding your page back. You must be smart, and cautious, when it comes to building links to your site in a manner that Google *hopefully* won’t have too much trouble with, in the FUTURE. Because they will punish you in the future.

Don’t expect to rank number 1 in any niche for a competitive keyword phrase without a lot of investment and work. Don’t expect results overnight. Expecting too much too fast might get you in trouble with the Google webspam team.

You don’t pay anything to get into Google, Yahoo or Bing natural, or free, listings. It’s common for the major search engines to find your website pretty quickly by themselves within a few days. This is made so much easier if your CMS actually ‘pings’ search engines when you update content (via XML sitemaps or RSS, for instance).

To be listed and rank high in Google and other search engines, you really should consider and mostly abide by search engine rules and official guidelines for inclusion. With experience and a lot of observation, you can learn which rules can be bent, and which tactics are short-term and perhaps, should be avoided.

Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from a page to another page is viewed in Google’s “eyes” as a vote for the page the link points to. The more votes a page gets, the more trusted a page can become, and the higher Google will rank it – in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.

I’ve always thought if you are serious about ranking – do so with ORIGINAL COPY. It’s clear – search engines reward good content it hasn’t found before. It indexes it blisteringly fast, for a start (within a second, if your website isn’t penalised!). So – make sure each of your pages has enough text content you have written specifically for that page – and you won’t need to jump through hoops to get it ranking.

If you have original, quality content on a site, you also have a chance of generating inbound quality links (IBL). If your content is found on other websites, you will find it hard to get links, and it probably will not rank very well as Google favours diversity in its results. If you have original content of sufficient quality on your site, you can then let authority websites – those with online business authority – know about it, and they might link to you – this is called a quality backlink.

Search engines need to understand that ‘a link is a link’ that can be trusted. Links can be designed to be ignored by search engines with the rel nofollow attribute.

Search engines can also find your site by other websites linking to it. You can also submit your site to search engines directly, but I haven’t submitted any site to a search engine in the last ten years – you probably don’t need to do that. If you have a new site, I would immediately register it with Google Search Console these days.

Google and Bing use a crawler (Googlebot and Bingbot) that spiders the web looking for new links to find. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages outwith an XML sitemap.

Many think that Google won’t allow new websites to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while then disappears for months. A “honeymoon period” to give you a taste of Google traffic, perhaps, or a period to better gauge your website quality from an actual user perspective.

Google WILL classify your site when it crawls and indexes your site – and this classification can have a DRASTIC effect on your rankings. It’s important for Google to work out WHAT YOUR ULTIMATE INTENT IS – do you want to be classified as a thin affiliate site made ‘just for Google’, a domain holding page or a small business website with a real purpose? Ensure you don’t confuse Google in any way by being explicit with all the signals you can – to show on your website you are a real business, and your INTENT is genuine – and even more important today – FOCUSED ON SATISFYING A VISITOR.

NOTE – If a page exists only to make money from Google’s free traffic – Google calls this spam. I go into this more, later in this guide.

The transparency you provide on your website in text and links about who you are, what you do, and how you’re rated on the web or as a business is one way that Google could use (algorithmically and manually) to ‘rate’ your website. Note that Google has a HUGE army of quality raters and at some point they will be on your site if you get a lot of traffic from Google.

To rank for specific keyword phrase searches, you usually need to have the keyword phrase or highly relevant words on your page (not necessarily all together, but it helps) or in links pointing to your page/site.

Ultimately what you need to do to compete is largely dependent on what the competition for the term you are targeting is doing. You’ll need to at least mirror how hard they are competing if a better opportunity is hard to spot.

As a result of other quality sites linking to your site, the site now has a certain amount of real PageRank that is shared with all the internal pages that make up your website, and that will help provide a signal as to where those pages rank in the future.

Yes, you need to attract links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling actions they take a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action on your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Google Search Console if you sign up.

Link building is not JUST a numbers game, though. One link from a “trusted authority” site in Google could be all you need to rank high in your niche. Of course, the more “trusted” links you attract, the more Google will trust your site. It is evident you need MULTIPLE trusted links from MULTIPLE trusted websites to get the most from Google.

Try and get links within page text pointing to your site with relevant, or at least natural-looking, keywords in the text link – not, for instance, in blogrolls or site-wide links. Try to ensure the links are not obviously “machine generated” e.g. site-wide links on forums or directories. Get links from pages that, in turn, have a lot of links to them, and you will soon see benefits.

Onsite, consider linking to your other pages by linking to pages within main content text. I usually only do this when it is relevant – often, I’ll link to relevant pages when the keyword is in the title elements of both pages. I don’t go in for auto-generating links at all. Google has penalised sites for using particular auto link plugins, for instance, so I avoid them.

Linking to a page with actual key-phrases in the link helps a great deal in all search engines when you want to feature for specific key terms. For example, “keyword-phrase” as opposed to a naked URL or “click here”. Beware, though, that Google is punishing manipulative anchor text very aggressively, so be sensible – stick to brand mentions and plain URL links that build authority with less risk. I do not optimise for grammatically incorrect terms these days (especially with links).

I think the anchor text of links in internal navigation is still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever internal link keyword-rich architecture and be sure to understand, for instance, how many words Google counts in a link, but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.

Search engines like Google ‘spider’ or ‘crawl’ your entire site by following all the links on your site to new pages, much as a human would click on the links to your pages. Google will crawl and index your pages, and within a few days usually, begin to return your pages in SERPs.

After a while, Google will know about your pages, and keep the ones it deems ‘useful’ – pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content.

Ideally, you will have unique pages, with unique page titles and unique page meta descriptions. Google does not seem to use the meta description when ranking your page for specific keyword searches if it is not relevant, and, unless you are careful, you might end up just giving spammers free original text for their site once they scrape your descriptions and put the text in the main content on their site. I don’t worry about meta keywords these days as Google and Bing say they either ignore them or use them as spam signals.

Google will take some time to analyse your entire site, examining text content and links. This process is taking longer and longer these days but is ultimately determined by your domain reputation and real PageRank.

If you have a lot of duplicate low-quality text already found by Googlebot on other websites it knows about, Google will ignore your page. If your site or page has spammy signals, Google will penalise it, sooner or later. If you have lots of these pages on your site – Google will ignore most of your website.

You don’t need to keyword stuff your text to beat the competition.

You optimise a page for more traffic by increasing the frequency of the desired key phrase, related key terms, co-occurring keywords and synonyms in links, page titles and text content. There is no ideal amount of text – no magic keyword density. Keyword stuffing is a tricky business, too, these days.

I prefer to make sure I have as many UNIQUE relevant words on the page that make up as many relevant long tail queries as possible.

If you link out to irrelevant sites, Google may ignore the page, too – but again, it depends on the site in question. Who you link to, or HOW you link to, REALLY DOES MATTER – I expect Google to use your linking practices as a potential means by which to classify your site. Affiliate sites, for example, don’t do well in Google these days without some good quality backlinks and higher quality pages.

Many search engine marketers think who you link out to (and who links to you) helps determine a topical community of sites in any field or a hub of authority. Quite simply, you want to be in that hub, at the centre if possible (however unlikely), but at least in it. I like to think of this one as a good thing to remember in the future as search engines get even better at determining topical relevancy of pages, but I have never actually seen any granular ranking benefit (for the page in question) from linking out.

I’ve got by by thinking external links to other sites should probably be on single pages deeper in your site architecture, with those pages receiving all your Google Juice once it’s been “soaked up” by the higher pages in your site structure (the home page, your category pages). This tactic is old school, but I still follow it. I don’t think you need to worry about that too much.

Original content is king and will attract “natural link growth” – in Google’s opinion. Too many incoming links too fast might devalue your site – but again, I usually err on the safe side – I always aimed for massive diversity in my links – to make them look ‘more natural’. Honestly, I go for natural links full stop, for this website.

Google can devalue whole sites, individual pages, template generated links and individual links if Google deems them “unnecessary” and a ‘poor user experience’.

Google knows who links to you, the “quality” of those links, and whom you link to. These – and other factors – help ultimately determine where a page on your site ranks. To make it more confusing – the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for this term. Once Google has worked out your domain authority – sometimes it seems that the most relevant page on your site Google HAS NO ISSUE with will rank.

Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages and ensuring at least one page is well optimised amongst the rest of your pages for your desired key phrase. Always remember Google does not want to rank ‘thin’ pages in results – any page you want to rank – should have all the things Google is looking for. That’s a lot these days!

It is important you spread all that real ‘PageRank’ – or link equity – to your keyword-rich sales pages, while ensuring as much as possible remains with the rest of the site’s pages, so Google does not ‘demote’ pages into oblivion – or ‘supplemental results’ as we old-timers knew them back in the day. Again – this is slightly old school – but it gets me by, even today.

Consider linking to important pages on your site from your home page, and other important pages on your site.

Focus on RELEVANCE first. Then, focus your marketing efforts and get REPUTABLE. This is the key to ranking ‘legitimately’ in Google.

Google changes its algorithm to mix things up. Google Panda and Google Penguin are two such updates, but the important thing is to understand Google changes its algorithms constantly to control its listings pages (over 600 changes a year we are told).

The art of rank modification is to rank without tripping these algorithms or getting flagged by a human reviewer – and that is tricky!

Focus on improving website download speeds at all times. The web is changing very fast, and a fast website is a good user experience.

Search Engine Submission

Don’t get scammed:

QUOTE: “You don’t pay to get into any of the big search engines natural (free or organic) listings. Google has ways of submitting your web pages directly to their index. Most search engines do.” Shaun Anderson, Hobo 2020

On-Page; Technical SEO

If you are doing a professional audit for a real business, you are going to have to think like a Google Search Quality Rater AND a Google search engineer to provide real long-term value to a client.

When making a site for Google you really need to understand that Google has a long list of things it will mark sites down for, and that’s usually old-school tactics which are now classed as ‘webspam‘.

Conversely, sites that are not marked “low-quality” are not demoted and so will improve in rankings. Sites with higher rankings often pick up more organic links, and this process can float high-quality pages on your site quickly to the top of Google.

So the sensible thing for any webmaster is to NOT give Google ANY reason to DEMOTE a site. Tick all the boxes Google tell you to tick, so to speak.

On-Page; Unique Content

QUOTE: “Duplicated content is often not manipulative and is commonplace on the web and often free from malicious intent. It is not penalised, but it is not optimal. Copied content can often be penalised algorithmically or manually. Don’t be ‘spinning’ ‘copied’ text to make it unique!” Shaun Anderson, Hobo 2020

On-Page; A Fast Site

QUOTE: “The mobile version of a website should ideally load in under 3 seconds and the faster the better. A VERY SLOW SITE can be a NEGATIVE Ranking factor. There is no set threshold or speed score to meet, just to make your page as fast as possible.” Shaun Anderson, Hobo 2020

On-Page; Mobile-Friendly, Responsive Web Pages

QUOTE: “There’s no best screen size to design for. Websites should transform responsively and fast on multiple resolutions on different browsers and platforms. Mobile-friendly and accessible. Design for your audience, first.” Shaun Anderson, Hobo 2020

On-Page; Canonical Link Element

QUOTE: “The canonical link element is extremely powerful and very important to include on your page. Every page on your site should have a canonical link element, even if it is self referencing. It’s an easy way to consolidate ranking signals from multiple versions of the same information. Note: Google will ignore misused canonicals given time.” Shaun Anderson, Hobo 2020
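
To illustrate, a minimal sketch of a self-referencing canonical (the example.com URL is a placeholder, not a recommendation):

  <!-- in the <head> of https://www.example.com/widgets/ -->
  <link rel="canonical" href="https://www.example.com/widgets/" />

Any duplicate or parameter-laden versions of that page would point their canonical at the same clean URL, consolidating ranking signals on one version.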

On-Page; Page Title Element

QUOTE: “To maximise usability across devices, stick to a shorter concise title of between 50-60 characters. Expect Google to count up to 12 words maximum in a title element. There should only be one page title element on a page.” Shaun Anderson, Hobo 2020
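
As a sketch, a single succinct title element within the 50-60 character guideline (the wording is invented for illustration):

  <title>Handmade Blue Widgets – Buy Online | Example Brand</title>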

On-Page; Meta Keywords Tag

QUOTE: “In all the years I’ve been testing, Google ignores keywords in meta keywords. Google has also confirmed they ignore meta keywords. You can safely ignore optimising meta keywords across a site. I often remove them from a site, too.” Shaun Anderson, Hobo 2020

On-Page; Meta Description Tag

QUOTE: “Create a 50-word (300 character) summary of your page using short sentences. Add it to the article and repeat it in the meta description. Avoid repetition. Be succinct. Do not keyword stuff. Make meta tags unique to the page. Review your SERP snippet. Programmatically generate meta descriptions on larger sites.” Shaun Anderson, Hobo 2020
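
For illustration, what that looks like in the head of a page (the summary text is a placeholder):

  <meta name="description" content="A short, unique summary of this page in plain sentences, written to earn the click from the search snippet.">

Remember Google may rewrite your snippet if it decides the description is not relevant to a particular query.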

On-Page; Robots Meta Tag

QUOTE: “There are various ways you can use a Robots Meta Tag but remember Google by default will index and follow links, so you have no need to include that as a command – you can leave the robots meta out completely – and probably should if you don’t know anything about it.” Shaun Anderson, Hobo 2020
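
To make the point concrete, here are both the redundant default and a non-default instruction (only the second case is ever worth adding):

  <!-- redundant: index, follow is the default behaviour -->
  <meta name="robots" content="index, follow">

  <!-- useful: add a robots meta tag only to override the default -->
  <meta name="robots" content="noindex, nofollow">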

On-Page; H1-H6: Page Headings

QUOTE: “Heading tags have influence when it comes to ranking in Google. Though it doesn’t matter to Google, I stick to one ‘H1’ on the page and use headings ‘H2’ and ‘H3’ where appropriate. Keep headings in order. Avoid using headings for design elements. Write naturally with keywords in headings if relevant. Avoid keyword stuffing.” Shaun Anderson, Hobo 2020
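
A minimal sketch of headings kept in a sensible order (the heading text is invented):

  <h1>Main Topic of the Page</h1>
  <h2>First Subtopic</h2>
  <h3>A Detail Under the First Subtopic</h3>
  <h2>Second Subtopic</h2>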

On-Page; Alt Tags & ALT Attribute Text

QUOTE: “Keep ALT Attribute text on web pages to between 80-125 characters maximum for accessibility purposes. Expect Google to count a maximum of 16 words (approximately 125+ characters) as part of your ALT text that will be a signal used to rank your page for a relevant query.” Shaun Anderson, Hobo 2020
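
For example, descriptive ALT text comfortably under that ceiling (the file name and wording are placeholders):

  <img src="red-widget-front.jpg" alt="Red widget with chrome trim, photographed from the front">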

On-Page; Keyword Rich, Search Engine Friendly URLs

QUOTE: “Keywords in URLS are a tiny ranking factor and user-friendly. Keep under 50 characters to display fully on desktop search. Long URLS are truncated. I use them on new sites. I avoid changing URLs on an otherwise perfectly functional website.” Shaun Anderson, Hobo 2020

On-Page; Absolute Or Relative URLs

QUOTE: “My advice would be to keep it consistent whatever you decide to use. Listen to your developers.” Shaun Anderson, Hobo 2020
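
For clarity, the two forms look like this (example.com is a placeholder; both point at the same page when used consistently):

  <!-- absolute URL -->
  <a href="https://www.example.com/services/">Our services</a>

  <!-- root-relative URL, resolved against the current domain -->
  <a href="/services/">Our services</a>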

On-Page; Keyword Density

QUOTE: “There is no ‘best’ keyword density. Write naturally and include the keyword phrase once or twice to rank in Google and avoid demotion. If you find you are repeating keyword phrases you are probably keyword stuffing your text.” Shaun Anderson, Hobo 2020

On-Page; Keyword Stuffing (Irrelevant Keywords)

QUOTE: “Keyword stuffing is simply the process of repeating the same keyword or key phrases over and over in a page. It’s counterproductive. It is a signpost of a very low-quality spam site and is something Google clearly recommends you avoid.” Shaun Anderson, Hobo 2020

On-Page; Link Title Attributes, Acronym & ABBR Tags

Does Google Count Text in The Acronym Tag?

From my tests, no. From observing how my test page ranks – Google is ignoring keywords in the acronym tag.

My observations from a test page include:

  • Link Title Attribute – no benefit passed via the link either to another page, it seems. Potentially useful for image search.
  • ABBR (Abbreviation Tags) – No
  • Image File Name – No
  • Wrapping words (or at least numbers) in SCRIPT – Sometimes. Google is better at understanding what it can render.

It’s clear many invisible elements of a page are completely ignored by Google.

Some invisible items are (still) apparently supported:

  • NOFRAMES – Yes
  • NOSCRIPT – Yes
  • ALT Attribute – Yes

Unless you really have cause to focus on any particular invisible element, I think the P tag is the most important tag to optimise!

On-Page; Supplementary Content (SC)

I think this is incredibly important to note, even though it was de-emphasised in more recent versions of the quality rater guidelines:

QUOTE: “With this version we see some interesting changes.  Most noticeably is the de-emphasis of supplementary content, surprising since previous versions have stressed the importance of the additional supplementary content there is on the page – or the negative impact that content has.” Jennifer Slegg, 2016

An example of “supplementary” content is “navigation links that allow users to visit other parts of the website” and “footers” and “headers.”

Also consider your CTA (Call To Actions) on page and WHICH pages on your site you are sending visitors to from the target page. Remember this – low quality pages can negatively impact the rankings of other pages on your site.

These guidelines have been removed from the rater guidelines.

What is SC (supplementary content)?

QUOTE: “Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is controlled by webmasters and is an important part of the user experience. One common type of SC is navigation links that allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page. Sometimes the easiest way to identify SC is to look for the parts of the page that are not MC or Ads.” Google Search Quality Evaluator Guidelines 2019

When it comes to a web page and positive UX, Google talks a lot about the functionality and utility of Helpful Supplementary Content – e.g. helpful navigation links for users (that are not, generally, MC or Ads).

Most of this advice is relevant to the desktop version of your site and has actually been removed from recent quality rater guidelines, but I think it is still worth noting, even with mobile-first indexing.

QUOTE: “To summarize, a lack of helpful SC may be a reason for a Low quality rating, depending on the purpose of the page and the type of website. We have different standards for small websites which exist to serve their communities versus large websites with a large volume of webpages and content. For some types of “webpages,” such as PDFs and JPEG files, we expect no SC at all.” Google Search Quality Evaluator Guidelines 2015

It is worth remembering that Good supplementary content cannot save Poor main content from a low-quality page rating:

QUOTE: “Main Content is any part of the page that directly helps the page achieve its purpose“. Google Search Quality Evaluator Guidelines 2020

Good SC seems to certainly be a sensible option. It always has been.

Key Points about SC

  1. Supplementary Content can be a large part of what makes a High-quality page very satisfying for its purpose.
  2. Helpful SC is content that is specifically targeted to the content and purpose of the page.
  3. Smaller websites such as websites for local businesses and community organizations, or personal websites and blogs, may need less SC for their purpose.
  4. A page can still receive a High or even Highest rating with no SC at all.

Here are the specific quotes containing the term SC:

  1. Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose.
  2. SC is created by Webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.
  3. SC which contributes to a satisfying user experience on the page and website. – (A mark of a high-quality site – this statement was repeated 5 times)
  4. However, we do expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.
  5. However, some pages are deliberately designed to shift the user’s attention from the MC to the Ads, monetized links, or SC. In these cases, the MC becomes difficult to read or use, resulting in a poor user experience. These pages should be rated Low.
  6. Misleading or potentially deceptive design makes it hard to tell that there’s no answer, making this page a poor user experience.
  7. Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these “sneaky redirects.” Sneaky redirects are deceptive and should be rated Lowest.
  8. However, you may encounter pages with a large amount of spammed forum discussions or spammed user comments. We’ll consider a comment or forum discussion to be “spammed” if someone posts unrelated comments which are not intended to help other users, but rather to advertise a product or create a link to a website. Frequently these comments are posted by a “bot” rather than a real person. Spammed comments are easy to recognize. They may include Ads, download, or other links, or sometimes just short strings of text unrelated to the topic, such as “Good,” “Hello,” “I’m new here,” “How are you today,” etc. Webmasters should find and remove this content because it is a bad user experience.
  9. The modifications make it very difficult to read and are a poor user experience. (Lowest quality MC (copied content with little or no time, effort, expertise, manual curation, or added value for users))
  10. Sometimes, the MC of a landing page is helpful for the query, but the page happens to display porn ads or porn links outside the MC, which can be very distracting and potentially provide a poor user experience.
  11. The query and the helpfulness of the MC have to be balanced with the user experience of the page.

On-Page; Optimise Supplementary Content on the Page

This is important for desktop, but for mobile too. Once you have the content, you need to think about supplementary content and secondary links that help users on their journey of discovery.

That content CAN be links to your own content on other pages, but if you are really helping a user understand a topic – you should be LINKING OUT to other helpful resources e.g. other websites.

A website that does not link out to ANY other website could be interpreted accurately to be at least, self-serving. I can’t think of a website that is the true end-point of the web.

  • TASK – On informational pages, LINK OUT to related information pages on other sites AND on other pages on your own website where RELEVANT
  • TASK – For e-commerce pages, ADD RELATED PRODUCTS.
  • TASK – Create In-depth Content Pieces with on-page navigation arrays to named anchors on the page
  • TASK – Keep Content Up to Date, Minimise Ads, Maximise Conversion, Monitor For broken, or redirected links
  • TASK – Assign in-depth content to an author with some online authority, or someone with displayable expertise on the subject
  • TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all this together in a single topic-centred page helping a user to understand something related to what you sell.
  • TASK – Do not over-optimise the link-relationship between your YMYL pages and your INFO-type article content.

On-Page; Main Content (MC)

QUOTE: “(Main CONTENT) is (or should be!) the reason the page exists.” Google Search Quality Evaluator Guidelines 2019

What Is Google Focused On?

Google is concerned with the PURPOSE of a page, the MAIN CONTENT (MC) of a page, the SUPPLEMENTARY CONTENT of a page and HOW THAT PAGE IS monetised, and if that monetisation impacts the user experience of consuming the MAIN CONTENT.

Webmasters need to be careful when optimising a website for CONVERSION first if that gets in the way of a user’s consumption of the main content on the page.

Google also has a “Page Layout Algorithm” that demotes pages with a lot of advertising “above the fold” or that forces users to scroll past advertisements to get to the Main Content of the page.

High-quality supplementary content should “(contribute) to a satisfying user experience on the page and website.” and it should NOT interfere or distract from the MC.

Google says,“(Main CONTENT) is (or should be!) the reason the page exists.” so this is probably the most important part of the page, to Google.

On-Page; Tell Visitors When Content Was Published or Last Updated

QUOTE: “The date should be unambiguous and easy to understand for maximum accessibility, hence why I always include the month in words rather than only numeric values. If I did rely only on numbers for dates I could end up displaying modification dates as 11/11/11.” Shaun Anderson, Hobo 2020

Googlers may have recently indicated perhaps the best way to add dates to your documents:

QUOTE: “DD/MM/YYYY is hard to misunderstand and it’s easy to parse.“. Gary Illyes, Google 2020
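
One way to satisfy both points in HTML – the month in words for humans, and an unambiguous machine-readable value for parsers – is the time element (a sketch; the date shown is illustrative, and note the datetime attribute itself uses the ISO YYYY-MM-DD format):

  <p>Last updated: <time datetime="2020-03-11">11 March 2020</time></p>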

On-Page; Copywriting

QUOTE: “The art of writing high-quality content for search engines on-page, in page titles, meta descriptions, SERP snippets and SERP featured snippets.” Shaun Anderson, Hobo 2020

On-Page; Keyword Research

QUOTE: “Adding one word to your page can make all the difference between ranking No1 in Google and not ranking at all. Customers won’t find you unless YOU optimise for the phrases THEY use.” Shaun Anderson, Hobo 2020

On-Page; Broken Links

QUOTE: “Broken links are a waste of link power and could hurt your site, drastically in some cases, if a poor user experience is identified by Google. Google is a link based search engine – if your links are broken, you are missing out on the benefit you would get if they were not broken.” Shaun Anderson, Hobo 2020

On-Page; Point Internal Links To Relevant Pages

QUOTE: “We do use internal links to better understand the context of content of your sites” John Mueller, Google 2015

I link to relevant internal pages in my site when necessary.

QUOTE: “Internal link building is the age-old art of getting pages crawled and indexed by Google.” Shaun Anderson, Hobo 2020

I expect the full benefit of anchor text to flow from info-to-info-type articles, not info-to-YMYL pages.

QUOTE: “I’d forget everything you read about “link juice.” It’s very likely all obsolete, wrong, and/or misleading. Instead, build a website that works well for your users.” John Mueller, Google 2020

On-Page; Link Out To Related Sites

QUOTE: “I regularly link out to other quality relevant pages on other websites where possible and where a human would find it valuable.” Shaun Anderson, Hobo 2020

On-Page; Nofollow Links

QUOTE: “Google recommends that webmasters qualify outbound links so that these type of links do not artificially affect rankings. Simply: Use rel=”sponsored” or rel=”nofollow” for paid links. Use rel=”ugc” or rel=”nofollow” for user generated content links.” Shaun Anderson, Hobo 2020
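
As a sketch, the three qualifiers look like this in page markup (URLs and anchor text are placeholders):

  <!-- paid or sponsored placement -->
  <a href="https://example.com/offer" rel="sponsored">Partner offer</a>

  <!-- link added by a user, e.g. in a comment or forum post -->
  <a href="https://example.com/" rel="ugc">Commenter’s website</a>

  <!-- generic hint not to pass ranking credit -->
  <a href="https://example.com/" rel="nofollow">Untrusted link</a>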

On-Page; Website Accessibility

QUOTE: “It makes sense to create websites that are accessible to as many people as possible…. The W3c lay down the general guidelines for websites that most people adhere to.” Shaun Anderson, Hobo 2020

On-Page; Valid HTML & CSS

QUOTE: “Well, a lot of websites can have invalid code but actually render just fine, because a lot of modern browsers do a good job dealing with bad code. And so, it’s not so much that the code has to be absolutely perfect, but whether or not the page is going to render well for the user in general.” Danny Sullivan, Google 2011

Does Google rank a page higher because of valid code? The short answer is no.

QUOTE: “While there are benefits, Google doesn’t really care if your page is valid HTML and valid CSS. This is clear – check any top ten results in Google and you will probably see that most contain invalid HTML or CSS.” Shaun Anderson, Hobo 2020

On-Page; Which Is Better For Google? PHP, HTML or ASP?

Google doesn’t care. As long as it renders as a browser-compatible document, it appears Google can read it these days. I prefer PHP, even for flat documents, as it is easier to add server-side code to that document if I want to add some sort of function to the site. Beware of using client-side JavaScript to render pages, as content served that way is often not readable by Googlebot.
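
As a trivial sketch of why I find PHP convenient (header.php and footer.php are hypothetical shared files), a flat page can pull in shared server-side elements with one-line includes:

<?php include 'header.php'; // shared site header ?>
<p>Main page content here.</p>
<?php include 'footer.php'; // shared site footer ?>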

On-Page; Javascript

QUOTE: “The first important challenge to note about Javascript is that not every search engine treats JS the way Google does. The sensible thing is to keep things as simple as possible for maximum effectiveness.” Shaun Anderson, Hobo 2020

On-Page; Rich Snippets

Rich Snippets and Schema Markup can be intimidating if you are new to them – but important data about your business can actually be very simply added to your site:

QUOTE: “Google Search works hard to understand the content of a page. You can help us by providing explicit clues about the meaning of a page to Google by including structured data on the page. Structured data is a standardized format for providing information about a page and classifying the page content; for example, on a recipe page, what are the ingredients, the cooking time and temperature, the calories, and so on.” Google Developer Guides, 2020
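
For example, a minimal JSON-LD sketch for a local business (every detail below is a placeholder – check Google’s structured data documentation for the properties relevant to your business type). You can validate markup like this with Google’s Rich Results Test before publishing:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business Ltd",
  "url": "https://www.example.com/",
  "telephone": "+44 000 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Glasgow",
    "postalCode": "G1 1AA",
    "addressCountry": "GB"
  }
}
</script>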

On-Page; Check How Your Page Renders Using GSC Url Inspection Tool

QUOTE: “It is important that what Google (Googlebot) sees is (exactly) what a visitor would see if they visit your site. Blocking Google can sometimes result in a real ranking problem for websites.” Shaun Anderson, Hobo 2020

  • Pro Tip: The URL Inspection Tool (successor to the old Fetch and Render tool) is also a great way to submit your new content to Google for indexing so that the page may quickly appear in Google search results.
  • Pro Tip: Check the actual cache results in Google SERPs to double-check how Google renders your web page.

On-Page; Check How Your Page Renders On Other Browsers

QUOTE: “Your website CAN’T look the same in ALL of these browsers, but if it looks poor in most of the popular browsers, then you might have a problem.” Shaun Anderson, Hobo 2020

On-Site; Redirect Non-WWW To WWW (or Vice Versa)

QUOTE: “This is a MUST HAVE best practice. Basically, you are redirecting all ranking signals to one canonical version of a URL. It keeps it simple when optimising for Google. Do not mix the two types of www/non-www on-site when linking your internal pages.” Shaun Anderson, Hobo 2020
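
On an Apache server, a minimal .htaccess sketch (assuming mod_rewrite is enabled; example.com stands in for your domain) that 301 redirects the non-www hostname to the www version:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]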

On-Site; 301 Redirects

QUOTE: “301 Redirects are an incredibly important and often overlooked area of search engine optimisation….. You can use 301 redirects to redirect pages, sub-folders or even entire websites and preserve Google rankings that the old page, sub-folder or website(s) enjoyed. Rather than tell Google via a 404, 410 or some other instruction that a page isn’t here anymore, you can permanently redirect an expired page to a relatively similar page to preserve any link equity that old page might have.” Shaun Anderson, Hobo 2020
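
For individual expired pages on Apache, the simplest sketch is a one-line rule in .htaccess (both paths are placeholders):

# Permanently redirect an expired page to its closest equivalent
Redirect 301 /old-page.html https://www.example.com/new-page/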

On-Site; Have A Useful 404 Page

QUOTE: “It is incredibly important to create useful and proper 404 pages. This will help prevent Google recording lots of autogenerated thin pages on your site (both a security risk and a rankings risk).” Shaun Anderson, Hobo 2020
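
On Apache, a one-line .htaccess sketch points the server at a custom, helpful 404 page (the filename is a placeholder). Because the path is local, Apache keeps serving the real 404 HTTP status code rather than a misleading 200:

ErrorDocument 404 /404.html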

On-Site; XML Sitemap

QUOTE: “An XML Sitemap is a file on your server with which you can help Google easily crawl & index all the pages on your site. This is evidently useful for very large sites that publish lots of new content or updates content regularly.” Shaun Anderson, Hobo 2020
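
A minimal sitemap sketch (the URLs and dates are placeholders), typically saved as sitemap.xml in your web root and submitted via Google Search Console:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>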

On-Site; Robots.txt File

QUOTE: “A robots.txt file is a file on your webserver used to control bots like Googlebot, Google’s web crawler. You can use it to block Google and Bing from crawling parts of your site.” Shaun Anderson, Hobo 2020
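
A minimal robots.txt sketch (the blocked directories are placeholders; remember robots.txt controls crawling, not necessarily indexing – a blocked URL can still appear in results if other sites link to it):

User-agent: *
Disallow: /internal-search/
Disallow: /cgi-bin/

Sitemap: https://www.example.com/sitemap.xml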

On-Site; Website Indexation Challenges

QUOTE: “Make sure Google can crawl your website, index and rank all your primary pages by only serving Googlebot high-quality, user friendly and fast loading pages to index.” Shaun Anderson, Hobo 2020

On-Site; Enough Satisfying Website Information for the Purpose of the Website

QUOTE: “If a page has one of the following characteristics, the Low rating is usually appropriate…..(if) There is an unsatisfying amount of website information for the purpose of the website.” Google Quality Rater Guide, 2017

Google wants evaluators to find out who owns the website and who is responsible for the content on it.

Your website also needs to meet the legal requirements necessary to comply with laws in your country. It’s easy to just incorporate this required information into your footer.

QUOTE: “Companies…. must include certain regulatory information on their websites and in their email footers before 1st January 2007 or they will breach the Companies Act and risk a fine.” Outlaw, 2006

Here’s what you need to know regarding website and email footers to comply with the Companies Act:

———————————-

  1. The company name and physical geographic address (a PO Box is unlikely to suffice as a geographic address, but a registered office address would – if the business is a company, the registered office address must be included)
  2. The company’s registration number should be given and, under the Companies Act, the place of registration should be stated
  3. Email address of the company (It is not sufficient to include a ‘contact us’ form without also providing an email address and geographic address somewhere easily accessible on the site)
  4. The name of the organisation with which the customer is contracting must be given. This might differ from the trading name. Any such difference should be explained
  5. If your business has a VAT number, it should be stated even if the website is not being used for e-commerce transactions.
  6. Prices on the website must be clear and unambiguous. Also, state whether prices are inclusive of tax and delivery costs.

———————————-

The above information does not need to feature on every page; rather, it should appear on a clearly accessible page. However – with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post) – ANY signal you can send to an algorithm or human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
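
As an illustration only (every detail below is hypothetical and UK-oriented – take advice for your own jurisdiction), a footer carrying that information might look like:

<footer>
  <p>Example Business Ltd, 1 Example Street, Glasgow, G1 1AA</p>
  <p>Registered in Scotland, Company No. SC000000 | VAT No. GB000000000</p>
  <p>Email: <a href="mailto:info@example.com">info@example.com</a></p>
</footer>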

Note: If the business is a member of a trade or professional association, membership details, including any registration number, should be provided.

Consider also the Distance Selling Regulations which contain other information requirements for online businesses that sell to consumers (B2C, as opposed to B2B, sales).

On-Page; Dynamic PHP Copyright Notice

QUOTE: “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.” Google Quality Rater Guidelines

If you run a legitimate online business you’ll want to ensure your website never looks obviously out of date. A dynamic copyright notice is an option.

While you are editing your footer – ensure your copyright notice is dynamic and will change year to year – automatically.

It’s simple to display a dynamic date in your footer in WordPress, for instance, so you never need to change your copyright notice on your blog when the year changes.

This little bit of code will display the current year (the start year in the example below is the site’s launch year – change it to suit). Just add it to your theme’s footer.php and you can stop worrying about looking stupid, or giving the impression your site is out of date and unused, at the beginning of every year.

&copy; Copyright 2000 - <?php echo date("Y") ?>

A simple and elegant PHP copyright notice for WordPress blogs.
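
If you would rather the notice never reads “2000 - 2000” in the launch year itself, here is a slightly fuller sketch (the start year is a placeholder for your own launch year):

<?php
$start = 2000;            // replace with the year your site launched
$now   = (int) date('Y'); // current year, updates automatically
echo '&copy; Copyright ' . $start . ($now > $start ? ' - ' . $now : '');
?>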

On-Page; Blogging

QUOTE: “I used it to power visits and sales for my company a bit before it was commonplace. Google (I include Google Feedburner in this) has sent this site almost 8 million unpaid organic visitors since I started blogging. There’s no single way to blog but here’s what I learned and how I did it if you are totally new to blogging.” Shaun Anderson, Hobo 2020

On-Page; How To Manage UGC (“User-Generated Content”)

QUOTE – “So it’s not that our systems will look at your site and say oh this was submitted by a user therefore the site owner has like no no control over what’s happening here but rather we look at it and say well this is your website this is what you want to have indexed you kind of stand for the content that you’re providing there so if you’re providing like low quality user-generated content for indexing then we’ll think well this website is about low quality content and spelling errors don’t necessarily mean that it’s low quality but obviously like it can go in all kinds of weird directions with user-generated content…” John Mueller, Google 2019

It’s evident that Google wants forum administrators to work harder on managing user-generated content, which Googlebot ‘rates’ as part of your page and your site.

In a 2015 hangout, John Mueller said to “noindex untrusted post content”, going on to suggest that posts by new posters who haven’t been in the forum before, and threads that don’t have any answers, should perhaps be noindexed by default.
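
The noindex itself is just a one-line robots meta tag in the head of the untrusted thread or post page (how you output it conditionally depends on your forum software):

<meta name="robots" content="noindex">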

A very interesting statement was “how much quality content do you have compared to low-quality content”. That indicates Google is looking at this ratio. John says to identify “which pages are high-quality, which pages are lower quality, so that the pages that do get indexed are really the high-quality ones”.

John mentions looking too at “threads that don’t have any authoritative answers“.

I think that advice is relevant for any site with lots of UGC.

Google wants you to moderate any user generated content you publish, and will rate the page on it. Moderate comments for quality and apply rel=”nofollow” or rel=”ugc” to links in comments.

On-Page; User-Generated Content Is Rated As Part of the Page in Quality Scoring

QUOTE: “when we look at a page, overall, or when a user looks at a page, we see these comments as part of the content design as well. So it is something that kind of all combines to something that users see, that our algorithms see overall as part of the content that you’re publishing.” John Mueller, Google

Bear in mind that:

QUOTE: “As a publisher, you are responsible for ensuring that all user-generated content on your site or app complies with all applicable programme policies.” Google, 2018

Google wants user-generated content on your site to be moderated and kept as high quality as the rest of your site.

They explain:

QUOTE: “Since spammy user-generated content can pollute Google search results, we recommend you actively monitor and remove this type of spam from your site.” Google, 2018

and

QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules).” John Mueller, Google 2016

and

QUOTE: “When I look at the great forums & online communities that I frequent, one thing they have in common is that they (be it the owners or the regulars) have high expectations, and are willing to take action & be vocal when new users don’t meet those expectations.” John Mueller, Google

On-Page; Moderate Comments

Some examples of spammy user-generated content include:

QUOTE: “Comment spam on blogs” Google, 2018

User-generated content (for instance, blog comments) is counted as part of the page, and these comments are taken into consideration when Google rates the page.

QUOTE: “For this reason there are many ways of securing your application and dis-incentivising spammers. 

  • Disallow anonymous posting.
  • Use CAPTCHAs and other methods to prevent automated comment spamming.
  • Turn on comment moderation.
  • Use the “nofollow” attribute for links in the comment field.
  • Disallow hyperlinks in comments.
  • Block comment pages using robots.txt or meta tags.” Google

On-Page; Moderate Forums

QUOTE: “common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google

If you have a forum plugin on your site, moderate:

QUOTE: “Spammy posts on forum threads” Google, 2018

On-Page; Moderate ANY User Generated Content

You are responsible for what you publish.

No matter how you let others post something on your website, you must ensure a high standard:

QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules). When I look at the great forums & online communities that I frequent, one thing they have in common is that they (be it the owners or the regulars) have high expectations, and are willing to take action & be vocal when new users don’t meet those expectations.” John Mueller, Google 2016

and

QUOTE: “… it also includes things like the comments, includes the things like the unique and original content that you’re putting out on your site that is being added through user-generated content, all of that as well. So while I don’t really know exactly what our algorithms are looking at specifically with regards to your website, it’s something where sometimes you go through the articles and say well there is some useful information in this article that you’re sharing here, but there’s just lots of other stuff happening on the bottom of these blog posts. When our algorithms look at these pages, in an aggregated way across the whole page, then that’s something where they might say well, this is a lot of content that is unique to this page, but it’s not really high quality content that we want to promote in a very visible way. That’s something where I could imagine that maybe there’s something you could do, otherwise it’s really tricky I guess to look at specific changes you can do when it comes to our quality algorithms.” John Mueller, Google 2016

and

QUOTE: “Well, I think you need to look at the pages in an overall way, you should look at the pages and say, actually we see this a lot in the forums for example, people will say “my text is unique, you can copy and paste it and it’s unique to my website.” But that doesn’t make this website page a high quality page. So things like the overall design, how it comes across, how it looks like an authority, this information that is in general to webpage, to website, that’s things that all come together. But also things like comments where webmasters might say “this is user generated content, I’m not responsible for what people are posting on my website,”  John Mueller, Google 2016

and

QUOTE: “If you have comments on your site, and you just let them run wild, you don’t moderate them, they’re filled with spammers or with people who are kind of just abusing each other for no good reason, then that’s something that might kind of pull down the overall quality of your website where users when they go to those pages might say, well, there’s some good content on top here, but this whole bottom part of the page, this is really trash. I don’t want to be involved with the website that actively encourages this kind of behavior or that actively promotes this kind of content. And that’s something where we might see that on a site level, as well.” John Mueller, Google 2016

and

QUOTE: “When our quality algorithms go to your website, and they see that there’s some good content here on this page, but there’s some really bad or kind of low quality content on the bottom part of the page, then we kind of have to make a judgment call on these pages themselves and say, well, some good, some bad. Is this overwhelmingly bad? Is this overwhelmingly good? Where do we draw the line?” John Mueller, Google 2016

There’s a key insight for many webmasters managing UGC.

Watch out for those who want to use your blog for financial purposes, both in terms of adding content to the site and linking to other sites.

QUOTE: “Think about whether or not this is a link that would be on your site if it weren’t for your actions…When it comes to guest blogging it’s a situation where you are placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a link building point of view. It can make sense to guest blog on other people’s sites to drive some traffic to your site… but you should use a nofollow.” John Mueller, Google 2013

On-Page; You Must Moderate User-Generated Content If You Display Google Adsense

QUOTE: “Consider where user-generated content might appear on your site or app, and what risks to your site or app’s reputation might occur from malicious user-generated content. Ensure that you mitigate those risks before enabling user-generated content to appear. Set aside some time to regularly review your top pages with user-generated content. Make sure that what you see complies with all our programme policies.” Google Adsense Policies, 2018

and

QUOTE: “If a post hasn’t been reviewed yet and approved, allow it to appear, but disable ad serving on that page. Only enable ad serving when you’re sure that a post complies with our programme policies.” Google Adsense Policies, 2018

and

QUOTE: “And we do that across the whole website to kind of figure out where we see the quality of this website. And that’s something that could definitely be affecting your website overall in the search results. So if you really work to make sure that these comments are really high quality content, that they bring value, engagement into your pages, then that’s fantastic. That’s something that I think you should definitely make it so that search engines can pick that up on.”  John Mueller, Google 2016

You can also get a Google manual action penalty for user-generated spam on your site, which can affect parts of the site or the whole site.

QUOTE: “Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles. As a result, Google has applied a manual spam action to your site.” Google Notice of Manual Action

Off-Site; Unnatural Links

QUOTE: “There are 2 main types of unnatural links. Links you made yourself, and the links you didn’t. If you didn’t make any of the links then you probably don’t need to worry about manual actions for unnatural links, and you very probably don’t need to ever worry about the disavow tool. If you have a manual action for “a pattern of unnatural artificial, deceptive, or manipulative links pointing to pages” then you will very probably need to use the disavow tool to clean up the links you commissioned or made yourself.” Shaun Anderson, Hobo 2020

Off-Site; Link Building

QUOTE: “Link building is the process of earning links on other websites. Earned natural links directly improve the reputation of a website and where it ranks in Google. Self-made links are unnatural.  Google penalises unnatural links.” Shaun Anderson, Hobo 2020

Off-Site; Social Media

QUOTE: “While social media links themselves hold little value to Google because they are ‘nofollowed’, the buzz made possible by social media can lead to organic links and other positive ranking signals …” Shaun Anderson, Hobo 2020

What To Avoid

Google has a VERY basic organic search engine optimisation starter guide pdf for webmasters, which they use internally:

QUOTE: “Although this guide won’t tell you any secrets that’ll automatically rank your site first for queries in Google (sorry!), following the best practices outlined below will make it easier for search engines to both crawl and index your content.” Google, 2008

It is still worth a read, even if it is VERY basic; it covers best practices for your site.

No search engine will EVER tell you what actual keywords to put on your site to improve individual rankings or get more converting organic traffic – and in Google – that’s the SINGLE MOST IMPORTANT thing you want to know!

Google specifically mentions something very important in the guide:

QUOTE: “Creating compelling and useful content will likely influence your website more than any of the other factors discussed here.” Google SEO Starter Guide, 2017

This starter guide is still very useful for beginners.

I do not think there is much in the 2017 guide that will be new to you if you have been working online since the original starter guide was first published (2008) and its first update was announced (2010). It still leaves out some of the more complicated technical recommendations for larger sites.

I usually find it useful to keep an eye on what Google tells you to avoid in such documents, which are:

  1. AVOID: “Don’t let your internal search result pages be crawled by Google. Users dislike clicking a search engine result only to land on another search result page on your site.”

  2. AVOID: “Allowing URLs created as a result of proxy services to be crawled.”

  3. AVOID: “Choosing a title that has no relation to the content on the page.”

  4. AVOID: “Using default or vague titles like “Untitled” or “New Page 1″.”

  5. AVOID: “Using a single title across all of your site’s pages or a large group of pages.”

  6. AVOID: “Using extremely lengthy titles that are unhelpful to users.”

  7. AVOID: “Stuffing unneeded keywords in your title tags.”

  8. AVOID: “Writing a description meta tag that has no relation to the content on the page.”

  9. AVOID: “Using generic descriptions like “This is a web page” or “Page about baseball cards”.”

  10. AVOID: “Filling the description with only keywords.”

  11. AVOID: “Copying and pasting the entire content of the document into the description meta tag.”

  12. AVOID: “Using a single description meta tag across all of your site’s pages or a large group of pages.”

  13. AVOID: “Placing text in heading tags that wouldn’t be helpful in defining the structure of the page.”

  14. AVOID: “Using heading tags where other tags like <em> and <strong> may be more appropriate.”

  15. AVOID: “Erratically moving from one heading tag size to another.”

  16. AVOID: “Excessive use of heading tags on a page.”

  17. AVOID: “Very long headings.”

  18. AVOID: “Using heading tags only for styling text and not presenting structure.”

  19. AVOID: “Using invalid ‘Structured Data’ markup.”

  20. AVOID: “Changing the source code of your site when you are unsure about implementing markup.”

  21. AVOID: “Adding markup data which is not visible to users.”

  22. AVOID: “Creating fake reviews or adding irrelevant markups.”

  23. AVOID: “Creating complex webs of navigation links, for example, linking every page on your site to every other page.”

  24. AVOID: “Going overboard with slicing and dicing your content (so that it takes twenty clicks to reach from the homepage).”

  25. AVOID: “Having a navigation based entirely on images, or animations.”

  26. AVOID: “Requiring script or plugin-based event-handling for navigation”

  27. AVOID: “Letting your navigational page become out of date with broken links.”

  28. AVOID: “Creating a navigational page that simply lists pages without organizing them, for example by subject.”

  29. AVOID: “Allowing your 404 pages to be indexed in search engines (make sure that your web server is configured to give a 404 HTTP status code or – in the case of JavaScript-based sites – include a noindex robots meta-tag when non-existent pages are requested).”

  30. AVOID: “Blocking 404 pages from being crawled through the robots.txt file.”

  31. AVOID: “Providing only a vague message like “Not found”, “404”, or no 404 page at all.”

  32. AVOID: “Using a design for your 404 pages that isn’t consistent with the rest of your site.”

  33. AVOID: “Using lengthy URLs with unnecessary parameters and session IDs.”

  34. AVOID: “Choosing generic page names like “page1.html”.”

  35. AVOID: “Using excessive keywords like “baseball-cards-baseball-cards-baseballcards.htm”.”

  36. AVOID: “Having deep nesting of subdirectories like “…/dir1/dir2/dir3/dir4/dir5/dir6/page.html”.”

  37. AVOID: “Using directory names that have no relation to the content in them.”

  38. AVOID: “Having pages from subdomains and the root directory access the same content, for example, “domain.com/page.html” and “sub.domain.com/page.html”.”

  39. AVOID: “Writing sloppy text with many spelling and grammatical mistakes.”

  40. AVOID: “Awkward or poorly written content.”

  41. AVOID: “Embedding text in images and videos for textual content: users may want to copy and paste the text and search engines can’t read it.”

  42. AVOID: “Dumping large amounts of text on varying topics onto a page without paragraph, subheading, or layout separation.”

  43. AVOID: “Rehashing (or even copying) existing content that will bring little extra value to users.”

  44. AVOID: “Having duplicate or near-duplicate versions of your content across your site.”

  45. AVOID: “Inserting numerous unnecessary keywords aimed at search engines but are annoying or nonsensical to users.”

  46. AVOID: “Having blocks of text like “frequent misspellings used to reach this page” that add little value for users.”

  47. AVOID: “Deceptively hiding text from users, but displaying it to search engines.”

  48. AVOID: “Writing generic anchor text like “page”, “article”, or “click here”.”

  49. AVOID: “Using text that is off-topic or has no relation to the content of the page linked to.”

  50. AVOID: “Using the page’s URL as the anchor text in most cases, although there are certainly legitimate uses of this, such as promoting or referencing a new website’s address.”

  51. AVOID: “Writing long anchor text, such as a lengthy sentence or short paragraph of text.”

  52. AVOID: “Using CSS or text styling that make links look just like regular text.”

  53. AVOID: “Using excessively keyword-filled or lengthy anchor text just for search engines.”

  54. AVOID: “Creating unnecessary links that don’t help with the user’s navigation of the site.”

  55. AVOID: “Using generic filenames like “image1.jpg”, “pic.gif”, “1.jpg” when possible—if your site has thousands of images you might want to consider automating the naming of the images.”

  56. AVOID: “Writing extremely lengthy filenames.”

  57. AVOID: “Stuffing keywords into alt text or copying and pasting entire sentences.”

  58. AVOID: “Writing excessively long alt text that would be considered spammy.”

  59. AVOID: “Using only image links for your site’s navigation”.

  60. AVOID: “Attempting to promote each new, small piece of content you create; go for big, interesting items.”

  61. AVOID: “Involving your site in schemes where your content is artificially promoted to the top of these services.”

  62. AVOID: “Spamming link requests out to all sites related to your topic area.”

  63. AVOID: “Purchasing links from another site with the aim of getting PageRank”

This is straightforward stuff, but it’s the simple stuff that often gets overlooked. Of course, you combine the above together with the technical recommendations in Google guidelines for webmasters.

Don’t make these simple but dangerous mistakes…..

  1. Avoid duplicating content on your site found on other sites. Yes, Google likes content, but it *usually* needs to be well linked to, unique and original to get you to the top!
  2. Don’t hide text on your website. Google may eventually remove you from the SERPs.
  3. Don’t buy 1000 links and think “that will get me to the top!”. Google likes natural link growth and often frowns on mass link buying.
  4. Don’t get everybody to link to you using the same “anchor text” or link phrase. This could flag you as a ‘rank modifier’. You don’t want that.
  5. Don’t chase Google PR by chasing hundreds of links. Think quality of links… not quantity.
  6. Don’t buy many keyword rich domains, fill them with similar content and link them to your site. This is lazy and dangerous and could see you ignored or worse banned from Google. It might have worked yesterday but it sure does not work today without some grief from Google.
  7. Do not constantly change your site pages names or site navigation without remembering to employ redirects. This just screws you up in any search engine.
  8. Do not build a site with a JavaScript navigation that Google, Yahoo and Bing cannot crawl.
  9. Do not link to everybody who asks you for reciprocal links. Only link out to quality sites you feel can be trusted.
  10. Do not spam visitors with ads and pop-ups.

Don’t Flag Your Site With Spam

A primary goal of any ‘rank modification’ is not to flag your site as ‘suspicious’ to Google’s algorithms or their webspam team.

I would recommend you forget about tricks like links in H1 tags etc. or linking to the same page 3 times with different anchor text on one page.

Forget about ‘which is best’ when considering things you shouldn’t be wasting your time with.

Every element on a page is a benefit to you until you spam it.

Put a keyword in every tag and you may flag your site as ‘trying too hard’ if you haven’t got the link trust to cut it – and Google’s algorithms will go to work.

Spamming Google is often counter-productive over the long term.

So:

  • Don’t spam your anchor text link titles with the same keyword.
  • Don’t spam your ALT Tags or any other tags either.
  • Add your keywords intelligently.
  • Try and make the site mostly for humans, not just search engines.

On-page optimisation is no longer a simple checklist of keyword here, keyword there. We are up against lots of smart folk at the Googleplex – and they purposely make this practice difficult.

For those who need a checklist, this is the sort of one that gets me results:

  1. Do keyword research
  2. Identify valuable searcher intent opportunities
  3. Identify the audience & the reason for your page
  4. Write utilitarian copy – be useful. Use related terms in your content. Use plurals and synonyms. Use words with searcher intent like “buy”, “compare”, “hire” etc. I prefer to get a keyword or related term in almost every paragraph.
  5. Use emphasis sparingly to emphasise the important points in the page, whether they are your keywords or not
  6. Pick an intelligent Page Title with your keyword in it
  7. Write an intelligent meta description, repeating it on the page (both are sketched in the example after this list)
  8. Add an image with user-centric ALT attribute text
  9. Link to related pages on your site within the text
  10. Link to related pages on other sites
  11. Your page should have a simple search engine friendly URL
  12. Keep it simple
  13. Don’t let ads get in the way
  14. Share it and pimp it
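
To make items 6 and 7 concrete, here is a minimal sketch of the relevant head elements (the business and phrasing are purely illustrative):

<head>
  <title>Hand-Made Oak Dining Tables | Example Furniture Co</title>
  <meta name="description" content="Solid oak dining tables, hand-made to order. Compare sizes and finishes, or commission a custom design.">
</head>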

You can forget about just about everything else.

How Google Treats Subdomains: “We… treat that more as a single website”

John Mueller was asked if it is ok to interlink sub-domains, and the answer is yes, usually, because Google will treat subdomains on a website (primarily I think we can presume) as “a single website“:

QUOTE: “So if you have different parts of your website and they’re on different subdomains that’s that’s perfectly fine that’s totally up to you and the way people link across these different subdomains is really up to you I guess one of the tricky aspects there is that we try to figure out what belongs to a website and to treat that more as a single website and sometimes things on separate subdomains are like a single website and sometimes they’re more like separate websites for example on on blogger all of the subdomains are essentially completely separate websites they’re not related to each other on the other hand other websites might have different subdomains and they just use them for different parts of the same thing so maybe for different country versions maybe for different language versions all of that is completely normal.” John Mueller 2017

That is important if Google has a website quality rating score (or multiple scores) for websites.

Personally I prefer subdirectories rather than subdomains if given the choice, unless it really makes sense to house particular content on a subdomain, rather than the main site (as in the examples John mentions).

Back in the day when Google Panda was released, many tried to run from Google’s content quality algorithms by moving ‘penalised’ pages and content to subfolders. I thought that was a temporary solution.

Over the long term, you can, I think, expect Google to treat subdomains on most common-use websites as one entity – if that is what they are – and rank them accordingly in terms of content quality and user satisfaction.

Should I Choose a Subfolder or Subdomain?

If you have the choice, I would choose to house content on a subfolder on the main domain. Recent research would still indicate this is the best way to go:

QUOTE: “When you move from Subdomain to Subdirectory, you rank much better, and you get more organic traffic from Google.” Sistrix, 2018

Subfolders or subdomains for Google SEO? Observations indicate subfolders.

A Continual Evolution

QUOTE: “One way to think of how a core update operates is to imagine you made a list of the top 100 movies in 2015. A few years later in 2019, you refresh the list. It’s going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before. The list will change, and films previously higher on the list that move down aren’t bad. There are simply more deserving films that are coming before them.” Danny Sullivan, Google 2020

The ‘Keyword Not Provided‘ incident (2011), the Google Penguin Update (2012), and the Google Panda Update (2011) are just some examples of Google making ranking in organic listings HARDER – a change for ‘users’ that seems to have the most impact on ‘marketers’ outside of Google’s ecosystem.

A more recent example of ever-changing search would be the mobile-first index and the Core Web Vitals announcement:

QUOTE: “We will introduce a new signal that combines Core Web Vitals with our existing signals for page experience to provide a holistic picture of the quality of a user’s experience on a web page.” Sowmya Subramanian, Google 2020

Now, we need to be topic-focused (abstract, I know), instead of just keyword focused when optimising a web page for Google. There are now plenty of third-party tools that help when researching keywords but most of us miss the kind of keyword intelligence we used to have access to.

The first step:

QUOTE: “is really to determine what it is you’re actually optimising for. This means identifying the terms people are searching for (also known as “keywords”) that you want your website to rank for in search engines like Google.” Wordstream, 2015

Proper keyword research is important because getting a site to the top of Google eventually comes down to your text content on a page and keywords in external & internal links. Altogether, Google uses these signals to determine where you rank, if you rank at all.

There’s no magic bullet to this.

QUOTE: “I don’t think there’s one magic trick that that I can offer you that will make sure that your website stays relevant in the ever-changing world of the web so that’s something where you’ll kind of have to monitor that on your side and work out what makes sense for your site or your users or your business.” John Mueller, Google 2019

At any one time, your site is probably feeling the influence of some algorithmic filter (for example, Google Panda or Google Penguin) designed to keep spam sites under control and deliver relevant, high-quality results to human visitors.

QUOTE: “Here’s how it works: Google (or any search engine you’re using) has a crawler that goes out and gathers information about all the content they can find on the Internet. The crawlers bring all those 1s and 0s back to the search engine to build an index. That index is then fed through an algorithm that tries to match all that data with your query.” Moz, 2020

One filter may be kicking in keeping a page down in the SERPs while another filter is pushing another page up. You might have poor content but excellent incoming links, or vice versa. You might have very good content, but a very poor technical organisation of it. You might have too many ads in the wrong places.

You must identify the reasons Google doesn’t ‘rate’ a particular page higher than competing pages.

The answer is either on the page, on the site or in backlinks pointing to the page.

Ask yourself:

  • Do you have too few quality inbound links?
  • Do you have too many low-quality backlinks?
  • Does your page lack descriptive keyword rich text?
  • Are you keyword stuffing your text?
  • Do you link out to unrelated sites?
  • Do you have too many advertisements above the fold?
  • Do you have affiliate links on every page of your site and text found on other websites?
  • Do you have broken links and missing images on the page?
  • Does your page meet quality guidelines, legal and advertising standards?
  • Do ads interrupt the enjoyment of your main content?

Whatever they are, identify issues and fix them.

Get on the wrong side of Google and your site might well be selected for MANUAL review – so optimise your site as if, one day, you will get that website review from a Google Web Spam reviewer.

QUOTE: “Put useful content on your page and keep it up to date.” Google Webmaster Guidelines, 2020

The key to a successful campaign, I think, is persuading Google that your page is most relevant to any given search query. You do this with good, unique, keyword-rich text content and by getting “quality” links to that page.

The latter is far easier to say these days than actually do!

Next time you are developing a page, consider that what looks spammy to you is probably spammy to Google. Ask yourself which pages on your site are really necessary. Which links are necessary? Which pages on the site are emphasised in the site architecture? Which pages would you ignore?

You can help a site along in any number of ways (including making sure your page titles and meta tags are unique) but be careful. Obvious evidence of ‘rank modifying’ is dangerous.

I prefer simple techniques and ones that can be measured in some way. I have never just wanted to rank for competitive terms; I have always wanted to understand at least some of the reasons why a page ranked for these key phrases. I try to create a good user experience for humans AND search engines. If you make high-quality text content relevant and suitable for both these audiences, you’ll more than likely find success in organic listings and you might not ever need to get into the technical side of things, like redirects and search engine friendly URLs.

To beat the competition in an industry where it’s difficult to attract quality links, you have to get more “technical” sometimes – and in some industries – you’ve traditionally needed to be 100% black hat to even get in the top 100 results of competitive, transactional searches.

There are no hard and fast rules to long-term ranking success, other than developing quality websites with quality content and quality links pointing to it. The less domain authority you have, the more text you’re going to need. The aim is to build a satisfying website and build real authority!

You need to mix it up and learn from experience. Make mistakes and learn from them by observation. I’ve found getting penalised is a very good way to learn what not to do.

Remember there are exceptions to nearly every rule, and in an ever-fluctuating landscape you probably have little chance of determining exactly why you rank in search engines these days. I’ve been doing it for over 15 years and every day I’m trying to better understand Google, to learn more and learn from others’ experiences.

It’s important not to obsess about granular ranking specifics that have little return on your investment unless you really have the time to do so! THERE IS USUALLY SOMETHING MORE VALUABLE TO SPEND THAT TIME ON.

That’s usually either good backlinks or great content.

QUOTE: “So there’s three things that you really want to do well if you want to be the world’s best search engine you want to crawl the web comprehensively and deeply you want to index those pages and then you want to rank or serve those pages and return the most relevant ones first….. we basically take PageRank as the primary determinant and the more PageRank you have that is the more people who link to you and the more reputable those people are the more likely it is we’re going to discover your page…. we use page rank as well as over 200 other factors in our rankings to try to say okay maybe this document is really authoritative it has a lot of reputation because it has a lot of PageRank … and that’s kind of the secret sauce trying to figure out a way to combine those 200 different ranking signals in order to find the most relevant document.” Matt Cutts, Google 2012

The fundamentals have not changed much over the years.

Google isn’t lying about rewarding legitimate effort – despite what some claim. If they were, I would be a black hat full time. So would everybody else trying to rank in Google.

It is much more complicated in some niches today than it was ten years ago.

The majority of small to medium businesses do not need advanced strategies because their direct competition has not employed these tactics either.

You are better off doing simple stuff better and faster than worrying about some of the more ‘advanced’ techniques you read on some blogs I think – it’s more productive, cost-effective for businesses and safer, for most.

Beware Pseudoscience In The Industry 

QUOTE: “Pseudoscience consists of statements, beliefs, or practices that are claimed to be both scientific and factual but are incompatible with the scientific method….” Wikipedia, 2020

Beware folk trying to bamboozle you with science.

This isn’t a science when Google controls the ‘laws’ and changes them at will.

QUOTE: “They follow the forms you gather data you do so and so and so forth but they don’t get any laws they don’t haven’t found out anything they haven’t got anywhere yet maybe someday they will but it’s not very well developed but what happens is an even more mundane level we get experts on everything that sound like this sort of scientific expert they they’re not scientist is a typewriter and they make up something.”  Richard Feynman, Physicist 1981

I get results by:

  • improving page experience signals
  • analysing Google rankings
  • performing Keyword research
  • making observations about ranking performance of your pages and that of others (though not in a controlled environment)
  • placing relevant, co-occurring words you want to rank for on pages
  • using synonyms
  • using words in anchor text in links on relevant pages and pointing them at relevant pages you want to rank high for the keyword in the anchor text
  • understanding that the keywords featured in your Page Title Element are what that page is going to rank best for
  • getting high-quality links from other trustworthy websites
  • publishing lots and lots of new, higher-quality content
  • focusing on the long tail of keyword searches
  • understanding it will take time to beat all this competition

I always expected to get a site demoted, by:

  • getting too many links with the same anchor text pointing to a page
  • keyword stuffing a page
  • trying to manipulate Google too much on a site
  • creating a “frustrating user experience.”
  • chasing the algorithm too much
  • getting links I shouldn’t have
  • buying links

Not that any of the above is automatically penalised all the time.

I was always of the mind I don’t need to understand the maths or science of Google, that much, to understand what Google engineers want.

The biggest challenge these days is to get trusted sites to link to you, but the rewards are worth it.

To do it, you probably should be investing in some marketable content, or compelling benefits for the linking party (that’s not just paying for links somebody else can pay more for). Buying links to improve rankings WORKS but it is probably THE most hated link building technique as far as the Google web spam team is concerned.

I was very curious about the science and I studied what I could, but it left me a little unsatisfied. I learned that building links, creating lots of decent content and learning how to monetise that content better (while not breaking any major TOS of Google) would have been a more worthwhile use of my time.

Getting better and faster at doing all that would be nice too.

There are many problems with blogs, too, including mine.

Misinformation is an obvious one. Rarely are your results conclusive or your observations 100% accurate, even if you think a theory holds water on some level. I try to update old posts with new information if I think the page is only valuable with accurate data.

Just remember most of what you read about how Google works from a third party is OPINION and just like in every other sphere of knowledge, ‘facts’ can change with a greater understanding over time or with a different perspective.

QUOTE: “Myths to be careful about..: 1. The NLP used by Google is not Neuro-linguistic Processing. 2. Semantic Search at Google is not powered by Latent Semantic Indexing 3. You cannot optimise pages for Hummingbird, RankBrain, or BERT.” Bill Slawski, 2019

Don’t Chase The Algorithm

QUOTE: “Google’s algorithm is constantly being improved; rather than trying to guess the algorithm and design your page for that, work on creating good, fresh content that users want, and following our guidelines.” Google Webmaster Guidelines, 2020

There is no magic bullet and there are no secret formulas to achieve fast number 1 ranking in Google in any competitive niche WITHOUT spamming Google.

A legitimately earned high position in search engines takes a lot of hard work.

There are a few less talked about tricks and tactics that are deployed by some (better than others) to combat algorithm changes, for instance, but there are no big secrets.

There are clever strategies, though, and creative solutions to be found to exploit opportunities uncovered by researching the niche.

Note that when Google recognises a new strategy that gets results, the strategy itself usually becomes ‘webspam’ and something you can be penalised for, so I would beware of jumping on the latest fad.

The biggest advantage any one provider has over another is experience and resource. The knowledge of what doesn’t work and what will hurt your site is often more valuable than knowing what will give you a short-lived boost. Getting to the top of Google is a relatively simple process, but one that is constantly changing. It is more a collection of skills, methods and techniques – a way of doing things – than a one-size-fits-all magic trick.

After over a decade practising and deploying real campaigns, I’m still trying to get it down to its simplest, most cost-effective processes.

I think it’s about doing simple stuff right.

Good text, simple navigation structure, quality links. To be relevant and reputable takes time, effort and luck, just like anything else in the real world, and that is the way Google wants it.

If a company is promising you guaranteed rankings and has a magic bullet strategy, watch out.

I’d check it didn’t contravene Google’s guidelines.

Keep It Simple, Stupid

Don’t Build Your Site With Flash, HTML Frames or any other deprecated technology. Open web standards are the way forward.

QUOTE: “Don’t use frames to design your website” Shaun Anderson, Hobo 2020

Flash is a proprietary plug-in created by Macromedia/Adobe to infuse (admittedly) fantastically rich media into your websites. The W3C advises you to avoid the use of such proprietary technology to construct an entire site. Instead, build your site with CSS and HTML, ensuring everyone, including search engine robots, can sample your website content. Then, if required, you can embed media files such as Flash in the HTML of your website.

QUOTE: “Chrome will continue phasing out Flash over the next few years, first by asking for your permission to run Flash in more situations, and eventually disabling it by default. We will remove Flash completely from Chrome toward the end of 2020.” Google Chrome, 2017

Flash, in the hands of an inexperienced designer, can cause all types of problems at the moment, especially with:

  • Accessibility
  • Search Engines
  • Users not having the Plug-In
  • Large Download Times

Flash doesn’t even work at all on some devices, like the Apple iPhone. Note that Google sometimes highlights if your site is not mobile friendly on some devices. And on the subject of mobile-friendly websites – note that Google has alerted the webmaster community that mobile friendliness will be a search engine ranking factor.

HTML5 is the preferred option over Flash these days, for most designers. A site built entirely in Flash will cause an unsatisfactory user experience and will affect your rankings, especially in mobile search results. For similar accessibility and user satisfaction reasons, I would also say don’t build a site with website frames.

As in any form of design, don’t try and re-invent the wheel when simple solutions suffice. The KISS philosophy has been around since the dawn of design.

KISS does not mean boring web pages. You can create stunning sites with smashing graphics – but you should build these sites using simple techniques – HTML & CSS, for instance. If you are new to web design, avoid things like Flash and JavaScript, especially for elements like scrolling news tickers, etc. These elements work fine for TV – but only cause problems for website visitors.

Keep layouts and navigation arrays consistent and simple too. Don’t spend time, effort and money (especially if you work in a professional environment) designing fancy navigation menus if, for example, your new website is an information site.

Same with website optimisation – keep your documents well structured and keep your page Title Elements and text content relevant, use Headings tags sensibly and try and avoid leaving too much of a footprint – whatever you are up to.

How Long Does It Take To See Results from SEO?

We have been told it:

QUOTE: “will need four months to a year to help your business first implement improvements and then see potential benefit.” Maile Ohye, Google 2017

Some results can be gained within weeks, but you need to expect some strategies to take months to see the benefit. Google WANTS these efforts to take time. Critics of the search engine giant would point to Google wanting fast, effective rankings to be a feature of Google’s own Adwords sponsored listings.

If you are recovering from previous low-quality activity, it is going to take much longer to see the benefits.

QUOTE: “Even if you make big changes with the design and the functionality, and you add new features and things, I would definitely expect that to take multiple months, maybe half a year, maybe longer, for that to be reflected in search because it is something that really needs to be re-evaluated by the systems overall.. Low-quality pages tend to be recrawled much less frequently by Googlebot – it is not unusual for six months to go by between Googlebot crawls on low quality pages or sections of a site. So it can take an incredibly long time for Google to recrawl and then reevaluate those pages for ranking purposes.” John Mueller, Google, 2020

SEO is not a quick process, and a successful campaign can be judged over months, if not years. Most techniques that inflate rankings successfully end up finding their way into Google Webmaster Guidelines – so be wary.

It takes time to build quality, and it’s this quality that Google aims to reward, especially, we are told, during Core Updates:

QUOTE: “There’s nothing wrong with pages that may perform less well in a core update. They haven’t violated our webmaster guidelines nor been subjected to a manual or algorithmic action, as can happen to pages that do violate those guidelines. In fact, there’s nothing in a core update that targets specific pages or sites. Instead, the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.”  Danny Sullivan, Google 2020

It takes time to generate the data needed to begin to formulate a campaign, and time to deploy that campaign. Progress also depends on many factors:

  • How old is your site compared to the top 10 sites?
  • How many back-links do you have compared to them?
  • How does the quality of their back-links compare to yours?
  • What is the history of people linking to you (what words have people been using to link to your site)?
  • How good of a resource is your site?
  • Can your site attract natural back-links (e.g. you have good content or a great service) or are you 100% relying on your agency for back-links (which is very risky)?
  • How much unique content do you have?
  • Do you have to pay everyone to link to you (which is risky), or do you have a “natural” reason people might link to you?

Google wants to return quality pages in its organic listings, and it takes time to build this quality and for that quality to be recognised.

It takes time too to balance your content, generate quality backlinks and manage your disavowed links.

Google knows how valuable organic traffic is – and they want webmasters investing a LOT of effort in ranking pages.

Critics will point out the higher the cost, the more cost-effective Adwords becomes, but Adwords will only get more expensive, too. At some point, if you want to compete online, you’re going to HAVE to build a quality website, with a unique offering to satisfy returning visitors – the sooner you start, the sooner you’ll start to see results.

If you start NOW and are determined to build an online brand, a website rich in content with a satisfying user experience – Google will reward you in organic listings.

The truth is, it’s bound to take maybe a year or two to achieve a dominant position in a very competitive niche. That’s also assuming you are fixing quality issues, improving content quality and improving page experience from the get-go.

ROI

SEO is a marketing activity just like any other and there are no guarantees of success in any, for what should be obvious reasons. There are no guarantees in Google Adwords either, except that costs to compete will go up, of course.

That’s why it is so attractive – but like all marketing – it is still a gamble.

At the moment, I don’t know you, your business, your website, your resources, your competition or your product. Even with all that knowledge, calculating ROI is extremely difficult because ultimately Google decides on who ranks where in its results – sometimes that’s ranking better sites, and sometimes (often) it is ranking sites breaking the rules above yours.

Nothing is absolute in search marketing.

There are no guarantees – despite claims from some companies. What you make from this investment is dependent on many things, not least, how suited your website is to convert visitors into sales.

Every site is different.

Big Brand campaigns are far, far different from small business campaigns that don’t have any links to begin with – to give you but one example.

It’s certainly easier if the brand in question has a lot of domain authority just waiting to be unlocked – but of course, that’s a generalisation, as big brands have big brand competition too.

It depends entirely on the quality of the site in question and the level and quality of the competition, but smaller businesses should probably look to own their niche, even if limited to their location, at first.

Local is always a good place to start for small businesses.

A Real Google Friendly Website

At one time a Google-friendly website meant a website built so Googlebot could scrape it correctly and rank it accordingly.

When I think ‘Google-friendly’ these days – I think a website Google will rank top, if popular and accessible enough, and won’t drop like a f*&^ing stone for no apparent reason one day, even though I followed the Google starter guide to the letter… just because Google has found something it doesn’t like – or has classified my site as undesirable one day.

It is not JUST about original content anymore – it’s about the function your site provides to Google’s visitors – and it’s about your commercial intent.

I am building sites at the moment with the following in mind…

  1. Don’t be a website Google won’t rank – What Google classifies your site as is perhaps the NUMBER 1 Google ranking factor not often talked about – whether Google determines this algorithmically or, eventually, manually. That is – whether it is a MERCHANT, an AFFILIATE, a RESOURCE or DOORWAY PAGE, SPAM, or VITAL to a particular search – what do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now? Or just the same? Ask why Google should bother ranking your website if it is just the same, rather than why it would not rank it because it is just the same. How can you make yours different? Better.
  2. Think that, one day, your website will have to pass a manual review by ‘Google’ – the better rankings you get, or the more traffic you get, the more likely you are to be reviewed. Know that Google, according to leaked documents, classes even some useful sites as spammy. If you want a site to rank high in Google – it better ‘do’ something other than exist only to link to another site for a paid commission. Know that to succeed, your website needs to be USEFUL to a visitor that Google will send you – and a useful website is not just a website with a sole commercial intent of sending a visitor from Google to another site – or a ‘thin affiliate’, as Google CLASSIFIES it.
  3. Think about how Google can algorithmically and manually determine the commercial intent of your website – think about the signals that differentiate a real small business website from a website created JUST to send visitors to another website. Affiliate links on every page, for instance, or adverts above the fold, can be a clear indicator of a webmaster’s particular commercial intent – hence why Google has a Top Heavy Algorithm.
  4. Google is NOT going to thank you for publishing lots of similar articles and near-duplicate content on your site – so EXPECT to have to create original content for every page you want to perform in Google, or at least, not publish content found on other sites.
  5. Ensure Google knows your website is the origin of any content you produce (typically by simply pinging Google via an XML sitemap or RSS feed – see the first sketch after this list).
  6. Understand and accept why Google ranks your competition above you – they are either more relevant and more popular, more relevant and more reputable, offer a better user experience, or are manipulating backlinks better than you. Understand that everyone at the top of Google falls into those categories and formulate your own strategy to compete – relying on Google to take action on your behalf is VERY probably not going to happen.
  7. Being ‘relevant’ comes down to keywords & key phrases – in domain names, URLs, Title Elements, the number of times they are repeated in text on the page, text in image alt tags, rich markup and importantly in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you’ll probably trigger spam filters. If it is ‘hidden’ in on-page elements – beware relying on it too much to improve your rankings.
  8. The basics haven’t changed for years – though the effectiveness of particular elements has certainly narrowed or changed in type of usefulness – you should still be focusing on building a simple site using VERY simple best practices – don’t sweat the small stuff, while all the time paying attention to the important stuff – add plenty of unique PAGE TITLES and plenty of new ORIGINAL CONTENT. Understand how Google SEES your website. CRAWL it, like Google does, with (for example) the Screaming Frog SEO Spider, and fix malformed links or things that result in server errors (5xx), broken links (4xx) and unnecessary redirects (3xx). Each page you want in Google should serve a 200 OK header message (see the second sketch after this list).
  9. Use common sense – Google is a search engine – it is looking for pages to give searchers results, and 90% of its users are looking for information. Google itself WANTS the organic results full of information. Almost all websites will link to relevant information content, so content-rich websites get a lot of links – especially quality links. Google ranks websites with a lot of links (especially quality links) at the top of its search results, so the obvious thing you need to do is ADD A LOT of INFORMATIVE CONTENT TO YOUR WEBSITE.
  10. I think ranking in organic listings is a lot about trusted links making trusted pages rank, making trusted links making trusted pages rank ad nauseam for various keywords. Some pages can pass trust to another site; some pages cannot. Some links can. Some cannot. Some links are trusted enough to pass ranking signals to another page. Some are not. YOU NEED LINKS FROM TRUSTED PAGES IF YOU WANT TO RANK AND AVOID PENALTIES & FILTERS.
  11. Google engineers are building an AI – but it’s all based on simple human desires to make something happen or indeed to prevent something. You can work with Google engineers or against them. Engineers need to make money for Google but unfortunately for them, they need to make the best search engine in the world for us humans as part of the deal. Build a site that takes advantage of this. What is a Google engineer trying to do with an algorithm? I always remember it was an idea first before it was an algorithm. What was that idea? Think “like” a Google search engineer when making a website and give Google what it wants. What is Google trying to give its users? Align with that. What does Google not want to give its users? Don’t look anything like that. THINK LIKE A GOOGLE ENGINEER & BUILD A SITE THEY WANT TO GIVE TOP RANKINGS.
  12. Google is a link-based search engine. Google doesn’t need content to rank pages but it needs content to give to users. Google needs to find content and it finds content by following links, just like you do when clicking on a link. So you first need to tell the world about your site so other sites link to yours. Don’t worry about reciprocating to more powerful sites or even real sites – I think this adds to your domain authority – which is better to have than ranking for just a few narrow key terms.
  13. Google has a long memory when it comes to links and pages and associations for your site. It can forgive but won’t forget. WHAT RELATIONSHIP DO YOU WANT TO HAVE WITH GOOGLE? Onsite, don’t try to fool Google. Be squeaky clean on-site and have Google think twice about demoting you for the odd discrepancies in your link profile.
  14. Earn Google’s trust.
  15. Be fast.
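
To make item 5 concrete, here is a minimal sketch of pinging Google with an updated XML sitemap. The sitemap URL is a hypothetical placeholder, and this assumes Google’s simple GET-based sitemap ping endpoint (note that Google has since deprecated that endpoint in favour of submitting sitemaps via Search Console or robots.txt):

```python
# A minimal sketch: tell Google an XML sitemap has been updated.
# SITEMAP_URL is a hypothetical placeholder for your own sitemap.
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Google's sitemap ping endpoint took the sitemap URL as a query parameter.
ping_url = "https://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe="")

with urlopen(ping_url, timeout=10) as response:
    print(response.status)  # 200 indicated the ping was received
```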

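And to make item 8 concrete, a minimal sketch of a status-code audit, assuming the third-party requests library and hypothetical placeholder URLs – every page you want indexed should report a clean 200:

```python
# A minimal sketch: request the pages you want indexed and flag anything
# that is not a clean "200 OK". URLs are hypothetical placeholders.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/old-page/",
]

for url in urls:
    # allow_redirects=False so 3xx responses are reported, not silently followed
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code == 200:
        print(f"OK        {url}")
    elif 300 <= response.status_code < 400:
        print(f"REDIRECT  {url} -> {response.headers.get('Location')}")
    else:
        print(f"CHECK     {url} ({response.status_code})")
```
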
This is a complex topic, as I said at the beginning of this in-depth article.

Important Google Webmaster Guidelines To Know

QUOTE: “I’ve collected a somewhat ordered list of the most important official Google Webmaster Guidelines and recommendations. If you want to rank high in Google, you need to follow the webmaster guidelines.” Shaun Anderson, Hobo 2020

Submit Your Website To Google Search Console

You won’t be able to operate a legitimate business website effectively without verifying the website with Google Search Console.

Fortunately, Google has now made this simple:

QUOTE: “You can submit your website and verify it in Google Search Console. The procedure to connect your website is very simple with a little technical knowledge.” Shaun Anderson, Hobo 2020
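
One common verification method is an HTML meta tag that Search Console asks you to serve in your homepage’s head section. A minimal sketch (hypothetical URL, third-party requests library, naive pattern) to confirm the tag is actually being served:

```python
# A minimal sketch: confirm your homepage actually serves the Search Console
# verification meta tag. The URL is a hypothetical placeholder, and the regex
# naively assumes the tag's attributes appear in this order.
import re
import requests

html = requests.get("https://www.example.com/", timeout=10).text

match = re.search(
    r'<meta\s+name="google-site-verification"\s+content="([^"]+)"', html
)
print("Verification tag found:", match.group(1) if match else "none")
```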

Free Ebook

I publish all my tips in a free ebook for my subscribers:

QUOTE: “You can download my free Ebook….” Shaun Anderson, Hobo 2020

Disclaimer

Disclaimer: “Whilst I have made every effort to ensure that the information I have provided is correct, it is not advice; I cannot accept any responsibility or liability for any errors or omissions. The author does not vouch for third party sites or any third party service. Visit third party sites at your own risk. I am not directly partnered with Google or any other third party. This website uses cookies only for analytics and basic website functions. This article does not constitute legal advice. The author does not accept any liability that might arise from accessing the data presented on this site. Links to internal pages promote my own content and services.” Shaun Anderson, Hobo