Moz has a good video on the Google organic quality score theory. You should watch it. It goes into a lot of stuff I (and others) have been blogging for the last few years, and some of it is relevant to the audits I produce, an example of which you can see here (Alert! 2mb in size!).
One thing the video could have explained better is that Moz has worldwide topical authority for ‘Google SEO’ terms, which is why it can rank so easily for ‘organic quality score’.
But the explanation of the quality score is a good introduction for beginners.
I am in the camp that believes this organic quality score has been in place for a long time, and that more and more sites are feeling the results of it.
This is also quite relevant to a question answered last week in the Google Webmaster Hangout, which was:
“QUESTION – Is it possible that if the algorithm doesn’t particularly like our blog articles as much that it could affect our ranking and quality score on the core Content?”
resulting in an answer:
“ANSWER: JOHN MUELLER (GOOGLE): Theoretically, that’s possible. I mean it’s kind of like we look at your web site overall. And if there’s this big chunk of content here or this big chunk kind of important wise of your content, there that looks really iffy, then that kind of reflects across the overall picture of your website. But I don’t know in your case, if it’s really a situation that your blog is really terrible.”
Google has introduced (at least) a ‘perceived’ risk to publishing lots of lower-quality pages on your site, in an effort to curb production of old-style SEO-friendly content based on manipulating early search engine algorithms.
We are dealing with algorithms designed to target old-style SEO – which focused on the truism that DOMAIN ‘REPUTATION’ plus LOTS of PAGES equals LOTS of keywords equals LOTS of Google traffic.
A big site can’t just get away with publishing LOTS of lower-quality content in the cavalier way it used to – not without the ‘fear’ of primary content being impacted and organic search traffic to important pages on the site being throttled.
Google is very probably using user metrics in some way to determine the ‘quality’ of your site.
QUESTION – “I mean, would you recommend going back through articles that we posted and if there’s ones that we don’t necessarily think are great articles, that we just take them away and delete them?”
The reply was:
JOHN MUELLER: “I think that’s always an option. Yeah. That’s something that – I’ve seen sites do that across the board, not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.”
Deleting content is not always the optimal way to handle MANY types of low-quality content – far from it, in fact. Nuking it is the last option unless the pages really are ‘dead’ content.
Any clean-up should go hand in hand with giving Google something it is going to value on your site, e.g. NEW high-quality content.
The final piece of advice is interesting, too.
It gives us an insight into how Google might actually deal with your site:
JOHN MUELLER: “Then maybe that’s something where you can collect some metrics and say, well, everything that’s below this threshold, we’ll make a decision whether or not to significantly improve it or just get rid of it.”
You can probably rely on Google to ‘collect some metrics and say, well, everything that’s below this threshold, we’ll…’ (insert punishment, spread out over time).
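Mueller’s “collect some metrics and set a threshold” advice can be sketched very simply. The page data and threshold values below are hypothetical, invented for illustration – in practice you would export real figures from your analytics package and pick cut-offs that make sense for your site:

```python
# A minimal sketch of the "collect metrics, apply a threshold" approach.
# All URLs and numbers here are hypothetical examples, not real data.

pages = [
    # (url, monthly organic clicks, average engagement time in seconds)
    ("/guide-to-widgets", 1200, 95),
    ("/blog/old-press-release", 3, 4),
    ("/blog/thin-tag-page", 0, 2),
    ("/services", 640, 70),
]

CLICK_THRESHOLD = 10       # assumed cut-off: pages below this get reviewed
ENGAGEMENT_THRESHOLD = 10  # assumed cut-off, in seconds


def flag_low_quality(pages):
    """Return pages falling below both thresholds -- candidates to
    significantly improve or (as a last resort) remove."""
    return [
        url for url, clicks, engagement in pages
        if clicks < CLICK_THRESHOLD and engagement < ENGAGEMENT_THRESHOLD
    ]


print(flag_low_quality(pages))
# flags the old press release and the thin tag page
```

The point is not the code itself but the discipline: a regular, measurable review pass over all your content, rather than a one-off guess at what is ‘low quality’.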
Google probably has a quality score of some sort, and your site probably has a rating of whatever kind is relevant to it (and if you get any real traffic from Google, often a manual rating, too).
If you have a big site, certain parts of your site will be rated more useful than others to Google.
Improving the quality of your content certainly works to improve traffic, as does intelligently managing your content across the site. Positive results from this process are NOT going to happen overnight. I’ve blogged about this sort of thing for many years, now.
Google is getting better at rating sites that meet its guidelines for ‘quality’ and ‘user satisfaction’ – I am putting such things in quotes to highlight the slightly Orwellian doublespeak we have to work with.
Google is policing their SERPs.
Put simply, Google’s views on ‘site quality’ and ‘user satisfaction’ do NOT automatically correlate to you getting more traffic.
This endeavor is supposed to be a benchmark – a baseline to start from (when it comes to keywords with financial value).
Everybody, in time, is supposed to hit this baseline to have a chance to rank – and in the short to medium term, this is where the opportunity lies for those who take it.
If you don’t do it, someone else will, and Google will rank them, in time, above you.
Google has many human quality raters rating your offering, as well as algorithms targeting old style SEO techniques and engineers specifically looking for sites that do not meet technical guidelines.
Does Google Promote A Site Or Demote Others In Rankings?
In the video above you hear from at least one spam fighter who confirms that at least some people are employed at Google to demote sites that fail to meet policy:
“I didn’t SEO at all, when I was at Google. I wasn’t trying to make a site much better but I was trying to find sites that were not ‘implementing Google policies'(?*) and not giving the best user experience.” (I can’t quite make out what he says there)
Link algorithms seem particularly aggressive, too, and more ‘delayed’ now than ever (for normal businesses built on old-school links), so businesses are getting it from multiple angles as Google rates the quality of its primary index (which may well sit on top of the old stuff and be heavily influenced by ‘quality’ signals on your site).
The ‘Quality Metric’
Google is on record as saying they had a quality problem that they started fixing with Panda.
That means that many of the sites that were getting a lot of traffic from Google in 2011 were not going to be rated ‘high quality’ as the new quality rating was designed to demote these sites.
In the following video, a Google spokesperson even goes into what they call a ‘quality metric’ – signals separated from relevance signals:
These sites used common SEO practices to rank, and Google made those practices toxic.
The fun started when these algorithms began rolling out more significantly to the real business world, and it continues today with the ever-present (often erroneously confirmed) Google ‘quality’ updates we now constantly hear about in the media.
One thing is for sure, you need to get low-quality content OFF your site and get some UNIQUE content on it to maximise the benefit a few months down the line from *any* other marketing activity (like link building).
From my testing, the performance of pages degrades over time but can come back with regular, sensible updates and refreshes of content.
Retargeting keywords can also have a positive impact over time. One marked observation from my top-performing pages on my test site is that pages that are updated more often (and improved substantially) tend to get more traffic than pages that are not.
A sensible strategy in 2017 is to:
- Get rid of all substandard pages on the site
- Meet Google's technical recommendations
- Address the high-priority issues in the quality rater guidelines that attract low to medium ratings
- Go in-depth on your topic in certain areas
- Consolidate content and link equity where prudent with redirects (not canonicals – use these only as a patch for a few months)
- Start producing some sort of actual unique content
- Market that new content more appropriately going forward than in the past
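The consolidation step above – merging several thin pages into one stronger page with permanent redirects – boils down to maintaining a simple old-URL-to-new-URL map. Here is a hedged sketch; the URLs are invented, and the output happens to use Apache `.htaccess` `Redirect 301` syntax, though the same map could just as easily drive nginx rules or a CMS redirect plugin:

```python
# Hypothetical consolidation map: several thin blog posts merged into
# one in-depth guide. URLs here are invented for illustration only.

consolidation_map = {
    "/blog/widgets-tip-1": "/guide-to-widgets",
    "/blog/widgets-tip-2": "/guide-to-widgets",
    "/blog/widgets-faq":   "/guide-to-widgets",
}


def htaccess_rules(mapping):
    """Emit one permanent (301) redirect rule per consolidated page,
    in Apache mod_alias Redirect syntax."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )


print(htaccess_rules(consolidation_map))
```

Keeping the map in one place makes it easy to audit later – every retired URL should point at the page that absorbed its content, not at the homepage.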
Most of this stuff I cover in my SEO audits.
Specific Advice From Google On Low-Quality Content On Your Site
And remember the following, specific advice from Google – back in 2011 on the original blog post about dealing with Google Panda style algorithms – and specifically on removing low-quality content on a domain:
Quote from Google: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low-quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low-quality pages to a different domain could eventually help the rankings of your higher-quality content.”
This stuff is clearly not going away:
They are not lying. Here is another example of taking multiple pages and making one better, high-quality page in place of them (and consolidating them properly):
I go into this sort of thing more in this article about making SEO friendly pages for Google in 2017.