SEOmoz just updated a cool group consensus of search engine ranking factors. From that study, here’s a quick rundown of the most ‘agreed’ search engine tanking factors:
- Link acquisition from known link sellers
- Server downtime
- Hidden text
- Links to bad neighbourhoods
- Keyword Stuffed Text
- “Excessive Repetition of the Same Anchor Text in a High Percentage/Quantity of External Links to the Site/Page”
Search Engine Tanking Factors – Algo v Human
It might have been interesting to isolate the algorithm from the human element, AKA the Google Web Spam Team. I’m seeing a lot of old seo tricks like keyword stuffing getting past the G maths that ‘shouldn’t’, and which would not pass a human review – that is, pages ranking successfully because of individual strengths in particular/other areas even though they might be, for instance, keyword stuffed to high heaven. The latter is the whole reason I wouldn’t use it for clients, not the algo.
The algo can weigh you and measure you and think you’re no1, yet when a human reaches your page all you get is keyword-stuffed nonsense – and that’s without cloaking anything.
I’m looking at one example of that now in a competitor audit for a customer.
Pure crap hat seo which Google loves enough to rank a particular site no1 for a keyword worth £3K a sale, which I need to circumvent in some way. It’s obvious (but not apparent) this competitor site has done something RIGHT – it’s no1 – but it’s done EVERYTHING ELSE SO BADLY it would not pass a human spam review. It’s an embarrassment to Google as it stands.
I think it’s safe to say seo tactics that are violations of Google TOS in the Google webmaster guidelines might very well …er, work, lol – as long as you don’t get caught (by the algo or the human web spam reviewers). Or they wouldn’t be in there.
A lot of the SEOmoz report is standard seo Google tells you not to do, but the whole report by Rand Fishkin is a great idea (one of the best in SEO Land) and so well presented it’s worth a good look over.