Internal Link Building: How To Optimise A Website Using Internal Links


Link To Important Pages

QUOTE: “How important is the anchor text for internal links? Should that be keyword rich? Is it a ranking signal? We do use internal links to better understand the context of content on your site so if we see a link that’s saying like red car is pointing to a page about red cars that helps us to better understand that but it’s not something that you need to keyword stuff in any way because what generally happens when people start kind of focusing too much on the internal links is that they have a collection of internal links that all say have like four or five words in them and then suddenly when we look at that page we see this big collection of links on the page and essentially that’s also text on a page so it’s looking like keyword stuffed text so I try to just link naturally within your website and make sure that you kind of have that organic structure that gives us a little bit of context but not that you’re keyword stuffing every anchor text there.” John Mueller, Google 2015


An introduction to ‘Internal Link Building’

QUOTE: “Most links do provide a bit of additional context through their anchor text. At least they should, right?” John Mueller, Google 2017

Linkbuilding is the art of getting other websites to link to your website. Internal linkbuilding, by contrast, is the art of getting pages indexed by Google and highlighting important content on your site in a way that has a positive SEO benefit for specific keyword phrases in Google SERPs (Search Engine Results Pages).

Traditionally, one of the most important things you could do on a website to highlight your important content was to link to important pages often, especially from important pages on your site (like the homepage).

Highlighting important pages in your site structure has always been important to Google from a CRAWLING, INDEXING and RANKING point of view – and it’s important for website users from a USABILITY, USER EXPERIENCE and CONVERSION RATE point of view.

Most modern CMSs in 2018 take the headache out of getting your pages crawled and indexed. Worrying about your internal navigation structure (unless it is REALLY bad) is not going to cause you major problems from an indexation point of view.

There are other considerations though.

I still essentially use the methodology I have laid down on this page, but things have changed since I first started in Google SEO and started building internal links to pages almost 20 years ago.

Google has said it doesn’t matter where the links are on your page, Googlebot will see them:

QUOTE: “So position on a page for internal links is pretty much irrelevant from our point of view.  We crawl, we use these mostly for crawling within a website, for understanding the context of individual pages within a website.  So if it is in the header or the footer or within the primary content, it’s totally more up to you than anything SEO wise that I would worry about.” John Mueller, Google 2017

That statement on its own does not sit nicely with some patents I’ve read, where link placement does seem to matter in some instances.

When it comes to internal linking on your website:

  • where you use links on a page is important for users
  • where you link to on your website is important for users
  • how you link to internal pages is important for users
  • why you link to internal pages is important for users

Internal Linking is important to users, at least, and it’s of some importance to Google, too.

An analogy

I used a ‘links-are-lasers’ analogy way back then, to try and give beginners a simpler understanding of Google PageRank.

  1. Links Are Lasers
  2. Linking To A Page Heats Up A Page
  3. Pages Get Hot Or Cold Depending On Number & Quality Of The Links To It
  4. Cold Pages Don’t Rank For Sh*t
  5. Hot Pages Rank!

That was certainly how I used to think about link building and internal site structure. That is how I used to visualise how pages built up ‘ranking equity’ that could be spread about a site.

There was a time when you could very specifically structure a certain page to rank using nothing but links – and while you can still do that in 2018, in the end Google will pick the page on your site that is MOST RELEVANT TO THE QUERY and best meets USER EXPECTATIONS & USER INTENT (see here for more on developing SEO-friendly content for Google in 2018).

That is – you can link all you want to any one page, but if Google has a problem with that page you are trying to make rank, or thinks there’s a better page on your site (with a better user satisfaction score, for instance) – it will choose to rank that other page, before the ‘well-linked-to’ page.

In the past, Google would flip-flop between pages on your site, when there were multiple pages on the site targeting the same term, and rankings could fluctuate wildly if you cannibalised your keywords in this way.

Google is much more interested, in 2018, in the end-user quality of the page ranking, and the trust and quality of the actual website itself, than the inbound links pointing to a single page or a clever internal keyword rich architecture that holds content ‘up’.

Internal linkbuilding works best when it is helping Google identify canonical pages to rank on your site.

As John Mueller points out in the above official video:

QUOTE: “we do use internal links to better understand the context of content of your sites” John Mueller, Google 2015

…but if you are putting complicated site-structure strategy before high-quality single-page content that can stand on its own, you are probably going to struggle to rank in Google organic listings in the medium to long-term.

So the message is: a keyword-rich internal anchor text system on your site IS useful, and is a ranking signal, but don’t keyword stuff it.

I have always taken that to mean we should focus on introducing as many unique and exactly relevant long-tail keyword phrases into your internal link profile as you can. This has certainly produced better results for me than a page having only one anchor text phrase in its internal link profile.

How you proceed is going to be very much dictated by the site and complexity of your site, and how much time you are willing to spend on this ranking signal.

There is no single best way to build internal links on your site, but there are some efficiencies to be had, especially if your site is of a good quality in the first place. There are some really bad ways to build your site for search engines. For example, do not build your website with frames.

The 3 Click Rule of Website Design

RULE: “Don’t put important information on your site that is more than 3 clicks away from an entrance page”

Many have written about the Three-Click Rule. For instance, Jeffrey Zeldman, the influential web designer, wrote about the Three-Click Rule in his popular book, “Taking Your Talent to the Web”. He writes that the Three-Click Rule is:

QUOTE: “based on the way people use the Web” and “the rule can help you create sites with intuitive, logical hierarchical structures”. Jeffrey Zeldman

On the surface, the Three-Click Rule makes sense. If users can’t find what they’re looking for within three clicks, they’re likely to get frustrated and leave the site.

However, there have been other studies into the actual usefulness of the 3 click rule by usability experts, generating real data, that basically debunks the rule as a gospel truth. It is evidently not always true that a visitor will fail to complete a task if it takes more than 3 clicks to complete.

The 3 click rule is the oldest pillar of accessible, usable website design, right there beside KISS (Keep It Simple Stupid).

The 3 click rule, at the very least, ensures you are always thinking about how users get to important parts of your site before they bounce.

QUOTE: “home pages” are where “we forward the PageRank within your website” and “depending on how your website is structured, if content is closer to the Home page, then we’ll probably crawl it a lot faster, because we think it’s more relevant” and “But it’s not something where I’d say you artificially need to move everything three clicks from your Homepage”. John Mueller, Google 2014

This is the click depth of my content on this website in Jan 2018:

Click depth of internal links on the Hobo website

The Benefits of A Consistent Website Navigation & Page Layout

A key element of accessible website development is a clean, consistent navigation system coupled with a recognised, usable layout.

Don’t try and re-invent the wheel here. A clean, consistent navigation system and page layout allow users to instantly find important information and to quickly find comfort in their new surroundings, especially if the visitor is completely new to your website.

Visitors don’t always land on your home page – every page on your website is a potential landing page.

Ensure when a visitor lands on any page, they are presented with simple options to go to important pages you want them to go to. Simple, clear calls to action that encourage a user to visit specific pages. Remember too, that just because you have a lot of pages on your site, that does not mean you need a mega-menu. You do not need to give visitors the option to go to every page from their entry page. You do not need a massive drop down menu either. Spend the time and invest in a simple site navigation menu and a solid site structure.

A traditional layout (2 or 3 columns, with a header and a footer) is excellent for accessible website design, especially for information sites.

Remember to use CSS for all elements of style, including layout and navigation.

QUOTE: “Presentation, content and navigation should be consistent throughout the website” Guidelines for UK Government websites – Illustrated handbook for Web management teams
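
As a simple illustration of what I mean (the URLs and class names below are just placeholders, not from any particular site), a consistent site-wide navigation can be nothing more than a plain HTML list, with all the styling handled in CSS:

 <!-- A simple, consistent site-wide navigation: a plain HTML list -->
 <nav class="site-nav">
   <ul>
     <li><a href="/">Home</a></li>
     <li><a href="/services/">SEO Services</a></li>
     <li><a href="/blog/">Blog</a></li>
     <li><a href="/contact/">Contact</a></li>
   </ul>
 </nav>

 <style>
   /* Layout and presentation live in the CSS, not in the markup */
   .site-nav ul { list-style: none; margin: 0; padding: 0; }
   .site-nav li { display: inline-block; }
   .site-nav a { padding: 0.5em 1em; text-decoration: none; }
 </style>

Reuse the same markup on every page and a visitor (and Googlebot) always knows where they are.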

What is Anchor Text?


Definition:

Words, typically underlined on a web page, that form a clickable link to another web page. Normally the cursor will change to a pointing hand when you hover over such a link.

HTML code example:

 <a href="https://www.hobo-web.co.uk/">This is anchor text!</a>

Use Descriptive Anchor Text – Don’t Use ‘Click Here’

QUOTE: “When calling the user to action, use brief but meaningful link text that: 1) provides some information when read out of context 2) explains what the link offers 3) doesn’t talk about mechanics and 4) is not a verb phrase” W3C

The W3C advise “don’t say click here” and SEO professionals recommend the same.

If you use link text like “go” or “click here,” those links will be meaningless in a list of links. Use descriptive text, rather than commands like “return” or “click here.”

For example, do not do this:

"To experience our exciting products, click here."

This is not descriptive for users and you might be missing a chance to pass along keyword rich anchor text votes for the site you’re linking to (useful to rank better in Google, Yahoo and MSN for keywords you may want the site to feature for).

Instead, perhaps you should use:

"Learn more about our search engine optimisation products."

Assistive technologies inform the users that text is a link, either by changing pitch or voice, or by prefacing or following the text with the word “link.”

So, don’t include a reference to the link such as:

"Use this link to experience our exciting services."

Instead, use something like:

"Check out our SEO services page to experience all of our exciting services."

In this way, the list of links on your page will make sense to someone who is using a talking browser or a screen reader.
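
To put the advice above into markup (the URL here is a placeholder), the difference looks like this:

 <!-- Avoid: meaningless out of context, and no useful anchor text -->
 <a href="/services/">Click here</a> to experience our exciting services.

 <!-- Better: the link text describes the destination on its own -->
 Check out our <a href="/services/">SEO services page</a> to experience all of our exciting services.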

NB – This rule applies in web design when naming text links on your page and in your copy. Of course, you can use click here in images (as long as the ALT tag gives a meaningful description to all users).

QUOTE: “One thing to think about with image links is if you don’t have an alt text for that then you don’t have any anchor text for that link. So I’d definitely make sure that your images have alt text so that we can use those for an anchor for links within your website. If you’re using image links for navigation make sure that there’s some kind of a fallback for usability reasons for users who can’t view the images.” John Mueller, Google 2017

If that wasn’t usable enough, Google ranks pages, in part, by these text links, so it is worth making your text links (and image ALT text) descriptive!
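
To illustrate (the file paths and URL here are just placeholders): an image link where the ALT text does the job of anchor text, plus a plain text fallback for users who cannot view images:

 <!-- The ALT text acts as the anchor text for this image link -->
 <a href="/services/">
   <img src="/images/seo-services.png" alt="SEO services from Hobo">
 </a>

 <!-- A plain text fallback for users who cannot view the image -->
 <a href="/services/">SEO services</a>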

How I Usually Managed Internal Linking

I focused on optimising the main pages in the structure, e.g. the pages we need to rank fast (especially on a new site).

I prioritised internal links to these pages (all the time remembering first link priority) – and it appeared, by doing so, Google did judge these pages as ‘important’ pages on my site – at least from a links-count pagerank-type point of view.

By making sure you linked to other relevant pages from these pages, you spread the LINK EQUITY as it became more commonly known, throughout the site. I would also always link to important pages from a home page, especially if these pages are not likely to attract many natural links by themselves.

A home page is where link equity seemed to ‘pool’ (from the deprecated Toolbar PageRank point of view) and this has since been confirmed by Google:

QUOTE: “home pages” are where “we forward the PageRank within your website” . John Mueller, Google 2014

The above process helped your entire site rank for a lot of keywords.

You could achieve benefits with secondary navigation arrays (which can be a good user experience signal in 2018) and links in text content.

How you build internal links on your site today is going to depend on how large your site is and what type of site it is. Whichever it is – I would keep it simple in 2018.

If you have a smaller site, I would still err on the safe side these days, but vary your anchor text to internal pages as much as possible – WITHIN TEXT CONTENT, and to meet long-tail variations of keywords with specific user intent, rather than relying on a site-wide navigation array to beef up raw links to every page on the site. There is a sketch of this below.
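
For example, in-content links to the same target page might vary the anchor text across long-tail variations like this (a sketch only – the URL and phrases are illustrative placeholders):

 <!-- Three in-content links to the same page, each with a different long-tail anchor -->
 <p>New to this? Read my guide to <a href="/seo-tutorial/">SEO basics for beginners</a> first.</p>
 <p>There is more detail in the <a href="/seo-tutorial/">step-by-step SEO tutorial</a>.</p>
 <p>See <a href="/seo-tutorial/">how to optimise a small business website</a> for worked examples.</p>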

Whatever you do, avoid anything that is easily detectable as too manipulative –  Google does not reward lazy linking in 2018.

It PENALISES IT OR IGNORES IT.

Does Google Count Keywords in Anchor Text In Internal Links? YES

ADVICE WARNING: This is part of a series of SEO ranking tests that look at traditional ranking signals and their effects on SERP positions. I would certainly NOT go and implement sitewide changes to your site based on this or other recent posts. WAIT for the series to END so you can make a more informed decision based on more observations, because **things get a bit weird later in the series of tests**! AS WITH ANY SEO TESTING – don’t just rely on my findings. Test things yourself on your own site, and realise that there are not many QUICK WINS that Google isn’t policing in some way, so **avoid doing anything on your site that would leave a footprint that would indicate manipulation.**

Recently I looked to see if Google counts keywords in the URL to influence rankings for specific keywords, and wrote up how I investigated this.

Today I am looking at the value of an internal link and its impact on rankings in Google. These posts were originally published in two parts, but I have folded them together to make it easier to read.

My observations from these tests (and my experience) include:

  • witnessing the impact of removing contextual signals from the anchor text of a single internal link pointing to a target page (April 15 impact in the image below)
  • watching as an irrelevant page on the same site took the place in rankings of the relevant target page when the signal was removed (19 April impact)
  • watching as the target page was again made to rank by re-introducing the contextual signal, this time via a single on-page element e.g. one instance of the keyword phrase in exact match form (May 5 impact)
  • potential evidence of a SERP Rollback @ May 19/20th
  • potentially successfully measuring the impact of one ranking signal over another (a keyword phrase in one element versus another), which would seem to slightly differ from recent advice on MOZ, for instance.

As a result of some of this testing, and my experience in cleaning up sites, I present a theory (or at least a metaphor) on the Google Quality Metric and how it might be deployed.

Will Google Count Keywords in Internal Anchor Text Links?

The video above, from quite some time ago now (OCT 2015), is John Mueller talking about ‘Internal Links’ and their influence on Google SERPs.

QUOTE: “we do use internal links to better understand the context of content of your sites” John Mueller, Google 2015

Essentially my tests revolve around ranking pages for keywords where the actual keyphrase is not present in exact match instance anywhere on the website, in internal links to the page or on the target page itself.

The relevance signal (mentions of the exact match keyword) IS present in what I call the Redirect Zone – that is – there are backlinks and even exact match domains pointing at the target page but they pass through redirects to get to the final destination URL.

In the image below where it says “Ranking Test Implemented” I introduced one exact match internal anchor text link to the target page from another high-quality page on the site – thereby re-introducing the ‘signal’ for this exact match term on the target site (pointing at the target page).

Where it says ‘Test Removed‘ in the image below, I removed the solitary internal anchor text link to the page, thereby, as I think about it, shortcutting the relevance signal again and leaving the only signal present in the ‘redirect zone’.

Graph: Does Google Count Anchor Text In Internal Links?

It is evident from the screenshot above that something happened to my rankings for that keyword phrase and long tail variants exactly at the same time as my tests were implemented to influence them.

Over recent years, it has been difficult for me, at least, to pin down with any real confidence anchor text influence from internal pages on an aged domain. Too much is going on at the same time, and most of it is out of an observer’s control.

I’ve also always presumed Google would look at too much of this sort of onsite SEO activity as attempted manipulation if deployed improperly or quickly, so I have kind of just avoided this kind of manipulation and focused on improving individual page quality ratings.

TEST RESULTS

  1. It seems to me that, YES,  Google does look at keyword rich internal anchor text to provide context and relevance signal, on some level, for some queries, at least.
  2. Where the internal anchor text pointing to a page is the only mention of the target keyword phrase on the site (as my test indicates), it only takes ONE internal anchor text link (from another internal page) to provide the signal required to have a NOTICEABLE influence on specific keyword phrase rankings (and so ‘relevance’).

——————————————————-

Test Results: Removing Test Focus Keyword Phrase from Internal Links and putting the keyword phrase IN AN ELEMENT on the page

[Screenshot: ranking graph, 25 May 2016]

To recap in my testing:

I am seeing if I can get a page to rank by introducing and removing individual ranking signals.

Up to now, if the signal is not present, the page does not rank at all for the target keyword phrase.

I showed how having a keyword in the URL impacts rankings, and how having the exact keyword phrase in ONE internal anchor text to the target page provides said signal.

Ranking WEIRDNESS 1: Observing an ‘irrelevant’ page on the same site rank when ranking signal is ‘shortcutted’.

The graph above illustrates that when I removed the signal (removed the keyword from internal anchor text) there WAS a visible impact on rankings for the specific keyword phrase – rankings disintegrated again.

BUT – THIS TIME – an irrelevant page on the site started ranking for a long tail variant of the target keyword phrase during the period when there was no signal present at all in the site (apart from the underlying redirect zone).

[Screenshot: ranking data, 25 May 2016]

This was true UNTIL I implemented a further ranking test, this time optimising ANOTHER ELEMENT actually on the page, which reintroduced the test focus keyword phrase (the HEAD TERM, as I have it in the first image on this page) – the first time the keyword phrase had been present on the actual page (in an element) for a long time.

WEIRDNESS 2 – SERP Rollback?

On May 1st I added the test focus keyword to the actual page in a specific element to test the impact of having the signal ONLY in a particular element on the page.

As expected, the signal provided by having the test keyword phrase ONLY in one on-page element DID have some positive impact (although LESS than the impact when the signal was present in internal links – a comparison I did find very useful).

That’s not the anomaly – the results from RANKING TEST 3 were almost exactly as I expected. A signal was recognised, but that solitary signal was not enough to make the page as relevant as it was to Google when the signal was in internal links.

The weirdness begins on May 17, where I again removed the keyword phrase from the target page. I expected with NO SIGNAL present anywhere on the site or the page, Google rankings would return to their normal state (zero visibility).

The opposite happened.

[Screenshot: ranking graph, 25 May 2016]

WTF?

Rankings returned to the best positions they had been for the term SINCE I started implementing these ranking tests – even WITHOUT any signal present in any of the areas I have been modifying.

Like a memory effect, the rankings I achieved when the signal was present only in internal links (the strongest signal I have provided yet) have returned.

THINKING OUT LOUD

It’s always extremely difficult to test Google and impossible to make any claims 100% one way or another.

The entire ecosystem is built to obfuscate and confuse anyone trying to understand it better.

Why have rankings returned when there is no live signal present that would directly influence this specific keyword phrase?

My hunch is that this might actually be evidence of what SEOs call a SERP ROLLBACK – when Google randomly ‘rolls’ the set of results back to a previous week’s SERPs to keep us guessing.

If this is a rollback, the rollback time frame must be during the period of my RANKING TEST 2 (a month or so at maximum), as the page did not rank for these terms like this at all for the year before – and yet they came back almost exactly as they were during my test period.

In the following image, I show this impact on the variant keyword (the keyword phrase, with no spaces) during this possible ‘roll back’.

[Screenshot: variant keyword rankings, 25 May 2016]

An observation about SERP ROLL BACKS.

If a rollback is in place, it does not seem to affect EVERY keyword and every SERP equally. NOT all the time, at least.

MORE IMPORTANTLY – Why did an irrelevant page on the same website rank when the signal was removed?

The target page was still way more relevant than the page Google picked out to present in long tail SERPs – hence my question.

Google was at least confused and at worst apathetic and probably purposefully lazy when it comes to the long-tail SERPs.

Because my signal to them was not explicit, ranking just seemed to fail completely, until the signal was reintroduced and I specifically picked out a page for Google to rank.

Obviously, something else might be at play. Google might now be relying on other signals – perhaps even the redirect zone – or the relative link strength of the ‘irrelevant’ page – but no matter – Google was ranking the less relevant page and CLEARLY IGNORING any relevance signals passing through the redirect zone to my target page.

Observations

From my ranking test 2 (internal links), it is evident that modifications to internal links CAN make IRRELEVANT PAGES on your site rank instead of the target page, and for some time, IF by modifying these internal links you SHORTCUT the signal that those links once provided to the target page, in a way that entirely removes that signal from the LIVE signals your site provides for a specific keyword phrase. Let’s call this the “CRAWL ZONE”, e.g. what can be picked up in a crawl of your HTML pages, as Google would do – which sits above the REDIRECT ZONE in my imagination, which is simply where signals need to pass through a 301 redirect.

That test was modifying only ONE anchor text link.

This might be very pertinent to site migrations, where you are modifying hundreds of links at the same time as you migrate through redirects and change URLs and internal anchor text.

YES – the signal may return – but this doesn’t look to be anything that happens over a quick timescale. Site migrations like this are potentially going to be very tricky if low-quality pages are in the mix.

Which Ranking Signal Carries the most weight?

In these two tests, I switched the signal from internal links to another element, this time on the page, to observe the impact on the change in terms of rankings for the page for the test focus keyword phrase.

[Screenshot: ranking graph, 25 May 2016]

The signal I switched it to would seem to have less of an impact on ranking than internal links, therefore making a recent whiteboard Friday at least potentially inaccurate in terms of weighting of signals for relevance as they are presented on the whiteboard.

This would be UNLESS I have misinterpreted the presumed rollback activity in the SERPs in May 2016 and those rankings are caused by something else I am misunderstanding. Only time will shed some light on that, I think.

[Screenshot: Whiteboard Friday list of ranking signals, 25 May 2016]

Yes – I replaced the signal originally in H (in Rand’s list) with an element that was in A to G (on Rand’s list) – and the result was to make the page markedly LESS relevant, not more.

Which element did I switch it with?

That answer comes later in this series of tests. But to be clear – internal links don’t look to be last in this hierarchy of keyword targeting a page.

PS – To understand this test fully, you really need to read my posts where I investigate: Is a keyword in the URL a ranking factor?

Site Quality Algorithm THEORY

If you have site quality problems, then any help you can get will be a good thing.

Any advice I would offer, especially in a theory, would really need to be sensible, at worst! I have clearly shown in past posts that simply improving pages substantially clearly does improve organic traffic levels.

This is the sort of thing you could expect in a ‘fairish’ system, I think, and we apparently have this ‘fairness’, in some shape or form, baked in.

If you improve INDIVIDUAL pages to satisfy users – Google responds favourably by sending you more visitors – ESPECIALLY when increased user satisfaction manifests in MORE HIGH-QUALITY LINKS (which are probably still the most important ranking signal other than the content quality and user satisfaction algorithms):

[Screenshot: organic traffic graph, 26 May 2016]

In these SEO tests I have been DELIBERATELY isolating a specific element that provides the signal for a specific keyword phrase to rank and then SHORTCUTTING it, like in an electrical switch, to remove the signal to the target page.

What if this process is how Google also shortcuts your site from a site quality point of view e.g. in algorithm updates?

Let us presume you have a thousand pages on your site and 65% of them fail to meet the quality score threshold set for multiple keyword phrases the site attempts to rank for – a threshold Google constantly tweaks, day to day, by a fractional amount to produce flux in the SERPs. In effect, most of your website is rated low-quality.

Would it be reasonable to presume that a page rated low-quality by Google is neutered in a way that it might not pass along the signals it once did to other pages on your site, yet still remain indexed?

We have been told that pages that rank (i.e. are indexed) sometimes do not have the ability to transfer PageRank to other pages.

Why would Google want the signals a low-quality page provides, anyways, after it is marked as low-quality or more specifically not preferred by users?

SEOs know that pages deemed extremely low-quality can always be deindexed by Google – but what of pages that Google has in their index that might not pass signals along to other pages?

It would be again reasonable to suggest, I think, that this state of affairs is a possibility – because it is another layer of obfuscation – and Google relies on this type of practice in lots of areas to confuse observers.

SO – let us presume pages can be indexed but can sometimes offer no signals to other pages on your site.

If Google’s algorithms nuke 65% of your pages’ ability to relay signals to other pages on your site, you have effectively been shortcutted in the manner I have illustrated in these tests – and that might end up in a state of affairs where irrelevant pages on your website start to rank in place of once relevant pages, because the pages that did provide the signal no longer pass the quality bar required to provide context and signal to your target pages (in the site structure).

Google has clearly stated that they:

QUOTE: “we do use internal links to better understand the context of content of your sites” John Mueller

If my theory held water, there would be lots of people on the net with irrelevant pages on their site ranking where other, relevant pages once did – and this test would be repeatable.

If this was true – then the advice we would get from Google would be to IMPROVE pages rather than just REMOVE them, as when you remove them, you do not necessarily reintroduce the signal you need – the signal you would get if you IMPROVED the page with the quality issues (the ideal scenario in a world with no complications).

Especially if webmasters were thinking removing pages was the only answer to their ranking woes – and I think it is fair to say many did, at the outset of this challenge.

Guess what?

[Screenshot, 25 May 2016]

This would make that statement by Google entirely correct but monumentally difficult to achieve, in practice, on sites with lots of pages.

Site Quality punishment is a nightmare scenario for site owners with a lot of low-quality pages and especially where it is actually CONTENT QUALITY (as opposed to a purely technical quality issue) that is the primary issue.

It is only going to become a more acute problem as authorship, in whatever form Google assigns it, becomes more prevalent, e.g. you can have high-quality content that is exactly what Google wants, but this will be outranked by content from authors Google wants to hear from, e.g. Danny Sullivan over the rest of us, as Matt Cutts was often fond of saying.

There is incredible opportunity ahead for authors with recognised topical expertise in their fields. Google WANTS you to write stuff it WILL rank. WHO writes the content on your website might be even more important than what is written on your page (in much the same way we recognised what we called ‘domain strength’, where Google gave such ability to rank to domains with a lot of ‘link juice’). To be clear – I still think a lot of the old stuff is still baked in. Links still matter, although ‘building’ external backlinks is probably going to be self-defeating and perhaps crushing in future.

Theoretically, fixing the content quality ‘scores’ across multiple pages would be the only way to reintroduce the signal once present on a site impacted by Google Panda or Site Quality Algorithms, and this would be, from the outset, an incredibly arduous – almost impossible – undertaking for larger sites – and AT LEAST a REAL investment in time and labour – and again – I think there would be lots of sites out there in this sort of scenario, if my theory held water, and you accepted that low-quality content on your site can impact the rankings for other pages.

Google actually has said this in print:

QUOTE: “low-quality content on part of a site can impact a site’s ranking as a whole” GOOGLE

It is probably a nearly impossible task for content farms with multiple authors of varying degrees of expertise in topics – mostly zilch – and the only way I see of recovering from that would be at best distasteful and at worst highly unethical and rather obvious, so I won’t print it.

Let’s look at that statement in full, from Google, with emphasis and numbers added by me:

QUOTE: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus 1. removing low quality pages, 2. merging or 3. improving the content of individual shallow pages into more useful pages, or 4. moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.” GOOGLE

To recover from Google Panda and site quality algorithms, unless you are waiting for Google to release a softer Panda… you really need to focus on doing ALL of points 1-3 – but lots of webmasters stop at number 1, thinking that will be sufficient, when any SEO with any experience knows that is way too simple for what Google wants this entire process to achieve – to take TIME – and, accusations have been made in many a forum, to drive the cost of organic SEO UP to levels comparable with, and beyond, those of AdWords.

Just recently I advised a client to do no.4 (4. moving low-quality pages to a different domain) and move an old blog with zero positive signal for the business to another domain to expedite the ‘this content isn’t here anymore – don’t rate my site on this‘ process for re-evaluation by Google.

Site quality problems are, BY DESIGN, MEANT to take a long time to sort out – JUST LIKE GOOGLE PENGUIN and the clean up of unnatural links – but, contrary to the complaints of many webmasters who accuse Google of being opaque on this subject, Google tells you exactly how to fix Google Panda problems and Matt Cutts has been telling people all along to “focus on the user” – which again is probably an absolute truth he can feel he is morally correct in relaying to us (which many guffawed at as lies).

If a site quality algorithm was deployed in this fashion, then punishment would be relative to the infraction and cause the maximum amount of problems for the site owner relative to the methods used to generate rankings. All that once helped a site rank could be made to demote it and hold it under the water, so to speak. In a beautiful system, I think, you would actually be penalising yourself, rather than Google penalising you, and users would indeed determine the final ranking order in organic SERPs, not ‘sharded’ by Google for their own benefit.

We would, of course, need to assume Google has a ‘Quality Metric‘ separate of relevance signals that is deployed in this fashion.

Guess what?

If your site is impacted by this shortcut effect, then identifying important pages in your hierarchy and user journey and improving them is a sensible way to proceed, as you may well be providing important signals for other pages on your site, too.

Why does Amazon rank for everything during these updates? That is the accusation, at least, and this theory would have an answer.

I would presume that when you remove an important signal from your website, you don’t have many other pages that provide said signals. Amazon ALWAYS has multiple higher-quality pages on the same topic, and so has other signals to fall back on – EASY for an algorithm not to f*&^ up. Amazon, too, probably has every other positive signal in bucket loads, let’s not forget.

Site quality algorithms deployed in this manner would be a real answer to a wayward use of ‘domain authority’, I’ve long thought.

What about webmasters who have, in good faith, targeted low-quality out of date content on a site and removed it, in order to combat Panda problems?

This was natural after Google said in effect to clean up sites.

I imagine somewhere in Google’s algorithm there is a slight reward for this activity – almost as if Google says to itself – “OK, this webmaster has cleaned up the site and brought the numbers of lower-quality pages down, thereby incrementally improving quality scores, so we will allow traffic levels to rise” – but NOT to the extent that would ever bring back traffic levels to a site hit by Content Quality Algorithms (after May 2015, especially).

Google, I think, must seek to reward white hat webmasters (on some level) if the intent is to adhere to the rules (even if those recommendations have been slightly misunderstood) or what is the point of listening to them at all? Most distrust Google and most evidently consistently fail to understand the advice given.

Again – if my theory held water, there would be a lot of webmasters who spent a lot of time cleaning up sites to comply with Panda who DO see positive numbers month to month in terms of increased organic traffic to a site – but rarely do they see a quick return to former ranking glory WITHOUT a severe investment in sitewide page quality improvement.

I have certainly observed this.

From my own experience – you cannot just delete pages to bring traffic levels back to a site in the same numbers after a Panda algorithm change that impacts your site. You must also improve remaining content, substantially.

It is baked into the system that if you have ranked with low-quality techniques, it is going to take a monumental effort deployed quickly to get your site moving again.

Confirmation Bias? Conspiracy theory?

You can tell me.

This theory posits that the site quality metric can shortcut your site and cause irrelevant pages to rank, and maybe even relevant pages to rank lower than they could if ALL the pages on the site were high quality.

Is this theory just wishful thinking because I currently sell site quality audits and my research revolves around white hat SEO testing? I’ve been educating myself on entity optimisation, too. I have loved SEO for over 15 years, but site quality optimisation and sifting through the rubble of toxic backlinks is not exactly what I signed up for, even as my services adjusted to the changes in Google since 2013/14 as I perceived them.

I can confirm I offer this information in return for building my topical relevance on my subject – that is my primary motive. That has always been valuable. I perceive this to be important going forward. Publishing erroneous or harmful theories wouldn’t achieve that, for me.

In my audits, I basically prioritise tasks by impact and risk factor that a webmaster would need to address to achieve long-term rankings. At the same time, I need to educate people as to why exactly their site is probably rated garbage-level.

I’ve deployed these tactics on this very site over the last few years to see if it drove traffic (and this is why I do SEO the way I currently do it):

[Screenshot: organic traffic graph, 25 May 2016]

SEO has become part of the legitimate long-term marketing mix, with chasing quick or even manipulated results at the least a potentially business-damaging exercise.

Conversely, though, if you achieve ranking nirvana through improved, legitimate site and content quality efforts, you can rest assured the effort required to dislodge you is probably going to be relative to the effort you put in (if applied correctly in the first place) – a great barrier to entry for all but your most dedicated competitors.

On that point, those that do rank – and Google themselves – are more than happy with that scenario.

I did like it when SEO was fast, gratification was instant and risk was a distant prospect. Today risk is always seemingly close by, and gratification is an increasingly distant prospect.

I am wondering if the black hat SEO tests might be more fun.

How To Proceed

Improve pages in the user journey. Try NOT to present low-quality pages to users.

I think it reasonable to say that identifying DEAD PAGES is still an incredibly important first step, but how you handle the challenge from there is of equal importance.

Removal of irrelevant, out-of-date, obviously low-quality content on a domain should still be a prerequisite for most webmasters. Canonicals and redirects are STILL your VERY BEST FRIEND when merging ANY PAGES (but pay close attention to Google dicking about with your redirect chains, or all your work goes in the toilet).
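
For instance, when merging a duplicate page into the page you are keeping, the canonical link element on the old URL is a one-line job (example.com is a placeholder here) – though a proper 301 redirect is usually the better tool for a page you are removing entirely:

 <!-- Placed in the <head> of the merged/duplicate URL, pointing at the page you kept -->
 <link rel="canonical" href="https://www.example.com/page-you-kept/">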

Paying close attention to where important signals lie on your site is the only way to attempt to protect them, especially during periods of change, where Google is happy to shortcut your ranking ability.

If you fail to preserve certain signals during changes, you remove signals, and irrelevant pages can rank instead of relevant pages.

If you ultimately don’t improve pages in a way that satisfies users, your quality score is probably coming down, too.

Is The Amount Of Words in Anchor Text Important?

The last time I tested this was a long time ago.

I wanted to see if there was a maximum limit of keywords Google will pass to another page through a text link. How many words will Google count in a backlink or internal link? Is there a best practice that can be inferred?

My first qualitative tests were basic and flawed, but time and again other opportunities for observation indicated the maximum length of the text in a link was perhaps 8 words (my first assumption was a character limit, maybe 55 characters, rather than a word limit, but that was flawed).

I have had a look at this a few times now. I had a look at it again recently (about a month ago). Further observations at the time (described below) pointed to keeping the important keywords in the first EIGHT WORDS of any text link to make sure you are getting the maximum benefit.

Finding that long link

I noticed (below) that another site linked to me using a ‘long’ rich anchor text (well, 14 words in the link text):

This page links to Hobo with a 14 word anchor text string

I thought this was a good opportunity to check how many words Google counts in an anchor text rich link (for search engine optimisation and linkbuilding purposes). I’ve long thought it was about 8 words and these results seem to back that up.

8 words. Yes. 9 words. No.

I took screenshots:

8 words do appear to pass anchor text value to the other page in this test

Eight words flowed through the link in the example above, so the destination page ranks for the query… but there are more than 8 words in the link and, as expected, when I searched for a 9-word term within the link, the page is NOT returned:

Page not found for anchor text of 9 words and over

Exactly as expected.

So I thought Google was ignoring keywords after the eighth word in a link. I cleaned up (this) article to make that apparent. I didn’t really have an intention to publish the article to the newsletter again.

Google Webmaster Tools Data

As I looked at Google Webmaster Tools data, I could see occasions where it certainly appeared that Google cut off the keywords at the 8-word limit. I am aware that could be down to a number of reasons, too.

I only found longer versions where the link to Hobo was in the ALT text. You can put 16 keywords in ALT text for Google SEO – and at the time, in a couple of tests, it seemed to me you could fit double the keywords in an ALT text link than you could in a basic HTML text link.

 

The anchor text is not limited by characters, but by WORDS.

Perhaps, if you want the full SEO benefit, you should aim to include your important keywords in the first 8 words of any internal link to a page.
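
So, as a rough sketch of that rule of thumb (the URL and phrasing below are illustrative placeholders), front-load the anchor:

 <!-- The important keywords all sit within the first 8 words of the link text -->
 <a href="/internal-links/">Internal link building guide for SEO beginners, covering anchor text and site structure</a>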

Later Observation

So I checked the data above and here’s the screenshot today:

[Screenshot: Google Webmaster Tools data, 28 April 2013]

AND when I checked today’s Google Webmaster Tools backlink data… I saw a LOT more links with MORE than 8 words passing – to a maximum of 16.

I am also seeing this same result on internal links. Google seems to be counting anchor text in longer links in both internal and external links?

That is different to my observations over the last few years with that data. Before that day, I’d never got that result in any of my random sampling. I’ve never seen a link with more than 8 words pass all the keywords to the recipient page using that simple observation test before. There are other things that could be at play, and I don’t have time to investigate. Perhaps things have changed, perhaps things have changed with the sites in question, perhaps it’s random and I am only seeing one pattern. I could be polluting my data myself. Site-specific? Google has obviously made all this observation stuff difficult for an SEO to work out, removing supplemental results, the ‘links only appear in links to this page’ indicator and other little helpful clues.

Anyway that’s the observation. The question is – did Google make a change recently when handling anchor text links? Why? What impact would that have had?

I always try to keep a text link to 8 words maximum as a best practice, unless it is image ALT text, where I would use 16 – and I still will, if only to advise people to work a bit harder to be sensible when linking up their content. I still aim to fit important keywords in the first 8 words of a page title, in case it is linked to, if I am lucky.

Anchor Text Abuse

Be VERY careful abusing anchor text, in internal links and in backlinks to your site. For more – see my article on unnatural links penalty recovery.

Does Google Count Internal Keyword Rich Links To Your Home Page?

The last time I tested this was a long time ago.

A long time ago I manipulated first link priority to the home page of a site for the site’s main keyword (that is, instead of using ‘home’ to link to my homepage, I linked to the home page with ‘insert keyword’). Soon afterwards the site dropped in rankings for its main term from a pretty stable no6 to about page 3, and I couldn’t really work out any other issue.

Of course, it’s impossible to isolate if making this change was the reason for the drop, but let’s just say after that I thought twice about doing this sort of SEO ‘trick‘ in future on established sites (even though some of my other sites seemed to rank no problem with this technique).

So I formulated a little experiment to see if anchor text links had any impact on an established home page (in as much a controlled manner as possible).

*****Setup (I edited this a bit): basically, the search term (anchor text) I was looking for was ‘keyword1’ (not present on the target page) + ‘keyword2’ -(minus) ‘keyword3’ (also not on the target page, but a common word that would accompany keyword1). I did this to get rid of a lot of noise in the SERPs from pages more relevant to the original keyword phrase. Also, this keyword (keyword1) appeared in anchor text on only ONE internal link, on a static page which had no other links to the home page.*****

Result:

Well look at the graph below.

Ranking Drop

It did seem to have an impact.

However, that massive drop for months is kind of worrying.

From Jan to July the site was nowhere for the phrase, although it has just jumped 105 places back to no3 for the test term (which was a geographic location – not a made-up word).

This change could be down to other reasons as I said – Google is always tweaking things. Perhaps this ranking drop would not have happened if the keyword was present on the target page.

It’s possible linking to your home page with keyword rich anchor text links (and that link being the ONLY link to the home page on that page) can have some positive impact in your rankings, but it’s also quite possible attempting this might damage your rankings too!

Trying to play with first link priority is, for me, a bit too obvious and manipulative these days, so I don’t really bother much – unless it is a brand new site, or it looks natural, and even then not often – but these kinds of results make me think twice about everything I do in SEO.

I shy away from overtly manipulative onsite SEO practices in 2018 – and I suggest you do too.

First Link Priority – Do Multiple Links From One Page To Another Count?

QUOTE: “Hi Matt. If we add more than one links from page A to page B, do we pass more PageRank juice and additional anchor text info? Also can you tell us if links from A to A count?”

He mentioned something like he ‘wasn’t going to get into anchor text flow’ (or as some call First Link Priority) – in this scenario, which is, actually, a much more interesting discussion.

But the silence on anchor text and priority – or what counts and what doesn’t, is, perhaps, confirmation that Google has some sort of ‘link priority’ when spidering multiple links to a page from the same page and assigning relevance or ranking scores.

For example (and I am talking internal links here), if you took a page and I placed two links on it, both going to the same page (OK – hardly scientific, but you should get the idea), will Google only ‘count’ the first link? Or will it read the anchor text of both links, and give my page the benefit of the text in both links, especially if the anchor text is different in each? Will Google ignore the second link? What is interesting to me is that knowing this leaves you with a question: if your navigation array has your main pages linked to in it, perhaps your links in content are being ignored, or at least, not valued.

I think links in body text are invaluable. Does that mean placing the navigation below the copy to get a wide and varied internal anchor text to a page? Perhaps.

I’m pretty sure, from plenty of observations I’ve made in the past, that this is indeed the case. I have seen a few examples where I *thought* might contradict my own findings, but on closer examination, most could not be verified. It’s a lot harder today to isolate this sort of thing – but Google is designed that way.

I think as the years go by – we’re supposed to forget how Google works under the hood of all that fancy GUI, BTW.

The simple answer is to expect ONE link – the first link – out of multiple links on a single page pointing at one other page, to pass anchor text value. Put your most important key phrases in at least the first link when creating multiple links and you don’t need to worry about first link priority.
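
To illustrate (the URL and anchors are placeholders): if page A links twice to the same page B, put the key phrase in the link Googlebot will encounter first in the HTML source:

 <!-- First link to /red-cars/ in the HTML source: give it the important key phrase -->
 <a href="/red-cars/">red cars for sale</a>

 <!-- Second link to the same page: its anchor text may simply be ignored -->
 <a href="/red-cars/">see our full range</a>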

A quick SEO test I did a long time ago throws up some interesting questions today – but the changes over the years at Google since I did my test will have an impact on what is shown – and the fact is, the test environment was polluted long before now.

I still think about first link priority when creating links on a page.

I Tested If Only The First Link Counts in Google

The last time I tested this was a long time ago.

I thought I would share the results of another simple test I did to see how Google treats internal links.

What does Google count when it finds two links on the same page going to the same internal destination page?

I surmised:

  1. Google might count one link, the first it finds as it indexes a page
  2. Google might count them all (I think unlikely)
  3. Google might count perhaps 55 characters of ALL of the available links (could be useful)

OK – from this test, and the results on this site anyway (testing links internal to this site), it seems Google only counted the first link when it came to ranking the target page.

In much the same method as my recent SEO test where I tested how many words you should put in a link, I relied on the ‘These terms only appear in links pointing to this page’ message (shown when you click on the cache) that Google helpfully displays when the word isn’t on the page.

Again, I pointed 2 everyday words at a page that don’t appear on the page or in links to the page, and searched for the page in Google using a term I knew it would rank high for (Shaun Anderson) and added my modifier keywords. I left it for some time, and checked every now and again the results.

Google Cache

Searching for “shaun anderson” + “Keyword 1” returned the page (cache shown above).

Cartoonist

Searching for the term “shaun anderson” + “keyword 2” did not return the page at all, only the page with the actual link on it, further down the SERPs.

Fireman

Not even in a site search.

Site Search

It’s not exactly Google terrorism to identify this, so here is the actual test page where you can see the simple test in action.

So today :),  on this site :) in internal links :), Google only counted the first link as far as anchor text transfer is concerned :)

How can you use this to your advantage?

  1. Perhaps, you could place your navigation below your text
  2. This lets you vary the anchor text to important internal pages on your site, within the text content, instead of ramming down Google’s throat one anchor text link (usually high in the navigation)
  3. Varying anchor text naturally optimises to an extent the page for long tail ‘human’ searches you might overlook when writing the actual target page text
  4. Of course, I assume links within text surrounded by text are more important than links in navigation menus
  5. It makes use of your internal links to rank a page for more terms, especially useful if you link to your important pages often, and don’t have a lot of incoming natural links to achieve a similar benefit

Works for me anyway, when I’m building new sites – especially useful on longtail searches, when there’s plenty of editorial content being added to the site from which to link to a few sales pages.

Note: I would think Google would analyse everything it finds, so it would find it easy to spot spammy techniques we’ve all seen on sites trying to force Google to take multiple link anchor text to one page.

Does Only The First Link Count On Google?

Does the second anchor text link on a page count?

This is one of the more interesting discussions in the SEO community of late.

Here’s some more on the topic;

  1. You May Be Screwing Yourself With Hyperlinked Headers
  2. Single Source Page Link Test Using Multiple Links With Varying Anchor Text
  3. Results of Google Experimentation – Only the First Anchor Text Counts
  4. Debunked: Only The 1st Anchor Text Counts With Google
  5. Google counting only the first link to a domain – rebunked

I think quite possibly this could change day to day if Google pressed a button, but I optimise a site thinking that only the first link will count – based on what I monitor, although I am still testing this – and in practice, I usually only link once from page to page on client sites, unless it’s useful for visitors.

How Does Google Find My Internal Pages?

Traditionally – Google had to find a link to your homepage on another web page – then it crawled all the pages it could find that were linked from your homepage, and so on and so on – until it discovered all the pages on your site.

That was a long time ago. Now – opportunities are endless as to how Google will find your pages – although Googlebot still operates exactly as it always has done. If it finds a link it can crawl, it will crawl it – and index the page it crawls.

Just because Google can find your pages easier in 2018 doesn’t mean you should neglect to build Googlebot a nice architecture with which it can crawl and find all the pages on your website.

Pinging Google blogsearch via RSS (still my favourite way of getting blog posts into Google results fast) and XML sitemaps may help Google discover your pages, find updated content and include them in search results, but they still aren’t the best way of helping Google determine which of your pages to KEEP INDEXED or EMPHASISE or RANK or HELP OTHER PAGES TO RANK (e.g. it will not help Google work out the relative importance of a page compared to other pages on a site, or on the web).

While XML sitemaps go some way to address this, prioritisation in sitemaps does not affect how your pages are compared to pages on other sites – it only lets the search engines know which pages you deem most important on your own site. I certainly wouldn’t ever just rely on XML sitemaps like that… the old ways work just as they always have – and often the old advice is still the best, especially in SEO.

XML sitemaps are INCLUSIVE, not EXCLUSIVE, in that Google will spider ANY URL it finds on your website – and your website structure can produce a LOT more URLs than you have actual products or pages in your XML sitemap (something else Google may PENALISE YOU FOR).
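
For reference, a minimal sitemap entry looks like this (example.com is a placeholder) – and remember the priority value is only a hint about relative importance within your own site:

 <?xml version="1.0" encoding="UTF-8"?>
 <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <!-- One <url> entry per page; <priority> is a relative hint, not a ranking lever -->
   <url>
     <loc>https://www.example.com/</loc>
     <priority>1.0</priority>
   </url>
   <url>
     <loc>https://www.example.com/seo-services/</loc>
     <priority>0.8</priority>
   </url>
 </urlset>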

Keeping your pages in Google and getting them to rank has long been assured by internal linking.

Traditionally, every page needed to be linked to other pages for Pagerank (and other ranking benefits) to flow to other pages – that is traditional, and I think accepted theory, on the question of link equity.

I still think about link equity today – it is still important.

Some sites can still have short circuits – e.g. internal link equity is prevented from filtering to other pages because Google cannot ‘see’ or ‘crawl’ a fancy menu system you’re using – or Googlebot cannot get past some content it is blocked in robots.txt from rendering, crawling and rating.

I still rely on the ‘newer’ protocols like XML sitemaps for discovery purposes, and the old tried and trusted way of building a site with an intelligent navigation system to get it ranking properly over time.

Read my article on how to get Google to index an entire website.

Broken Links Are A Waste Of Link Power

Websites ‘Lacking Care and Maintenance’ Are Rated ‘Low Quality’ by Google.

QUOTE: “Sometimes a website may seem a little neglected: links may be broken, images may not load, and content may feel stale or out-dated. If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.” Google Quality Evaluator Guidelines, 2017

The simplest piece of advice I ever read about creating and optimising a website was written over a decade ago:

QUOTE: “make sure all your pages link to at least one other in your site”

This advice is still sound in 2018.

Check your pages for broken links.

Broken links are a waste of link power and could hurt your site, drastically in some cases, if a poor user experience is identified by Google. Google is a link based search engine – if your links are broken, you are missing out on the benefit you would get if they were not broken.

Saying that – fixing broken links is NOT a first-order rankings bonus – it is a usability issue, first and foremost.

QUOTE: “The web changes, sometimes old links break. Googlebot isn’t going to lose sleep over broken links. If you find things like this, I’d fix it primarily for your users, so that they’re able to use your site completely. I wouldn’t treat this as something that you’d need to do for SEO purposes on your site, it’s really more like other regular maintenance that you might do for your users.” GOOGLE – 2014 (John Mueller)
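Checking for broken links is also easy to script, even if the dedicated crawlers mentioned later in this article do it far more thoroughly. A minimal sketch, again in TypeScript, assuming Node 18+ for the built-in fetch and a hypothetical page URL:

```typescript
// Minimal sketch: check every link on one page and report the broken ones.
// PAGE is a hypothetical URL -- point it at a page you want to audit.
const PAGE = 'https://www.example.com/';

(async () => {
  const html = await (await fetch(PAGE)).text();
  // Naive href extraction; a real audit would use a proper HTML parser.
  const hrefs = [...html.matchAll(/href="([^"#]+)"/g)]
    .map((m) => new URL(m[1], PAGE).href)
    .filter((href) => href.startsWith('http')); // skip mailto:, tel:, etc.

  for (const href of new Set(hrefs)) {
    try {
      // HEAD is cheaper than GET; some servers reject it, so fall back.
      let res = await fetch(href, { method: 'HEAD' });
      if (res.status === 405) res = await fetch(href);
      if (res.status >= 400) console.log(`${res.status} ${href}`);
    } catch {
      console.log(`unreachable ${href}`);
    }
  }
})();
```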

How Many Links Is Too Many In A Website Dropdown Navigation System?

I answered a question in the Google Webmaster Forum about how many links in a dropdown menu are best.

The question was:

QUOTE: “Building a new site with over 5000 product pages. Trying to get visitors to a product page directly from the homepage. Would prefer to use a two-level drop-down on homepage containing 10 brands and 5K products, but I’m worried a huge source code will kick me in the pants. Also, I have no idea how search engines treat javascript links that can be read in HTML. Nervous about looking like a link farm.”

I answered:

QUOTE – “I’d invest time in a solid structure – don’t go for a javascript menu it’s too cumbersome for users. Sometimes google can read these sometimes it can’t – it depends on how the menu is constructed. You also have to remember if google can read it you are going to have a big template core code (boilerplate) on each and every page vying alongside flimsy product information – making it harder for google to instantly calculate what the individual products page is supposed to rank for.

I would go for a much reduced simple sitewide navigation in the menu array,

Home page links to categories > Categories link to products > Products link to related products

when you go to category links the links relevant in that category appear in the menu. Don’t have all that pop down in a dropdown – not good for users at all. Keep code and page load time down to a minimum…” Shaun Anderson

QUOTE: “JohnMu (Google Employee) + 2 other people say this answers the question” Google Webmaster Forums

I thought seeing as somebody from Google agreed, it was worth posting on my own blog.

The most important thing for me when designing website navigation systems is:

  1. Make it easy for the user to navigate
  2. Make it easy for Google to get to your content and index your pages

In terms of navigation from a landing page (all your pages are potential landing pages), what do you think the benefits are of giving people 5,000 navigation options?

Surely if the page meets their requirements, all you need is two buttons: Home, and Buy Now! OK – a few more – but you get what I mean, I hope.

Less is more.

In terms of site structure, to be honest, I do not think categories in a site structure (on anything but the largest sites) help your product pages OR BLOG PAGES rank BETTER. Where is the evidence for that, really, although every SEO in the land tells you it? I have TESTED THIS OVER AND OVER AGAIN – note I don’t have categories on this blog, yet all my pages rank very well. It’s far more important JUST TO GET AS MANY OF YOUR SITE PAGES INDEXED AS POSSIBLE and RANKING HIGH IN GOOGLE OVER AS MUCH TIME AS POSSIBLE. Forget about making your 1,000 products rank better via an internal navigation system that makes them more relevant by passing through category or tag pages; just get them ranking with the keyword phrase you want to rank for in your navigation system. And remember first link priority.

Once you realise getting your product pages indexed is the key, don’t go for a mega-menu just because you think this is a quick way to solve your indexing problem.

Again, if you look at the Hobo site, I go for a minimal sitewide navigation system and prefer to use contextual links (links within my content) and links to related pages as a way Google can find content.

The tree system I mentioned above is a quick and easy way of getting a site like an e-commerce website indexed, but never use a mega menu – you don’t need to (and why would you want to obscure your content behind a drop-down menu at any time?). I’ve also considered in the past that obscured links in drop-down elements could well be devalued by Google (it’s an easy way to hide unimportant links). I’ve not had time to test that thoroughly, though.

With a site structure, it’s all about getting your content indexed. That’s it.

Depending on how much Pagerank you have, you might need to ensure you are linking to the product pages you NEED to rank in Google OFTEN, FROM MANY PAGES WITH PR, so these pages have enough Pagerank to GET INTO GOOGLE’S MAIN INDEX. Think about that. If you can’t be bothered to tell Google which pages are most important on your site via your own internal navigation structure, why should Google bother ranking them at all, or assigning them Pagerank?

I test a lot with this site, but that’s not to say it’s perfect. I’m aware too, I have a decent amount of REAL PAGERANK to play with which many won’t. I don’t think there is a perfect system to follow, just a sensible one.

PS – A basic HTML sitemap is an old friend, and Google actually does say in its guidelines for Webmasters that you should include a sitemap on your site – for Googlebot and users.

Use CSS Drop-Down Navigation Arrays for SEO Friendly Menus

You can create dynamic drop-down menus on your site that meet accessibility requirements and are SEO friendly, and then link your pages together in a Google-friendly way.

Just be sure to employ a system that uses CSS, JavaScript and unordered lists (instead of pure JavaScript and HTML tables) as the means of generating the fancy drop-down navigation on your website.

Then, if JavaScript is disabled, or the style sheet is removed, the lists that make up your navigation array collapse gracefully into a list of simple links.

Just be sure to include that ‘skip links’ link if your lists are long, or repeated page-to-page, and appear above the content.

Remember, with Drop down menus:

  • Drop-down menus are generally fine, but the JavaScript triggering them can cause some problems for users with screen readers and screen magnifiers.
  • A <noscript> alternative is necessary.
  • The options offered in a drop-down should be repeated as text links on the same page, so use unordered lists with CSS to develop your menu (see the sketch below).
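To make the ‘collapses gracefully’ idea concrete, here is a minimal TypeScript sketch of the enhancement layer. It assumes nav markup of nested unordered lists inside a hypothetical #main-nav element, with CSS doing the actual showing and hiding; if the script never runs, the menu is still a plain, crawlable list of links.

```typescript
// Minimal sketch: progressive enhancement over plain nested <ul> markup.
// Assumed (hypothetical) markup:
//   <nav id="main-nav"><ul><li><a href="...">...</a><ul>...</ul></li></ul></nav>
// Assumed CSS: submenus hidden by default, revealed by li.open > ul.
document.querySelectorAll<HTMLLIElement>('#main-nav li').forEach((item) => {
  const submenu = item.querySelector(':scope > ul');
  const trigger = item.querySelector(':scope > a');
  if (!submenu || !trigger) return; // leaf item, nothing to enhance

  const toggle = (open: boolean) => {
    item.classList.toggle('open', open);
    // Expose the open/closed state to assistive technology.
    trigger.setAttribute('aria-expanded', String(open));
  };

  trigger.setAttribute('aria-expanded', 'false');
  item.addEventListener('mouseenter', () => toggle(true));
  item.addEventListener('mouseleave', () => toggle(false));
  // Keyboard users get the same behaviour when focus moves into the item.
  item.addEventListener('focusin', () => toggle(true));
  item.addEventListener('focusout', () => toggle(false));
});
```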

Use a “Skip Navigation” link on large mega menu systems

Add a skip navigation link that takes the reader straight to the main content of the page if you have a large menu system on every page. This allows users to skip the navigation array and get immediately to the page content. You won’t want this link visible on your visually rich pages, so some simple CSS will sort that out: you can hide it from visual browsers, and it will still display perfectly in text and some speech browsers.
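In code, the technique is tiny. A minimal sketch follows; in real markup you would hard-code the link in the HTML rather than inject it with a script, and the class name and #main-content target are my own assumptions:

```typescript
// Minimal sketch: a "skip navigation" link as the first thing on the page.
// In real markup, hard-code this in the HTML; the class name and the
// #main-content id are assumptions -- match them to your own page.
const skip = document.createElement('a');
skip.href = '#main-content';
skip.textContent = 'Skip to main content';
// The .skip-link CSS would position this off-screen and reveal it on
// :focus -- hidden from visual browsers, still read by text and speech
// browsers (avoid display:none, which hides it from everyone).
skip.className = 'skip-link';
document.body.prepend(skip);
```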

I don’t like mega menu systems for websites, at all. Too many options promote indecision.

Does Google Count Keywords In The URL As A Ranking Signal?

QUOTE: “I believe that is a very small ranking factor.” John Mueller, Google 2016

Do you need keywords in URLs to rank high in Google in 2018?

No.

While I wouldn’t necessarily rip apart a site structure JUST to change URLs, poorly-thought-out page URLs are often a sign of other, less obvious, sloppy SEO. If a big clean-up is called for, I would consider starting from basic SEO best practice and use search engine friendly URLs.

How To Do Internal Link Building in 2018

As mentioned previously, this will depend on the size and complexity of your website. A very large site should keep things as simple as possible and avoid any keyword-stuffing footprint.

Any site can get the most out of internal link building by descriptively and accurately linking to canonical pages that are very high quality. The more accurately the anchor text describes the page linked to, the better it is going to be in the long run. That accuracy can be an exact-match keyword phrase or a long-tail variation of it (if you want to know more, see my article on keyword research for beginners – that link is itself an example of a long-tail variation of the primary head or medium term ‘keyword research‘).

I silo any relevance or trust mainly through links in text content and helpful secondary menu systems, in a flat architecture, and only between pages that are relevant in context to one another.

I don’t worry about perfect Pagerank siloing techniques in 2018.

On this site, I like to build in-depth content pieces in 2018 that rank for a lot of long-tail phrases. These days, I usually would not want those linked from every page on a site – because that practice negates the opportunities some internal link building provides. I prefer to link to pages in context; that is, within page text.

There’s no set method I find works for every site, other than to link to related internal pages often and where appropriate. NOTE: You should also take care to manage redirects on the site and minimise the number of internal 301 redirects you employ; they can slow your pages down (and website speed is a ranking factor) and impact your SEO in other areas over the long term.
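Finding internal links that point at redirecting URLs is also easy to script. A minimal TypeScript sketch, assuming Node 18+ and a hypothetical page to audit; fetch’s redirect: 'manual' option stops it following the hop, so the 3xx status is visible:

```typescript
// Minimal sketch: flag internal links that point at redirecting URLs, so
// they can be updated to link straight to the final destination.
// PAGE is a hypothetical URL -- point it at a page you want to audit.
const PAGE = 'https://www.example.com/';

(async () => {
  const html = await (await fetch(PAGE)).text();
  const origin = new URL(PAGE).origin;
  const hrefs = [...html.matchAll(/href="([^"#]+)"/g)]
    .map((m) => new URL(m[1], PAGE).href)
    .filter((href) => href.startsWith(origin)); // internal links only

  for (const href of new Set(hrefs)) {
    // redirect: 'manual' stops fetch following the hop, so the 3xx shows.
    const res = await fetch(href, { redirect: 'manual' });
    if (res.status >= 300 && res.status < 400) {
      console.log(`${res.status} ${href} -> ${res.headers.get('location')}`);
    }
  }
})();
```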

How Can I Optimise Anchor Text Across A Website?

Optimising your anchor text across an entire site is actually a very difficult and time-consuming process. It takes a lot of effort to even analyse internal anchor text properly.

You can use tools like SEMrush (specifically the SEMrush Audit Tool), Sitebulb Crawler, DeepCrawl, Screaming Frog or SEO PowerSuite Website Auditor to check the URL structure and other elements, like anchor text, sitewide on any site.
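If you prefer a do-it-yourself first look, the sketch below (TypeScript, Node 18+, hypothetical origin, deliberately naive HTML parsing) crawls a handful of internal pages and tallies the anchor text used for every internal target – a quick way to spot a target URL that gets the same exact-match phrase from everywhere.

```typescript
// Minimal sketch: crawl a handful of internal pages and tally the anchor
// text used for each internal target URL, sitewide.
// ORIGIN is hypothetical; the HTML parsing is deliberately naive.
const ORIGIN = 'https://www.example.com';
const MAX_PAGES = 25; // keep the crawl small and polite

(async () => {
  const anchorsFor = new Map<string, Map<string, number>>();
  const visited = new Set<string>();
  const queue = [ORIGIN + '/'];

  while (queue.length && visited.size < MAX_PAGES) {
    const page = queue.shift()!;
    if (visited.has(page)) continue;
    visited.add(page);

    let html = '';
    try {
      html = await (await fetch(page)).text();
    } catch {
      continue; // unreachable page; skip it
    }

    // Capture href and anchor text together; enough for a rough tally.
    for (const m of html.matchAll(/<a[^>]+href="([^"#]+)"[^>]*>([^<]*)<\/a>/g)) {
      const target = new URL(m[1], page).href;
      if (!target.startsWith(ORIGIN)) continue; // internal links only
      const text = m[2].trim() || '[empty or image anchor]';
      const counts = anchorsFor.get(target) ?? new Map<string, number>();
      counts.set(text, (counts.get(text) ?? 0) + 1);
      anchorsFor.set(target, counts);
      queue.push(target);
    }
  }

  // One exact-match phrase repeated for a single target everywhere on the
  // site is the keyword-stuffing footprint to avoid.
  for (const [target, counts] of anchorsFor) {
    console.log(target, Object.fromEntries(counts));
  }
})();
```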

If you are not technically minded, we can analyse and fix your internal anchor text for you, if necessary, as part of our fixed price SEO service.


There are many SEO tools available today that make analysing the site structure of a website a much simpler endeavour than it was when I started out in SEO 20 years ago. Read my article on the best SEO tools.