Is The Guardian having problems with its domain name migration?

The Guardian migration: drop in traffic? 

The Telegraph recently reported that the domain name change has led to difficulties, with traffic dropping as a result. It also mentioned that ‘senior executives were seeking urgent help from Google’. 

The Guardian itself acknowledges this problem in a recent report on the latest digital ABC figures:

Guardian News & Media’s website network reported a 5.83% month-on-month dip in daily browsers to 4,519,849, as a transition to a new global domain, theguardian.com, affected traffic. Monthly browser numbers fell 1.56% to 83,519,898.

Our Research Analyst Andrew Warren-Payne has been looking at the numbers in Majestic. This first chart shows the results for the old .co.uk domain:

You can see the difference when we look at theguardian.com, with a big drop in the number of backlinks and referring domains:

One caveat, as pointed out below: the data reported by Majestic may not yet be fully up to date.

Joost de Valk, who has been helping with the migration, isn’t too concerned:

We’re not concerned about traffic at all; numbers are looking good and, in fact, ahead of expectations. Migrations like these always take some time to settle in and may sometimes be a rough ride.

So far though, I can honestly say it’s looking good and better than I’ve ever seen a migration of this size – if there ever was such a thing – turn out. Of course there are still things to fix, as there always will be on a site this size, but we’re well aware of our issues and fixing those.

I’ve been asking search experts for their views on the migration, and whether Google should be making this easier for sites…

Are domain migrations such as this supposed to be easier now? Is it not possible to ‘tell’ Google what you are up to?

Kevin Gibbons, UK MD at BlueGlass Interactive:

There are more ways than ever of implementing site migrations, and the more signals you can send Google that your website has moved, the better. In theory at least!

However, in practice domain migrations are still always a risk and you have to accept that there will be short-term losses and unexpected problems. Even if you’ve done this hundreds of times before, Google is an unpredictable beast at the best of times!

You really have to make the decision having carefully balanced out your options and considered the short term loss vs the long term gain, which I’m sure is what happened in this case.

Dan Barker, EBusiness Consultant: 

Funnily enough, you can tell Google what you’re up to. There is a ‘change of address’ tool within Webmaster Tools. Domain migrations don’t always go entirely smoothly, but they are definitely far more reliable than they were 10 years ago.

Julia Logan, SEO consultant (better known as Irish Wonder): 

In the case of a large site, it is entirely up to Google how fast it fully reindexes the new site in place of the old one. Domain migrations of large sites have always been problematic due to the sheer volume of pages involved.

Has the Guardian done something wrong? If so, what?

Kevin Gibbons:

Not necessarily. It’s very difficult to judge from an external perspective, especially at an early stage where progress is unclear, as there are always different methods, reasons and goals when implementing changes like this.

Yes, it does appear that the Guardian has lost significant UK organic visibility and market share since the domain migration: 

However, it has started to make traction in the US which is important to remember as a key goal in the move: 

I suspect that provided UK visibility recovers to where it was in the medium term (rather than the long term), and US organic traffic continues to rise, the short-term loss will be outweighed by the overall long-term gains.

Dan Barker: 

I think it’s actually done pretty well. The paper had four options really:

  1. Stick with the old .co.uk domain and try and make a go of it internationally.
  2. Launch global content on a different domain to the .co.uk, keeping the UK-centric content where it was.
  3. Relaunch on a more internationalised domain name.
  4. Launch sites for individual regions one at a time.

The Guardian had tried the fourth of those to an extent previously, and obviously decided now was the time to go big bang.

The timing couldn’t have been better, as it had the enormous series of NSA/Prism splashes, which drove a lot of traffic (nullifying any loss in ad revenue the domain migration might cause) and also drove lots of social shares, links, etc. to help support organic search results in the longer term.

Quietly, it’s also made the site adapt to the device you’re viewing it from. Try loading the same URL on a phone and on your desktop to take a look.

There are a few odd bits & bobs in there. For example, you can see there are piles of ancient stuff still lying around on guardian.co.uk with links that could have been redirected somewhere.

For example: http://education.guardian.co.uk/universityguide2005, http://blogs.guardian.co.uk/quran/, and so on.

Lots of this is terribly old, legacy content, but it has all picked up valuable links over the years, and would be worth either moving and redirecting as-is, or killing off and redirecting the URLs to somewhere as appropriate as possible.

There are some really odd 302 redirects too. For example, m.guardian.co.uk, which itself has a pile of links, ‘temporarily’ redirects to www.theguardian.com. That doesn’t really make sense to me, as they’ve standardised on www as far as I’m aware, but perhaps there is a logic to it I’m unaware of.

Another really odd one is http://witness.guardian.co.uk which is a fairly new project that’s now being ‘temporarily’ redirected to http://witness.theguardian.com. Again, perhaps that project is being shelved or there’s some other logic I’m unaware of. 

Most of these temporary redirects are 302, but there are a couple of 303s among them. That usually either means someone’s being really clever, or that a maverick developer has gone crazy.
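As an aside, these hops are easy to check for yourself. Here's a minimal sketch, assuming Python with the third-party requests library, that follows a redirect chain and prints the status code of each hop so any 302s or 303s stand out (the URL is the m.guardian.co.uk example Dan mentions; swap in whichever URL you want to audit):

```python
# Minimal sketch: follow a redirect chain and print each hop's status code.
# Requires the third-party 'requests' library (pip install requests).
import requests

def show_redirect_chain(url):
    """Follow redirects from url and print every hop with its HTTP status."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:
        # 301 = permanent move; 302/303 = 'temporary', which sends
        # weaker signals to search engines about the new location.
        print(hop.status_code, hop.url)
    print(response.status_code, response.url, "(final)")

# The example URL Dan mentions above.
show_redirect_chain("http://m.guardian.co.uk/")
```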

Julia Logan: 

I am not sure what it was trying to achieve, but from what I see the old domain still has more than 6m URLs in Google’s index, mainly content on subdomains of the old site (e.g. blogs.guardian.co.uk, travel.guardian.co.uk, etc.).

Moreover, there is no reason for Google to deindex them as at least some of them are still alive and not redirecting to the new domain. From what I can see, only the main domain has been redirected but not all the subdomains.

Whether or not this has been the Guardian’s goal I have no idea, but it looks like it’s not making things any easier.

All the links on the old subdomains that are currently alive still point to the old domain version, even for pages that have been moved to the new domain. That surely sends Google a dubious signal.

What, if anything, can the Guardian do to fix it?

Kevin Gibbons:

That would require knowing the goals of the domain migration. Is the Guardian happy with where things are at this stage?

I think it’s clear that, short term, there were certainly better options, one being to stick with guardian.co.uk for the UK site and then implement an option such as hreflang for additional territories, or even canonical tags to run the UK content side by side.

This seems like a logical solution which places less risk on the UK and allows them to build domain authority on theguardian.com in the meantime, leaving the option to make the full switch in the future.

It has also applied a geo-detection redirect based on where you are accessing the site from, which can often be tricky to get right with SEO. But again, this is very easy to say without an inside view, as there are always technological challenges and internal thought processes behind these decisions.

That may have worked before; however, a roll-back at this stage seems very extreme, and decisions like this aren’t made on short-term gains, quite the opposite. Plus it’s unlikely that this was purely for SEO reasons: it’s an online branding move in many ways too, and allows the paper to enter new global territories, specifically the US.

So while the above option makes more SEO sense in my opinion, it may not have been practical in this scenario.

Obviously everyone always wants the best case scenario, but you also have to be prepared for the worst. If it has analysed the situation and is confident that the best decision has been made in the mid to long term, the answer is likely to be patience.
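On the hreflang option Kevin mentions above, here's a hedged illustration of what those tags amount to. The domains and paths are hypothetical examples, not the Guardian's actual markup; the idea is that each regional variant of a page declares all of its alternates so Google can serve the right version per territory:

```python
# Hypothetical sketch: generate hreflang <link> tags tying regional
# variants of the same article together. URLs are illustrative only.
VARIANTS = {
    "en-GB": "http://www.guardian.co.uk/uk/example-article",
    "en-US": "http://www.theguardian.com/world/example-article",
}

def hreflang_tags(variants):
    """Build the <link rel="alternate" hreflang> tags for a set of variants."""
    return "\n".join(
        '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
        for lang, url in sorted(variants.items())
    )

# Each regional variant page would carry this same block in its <head>.
print(hreflang_tags(VARIANTS))
```

The canonical-tag alternative Kevin mentions works differently: rather than declaring equal regional variants, it points duplicate pages at a single preferred URL.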

Dan Barker: 

I’d be surprised if the Guardian isn’t already on it, but it would be worth going through all those legacy subdomain.guardian.co.uk assets figuring out whether to ditch them, move them, or simply redirect the URLs to somewhere more appropriate.

It has a fairly small but excellent team over there, and the paper brought in @yoast to help out with the migration on top of the already excellent in-house/contract people.

To be honest, I’m not sure the Majestic data does tell us there’s been an enormously big issue. It’s tough to tell without actual Webmaster Tools or SiteCatalyst access. If I remember right, Majestic’s fresh index covers a 90-day period, and the Guardian domain migration only happened a couple of months ago.

If you look at the link flow & citation flow metrics too, you’ll notice those are actually higher for theguardian.com. Majestic SEO is a wonderful tool, one I pay for every month and would recommend to anyone, but I wouldn’t take the two reports here as an indicator that the wheels have come off.

Some axle alignment would be useful, but, to overstretch an already tedious analogy, the car is still a mostly-valeted Dodge Ram with neon chassis lighting, a fresh Christmas tree air freshener, and a tank 3/4 full of diesel.

As a side note, if you take a look at the DoubleClick Display Ad Planner stats for theguardian.com, you’ll see it’s listed at a fairly healthy 25-30m impressions a week vs. the rather tragic 500k-1m impressions thesun.co.uk shows following the erection of its paywall.

Julia Logan: 

If the purpose has been to just move everything completely to the new domain, there is definitely a lot of work still remaining to be done.

Check everything that hasn’t been redirected, redirect it, then double-check again. If, on the other hand, it did intend to leave some of the old subdomains alive, the paper needs to fix all the links from those pages to the content that has moved to the new domain.
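As a hedged sketch of how that link audit might start, the snippet below (assuming Python with the third-party requests and beautifulsoup4 libraries; the subdomain URL is illustrative) lists the anchors on a legacy page that still point at the old domain:

```python
# Hedged sketch: find links on a legacy page that still reference the
# old domain. Requires 'requests' and 'beautifulsoup4' (pip-installable).
import requests
from bs4 import BeautifulSoup

OLD_DOMAIN = "guardian.co.uk"

def stale_links(page_url):
    """Return the hrefs on page_url that still reference the old domain."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.find_all("a", href=True)
            if OLD_DOMAIN in a["href"]]

# Illustrative legacy subdomain; run this across each live old page.
for href in stale_links("http://blogs.guardian.co.uk/"):
    print(href)
```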

Is domain migration a common problem? Could Google do more to help here?

Kevin Gibbons:

You have to appreciate that Google’s in a very difficult position here. The easy answer is that it could pass signals across from redirects more quickly, so that new domains rank sooner.

The difficulty is that if it does that, SEOs will start to exploit it, meaning that people will overuse the tactic of redirecting purchased authority sites into new domains for SEO gains.

So actually what it is doing in making this difficult, whilst frustrating for legitimate redirects and brands such as The Guardian, does make sense.

You have to build a certain element of trust and reputation in your own right, as a new domain, in order to gain Google’s full trust. Fortunately, as a huge publisher, the Guardian is in about as strong a position as you can get to recover from this.

It will certainly be interesting to see how this develops over the coming months, but personally, despite what appear to be frustrating results so far, I don’t see a need for a kneejerk reaction.

Dan Barker: 

It’s less of an issue than it was years ago, but any big technical changes are always a risk. Bigger problems now are usually either site relaunches where URLs change with no direct one-to-one mapping, or internationalisation where suddenly a single-country site is split into different regions and languages.

Google has tools and notes on most of these areas, so much of the battle is simply knowing there’s likely to be a problem, and knowing where to look to begin mitigating the risks. 

Outside of that, though, and perhaps a problem with knock-on effects elsewhere, is that social networks don’t make it easy to retain social share history when a site changes domain.

For example, take a look at this article, which has 2,700+ comments but very low tweet and Facebook share counts.

It’s worth noting that the Google Plus number has updated, indicating that Google is taking a bit more interest in this area than Facebook, Twitter, and the other big social networks.

Julia Logan: 

Google could certainly do better at dropping pages that are no longer alive and switching to the new domain instead, if told to do so explicitly by the verified domain owner.

However, with large sites, there is a tremendous amount of crawling involved, so surely that takes time. 

What do you think? Is The Guardian suffering unduly as a result of a standard domain name migration? Do you think Google should do more to assist? Let us know below…

Study Disputes “Bing It On” Claim That 2:1 Prefer Bing To Google

A new study appearing on the “Freakonomics” blog aggressively disputes the claim that people prefer Bing to Google, and especially the statistical contention that they prefer the search engine over Google 2:1. In an article explaining the study, law and economics professor Ian…

Please visit Search Engine Land for the full article.

Perspective Matters In B2B Website Content

In B2B marketing, relationships lead to conversions. Content strategy is an essential tool for driving traffic through SEO and social media, and for engaging and building trust with audiences online. This means choosing the best content type (e.g., cop…

Time For A Content Audit

“Content is king” is one of those “truthy” things some marketers preach. However, in most businesses the bottom line is king, attention is queen, and content can be a means of getting both, but whether it does depends on the content itself.

The problem is that content is easy to produce. Machines can produce content. They can tirelessly churn out screeds of content every second. Even if they didn’t, billions of people on the internet are perfectly capable of adding to the monolithic content pile at similar rates.

Low barriers to content production and distribution mean the internet has turned a lot of content into a near-worthless commodity. Getting and maintaining attention is the tricky part; once a business has that, the benefits can flow through to the bottom line.

Some content is valuable, of course. Producing valuable content can earn attention. The content that gets the most attention is typically something for which an audience has a strong need, yet can’t easily get elsewhere, and which is published in a place that they, or someone they know, will likely see. An article on title tags will likely get buried. An article on the secret code to cracking Google’s Hummingbird algorithms will likely crash your server.

Up until the point everyone else has worked out how to crack them, too, of course.

What Content Does The User Want?

Content can become king if the audience bestows favor upon it. Content producers need to figure out what content the audience wants. Perversely, Google has chosen to make this task even more difficult than it was before by withholding keyword data. Between Google’s supposed “privacy” drive, Hummingbird supposedly using semantic analysis, and Penguin/Panda supposedly using engagement metrics, page-level and path-level optimization are worth focusing on going forward.

If you haven’t done one for a while, now is probably a good time to take stock and undertake a content audit.

You Have Valuable Historical Information

If you’ve got historical keyword data, archive it now. It will give you an advantage over those who follow you from this point on. Going forward, it will be much more expensive to acquire this data.

Run an audit on your existing content. What content works best? What type of content is it? Video? Text? What’s the content about? What keywords did people use to find it previously? Match content against your historical keyword data.
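As an illustration of that matching step, here's a hedged sketch assuming pandas and a CSV export of historical analytics data with keyword, landing_page and visits columns (the file name and column names are assumptions, not a standard export format):

```python
# Hedged sketch: match archived keyword data against landing pages to see
# which content earned which queries. Requires pandas (pip install pandas).
import pandas as pd

# Assumed columns: keyword, landing_page, visits.
keywords = pd.read_csv("historical_keywords.csv")

# Top five keywords per landing page, by visit volume: a quick view of
# which content earned which queries before the data went away.
top_keywords = (
    keywords.groupby(["landing_page", "keyword"], as_index=False)["visits"]
    .sum()
    .sort_values("visits", ascending=False)
    .groupby("landing_page")
    .head(5)
)
print(top_keywords)
```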

Here’s a useful list of site and content audit tools and resources.

If keywords can no longer suggest content demand, how do we know what the visitor wants in terms of content? We must seek to understand the audience at a deeper level and take a fuzzier approach.

Watch Activity Signals

Analytics can get pretty addictive and many tools let you watch what visitors do in real time. Monitor engagement levels on your pages. What is a user doing on that page? Are they reading? Contributing? Clicking back and forward looking for something else?

Ensure pages with high engagement are featured prominently in your information architecture. Relegate or fix low-engagement pages. Segment out your content so you know which is the most popular, in terms of landings, and link that information back to ranking reports. This way, you can approximate keywords and stay focused on the content users find most relevant and engaging. Segment out your audience, too. Different visitors respond to different things. Do you know which group favours what? What do older people go for? What do younger people go for? Here are a few ideas on how to segment users.

User behavior is getting increasingly complex. It takes multiple visits to purchase, from multiple channels/influences. Hence the addition of user segmentation allows us to focus on people. (For these exact reasons, multi-channel funnels analysis and attribution modeling are so important!)

At the moment in web analytics solutions, people are defined by the first-party cookie stored on their browser. Less than ideal, but 100x better than what we had previously. Over time, as we all expand to Universal Analytics, perhaps we will have more options to track the same person, after explicitly asking for permission, across browsers, channels and devices.

In-Site Search

If Google won’t give you keywords, build your own keyword database. Think about ways you can encourage people to use your in-site search, then watch the content they search for and consume the most. Another way of looking at site search is to provide navigation links that emphasize different keyword terms. For example, you could place these high up on your page, with each offering a different option based on a related keyword term. Take note of which keyword terms visitors favour over others.
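One hedged way to build that database, sketched below in Python, is to mine the site-search query out of your server access logs. The log file name, the /search path and the q parameter are all assumptions; adjust them to match your own site-search setup:

```python
# Hedged sketch: count in-site search terms from web server access logs.
from collections import Counter
from urllib.parse import urlparse, parse_qs

SEARCH_PATH = "/search"   # assumed site-search endpoint
QUERY_PARAM = "q"         # assumed query parameter name

counts = Counter()
with open("access.log") as log:
    for line in log:
        fields = line.split()
        if len(fields) < 7:
            continue
        # In Apache/nginx 'combined' log format the requested URL is field 7.
        url = urlparse(fields[6])
        if url.path == SEARCH_PATH:
            for term in parse_qs(url.query).get(QUERY_PARAM, []):
                counts[term.strip().lower()] += 1

# The twenty most-searched terms: your own demand data, no Google required.
for term, total in counts.most_common(20):
    print(total, term)
```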

In the good old days, people dutifully used site navigation at the left, right, or top of a website. But two websites have fundamentally altered how we navigate the web: Amazon, because the site is so big, sells so many things, and is so complicated that many of us go directly to the site search box on arrival; and Google, which has trained us to show up, type what we want, and hit the search button. Now when people show up at a website, many of them ignore our lovingly crafted navigational elements and jump straight to the site search box. The increased use of site search as a core navigation method makes it very important to understand the data that site search generates.

Distribution

Where does attention flow from? Social media? A mention is great, but if no attention flows over that link to your content, then it might be a misleading metric. Are people sharing your content? What topics and content get shared the most?

Again, this comes back to understanding the audience: both what they’re talking about and what actions they take as a result. In “Digital Marketing Analytics: Making Sense of Consumer Data”, the authors recommend creating a “learning agenda”. Rather than just looking for mentions and volume of mentions, focus on specific brand or service attributes. Think about the specific questions you want answered by visitors as if those visitors were sitting in front of you.

For example, how are consumers reacting to prices in your niche? What are their complaints? What do they wish would happen? Are people talking negatively about something? Are they talking positively about something? Who are the new competitors in this space?

Those are pretty rich signals. We can then link this back to content by addressing those issues within our content.


Hot Tactics For Geo-Targeted Ads On Google & Bing

At SMX East this morning, I sat in on the Improving Your Geographic Targeting Tactics On Google & Bing session, in large part because I’m far more familiar with SEO than with paid search, so I like to stretch myself and learn more about the advertising side of the local search equation….

Please visit Search Engine Land for the full article.

Maps Roundup: Google Avoids Waze Antitrust Fight, Scout Social Mapping, deCarta Route Patent

Google spent just over $1 billion to take Waze away from Facebook, Apple or Nokia and Microsoft. It equally sought the real-time driver feedback that Waze crowd-sourcing provides. Given the size of the transaction, the US FTC got immediately involved to…