Using Analytics To Enhance Your Link Building Strategy

Like everyone else, I’m not psyched that Google took away my organic keyword data. It’s pretty uncool and, as a marketer, it makes my life a bit harder. However, analytics is much more than keywords, and there’s a lot of great information in there that can help drive better…

Please visit Search Engine Land for the full article.

Defensive SEO is all you need to know

Some of those who dole out advice around search are offering something more along the lines of a sports car fitted with one of those snazzy ’90s speed-camera detectors and a map of shortcuts.

The thing is, those speed cameras are just automated systems set up by the real people who make the rules, the police themselves. And they won’t stop you from being pulled over and reprimanded by the real deal. The law always catches up with you.

Call me naive, but when I hear the police say they have society’s best interests at heart, I believe them. And I believe the same thing when it comes to Google. It genuinely intends to create algorithms that surface the most relevant, timely content.

So trying to play it smart and manipulate the system by your own rules is, at best, a short-term fix.

Instead, my advice to anyone creating content online today is that the most important thing is to understand your vehicle and avoid the easy mistakes.

Practice ‘defensive SEO’. Knowledge will empower you more than paying some goon thousands to work their black magic behind the scenes. And with that knowledge, you can make sure you hire advisors with the same long-term attitude.

Giving examples of defensive SEO becomes a little tricky, because it’s largely about acting normal. Such as:

  • Don’t obsess about cramming paragraphs with keywords but do consider consistency between an article’s title, category and tags.
  • Don’t binge across the web getting meaningless guest posts around content that effectively says nothing.
  • Write for your mate’s blog because you have something on your mind, or put together an article worthy of a proper publication and pitch it to them.
  • Work on your *ideas* strategy.

I’m not saying all dedicated SEO professionals fit into one category or the other. As with PR or any other industry, they’re a mixed bunch, and often it’s about finding the one with the skillset for you. And, like my instructor, they may sometimes drive you into lamp posts.

But ultimately, if you have the right assistance, combined with your own understanding of how to drive the thing, you should be able to pass the test.

How to Build the Moz Keyword Difficulty Tool in Excel with Moz and SEMrush

How exciting is that? By combining SEMrush’s “phrase_organic” report with Moz’s URL Metrics API call, you can cobble together a prototype keyword difficulty tool. My example in this video is a bit scrappy, but I’m sure you get the point. Check out the rest of our instructional videos on our YouTube channel: http://www.youtube.com/user/SEOgadget No SEMrush account? If […]
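In the same spirit as the video, here’s a rough Python sketch of the scoring step. It assumes you’ve already pulled the top-ranking URLs for a keyword from SEMrush’s phrase_organic report and their authority metrics from Moz’s URL Metrics call; the field names below are illustrative stand-ins, not the APIs’ actual response keys.

```python
# Rough sketch of a keyword difficulty score. The input rows stand in
# for data you'd fetch yourself from SEMrush (top-ranking URLs) and
# Moz (authority metrics per URL); the dict keys are illustrative.

def keyword_difficulty(results):
    """Average the authority of the top-ranking pages into a 0-100 score."""
    if not results:
        return 0.0
    # Weight page and domain authority equally, as a simple first pass.
    scores = [(r["page_authority"] + r["domain_authority"]) / 2 for r in results]
    return round(sum(scores) / len(scores), 1)

top_ranking = [
    {"url": "http://example.com/a", "page_authority": 55, "domain_authority": 70},
    {"url": "http://example.org/b", "page_authority": 40, "domain_authority": 62},
    {"url": "http://example.net/c", "page_authority": 48, "domain_authority": 58},
]

print(keyword_difficulty(top_ranking))  # 55.5
```

The higher the average authority of the pages already ranking, the harder the keyword; a real tool would weight the components more carefully, but the shape is the same.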

The post How to Build the Moz Keyword Difficulty Tool in Excel with Moz and SEMrush appeared first on SEOgadget.

Optimizing The SEO Model

SEO has always been focused on acquisition.

The marketing strategy, based on ranking highly for keyword terms, is about gaining a steady flow of new visitors. If a site ranks better than competing sites, that steady stream of new visitors advantages the top-ranking sites at the expense of those beneath them.

The selling point of SEO is a strong one. The client gets a constant flow of new visitors and enjoys competitive advantage, just so long as they maintain rank.

A close partner of SEO is PPC. Like SEO, PPC delivers a stream of new visitors, and if you bid well and have relevant advertisements, you enjoy a competitive advantage. Unlike PPC, SEO does not cost per click (or, to be more accurate, it should cost a lot less per click once the SEO’s fees are taken into account), so SEO has enjoyed a stronger selling point. Also, the organic search results typically enjoy a higher level of trust from search engine users:

“91% prefer using natural search results when looking to buy a product or service online.” [Source: Tamar Search Attitudes Report, Tamar, July 2010]

Rain On The Parade

Either by coincidence or design, Google’s algorithm shifts have made SEO less of a sure proposition.

If you rank well, the upside is still there. But because the result is less certain than it used to be, and the work more involved than ever, the risk, and costs in general, have increased. The riskier SEO becomes in terms of getting results, the more AdWords looks attractive, as at least its results are assured, so long as spend is sufficient.

AdWords is a brilliant system. For Google. It’s also a brilliant system for those advertisers who can find a niche that doesn’t suffer high levels of competition. The trouble is, competition levels are typically high.

Because competition is high, and AdWords is an auction model, bid prices must rise. As bid prices rise, only those companies that can achieve ROI at high costs per click will be left bidding. The higher their ROI, the higher the bid prices can conceivably go. Their competitors, if they are to keep up, will do likewise.

So, the PPC advertiser focused on customer acquisition as a means of growing the company will be passing more and more of their profits to Google in the form of higher and higher click prices. If a company wants to grow by customer acquisition, via the search channel, then they’ll face higher and higher costs. It can be difficult to maintain ROI via PPC over time, which is why SEO is appealing. It’s little wonder Google has its guns pointed at SEO.
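The squeeze described above is easy to put numbers on. A quick sketch with made-up figures:

```python
# Break-even bid arithmetic for a PPC advertiser (illustrative numbers).
# An advertiser can keep bidding up to the point where the cost of the
# clicks needed for one sale equals the profit on that sale.

conversion_rate = 0.02      # 2% of clicks convert to a sale
profit_per_sale = 50.00     # margin per sale, before ad spend

# Expected profit per click, which is also the break-even cost per click.
break_even_cpc = conversion_rate * profit_per_sale
print(break_even_cpc)  # 1.0

# A competitor with a slightly better conversion rate can afford a
# higher bid, and in an auction the market price follows them up.
competitor_cpc = 0.025 * 50.00
print(competitor_cpc)  # 1.25
```

In an auction model, the advertiser with the best economics sets the ceiling, so everyone else’s margin erodes toward it.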

A fundamental problem with AdWords, and SEO in general, is that basing marketing success on customer acquisition alone is a poor long-term strategy.

More on that point soon….

White-Hat SEO Is Dead

It’s surprising a term such as “white hat SEO” was ever taken seriously.

Any attempt to game a search engine’s algorithm is, as far as the search engine is concerned, going to be frowned upon. What is gaming, if not reverse engineering the search engine’s ranking criteria and looking to gain a higher rank than a site would otherwise merit? Acquiring links and writing keyword-focused articles for the purpose of gaining a higher rank in a search engine is an attempt at rank manipulation. The only thing that varies is the degree.

Not that there’s anything wrong with that, as far as marketers are concerned.

The search marketing industry line has been that so long as you avoided “bad behaviour”, your site stood a high chance of ranking well. Ask people for links. Find keywords with traffic. Publish pages focused on those topics. There used to be more certainty of outcome.

If the outcome is not assured, then so long as a site is crawlable, why would you need an SEO? You just need to publish and see where Google ranks you. Unless the SEO is manipulating rank, then where is the value proposition over and above simply publishing crawlable content? Really, SEO is a polite way of saying “gaming the system”.

Those who let themselves be defined by Google can now be seen scrambling to redefine themselves. “Inbound marketers” is one term being used a lot. There’s nothing wrong with this, of course, although you’d be hard pressed to call it Search Engine Optimization. It’s PR. It’s marketing. It’s content production. The side effect of such activity might be a high ranking in the search engines (wink, wink). It’s like Fight Club. The first rule of Fight Club is……

A few years back, we predicted that the last SEOs standing would be blackhat, and that’s turned out to be true. The term SEO has been successfully co-opted and marginalized. You can still successfully game the system with disposable domains, by aggressively targeting keywords, and by buying lots of links and/or building link networks, but there’s no way that’s compliant with Google’s definitions of acceptable use. It would be very difficult to sell that to a client without full disclosure. Even with full disclosure, I’m sure it’s a hard sell.

But I digress….

Optimization In The New Environment

The blackhats will continue on as usual. They never took direction from search engines, anyway.

Many SEOs are looking to blend a number of initiatives together to take the emphasis off search. Some call it inbound. In practice, it blends marketing, content production and PR. It’s a lot less about algo hacking.

For it to work well, and to get great results in search, the SEO model needs to be turned on its head. It’s still about getting people to a site, but because the cost of getting people to a site has increased, every visitor must count. For this channel to maintain value, more focus will go on what happens after the click.

If the offer is not right, and the path to that offer isn’t right, then it’s like having people turn up for a concert when the band hasn’t rehearsed. At the point the audience turns up, they must deliver what the audience wants, or the audience isn’t coming back. The band’s popularity will quickly fade.

This didn’t really matter too much in the past when it was relatively cheap to position in the SERPs. If you received a lot of slightly off-topic traffic, big deal, it’s not like it cost anything. Or much. These days, because it’s growing ever more costly to position, we’re increasingly challenged by the “growth by acquisition” problem.

Consider optimizing in two areas, if you haven’t already.

1. Offer Optimization

We know that if searchers don’t find what they want, they click back. The click-back presents two problems. First, you just wasted time and money getting that visitor to your site. Second, it’s likely that Google is measuring click-backs in order to help determine relevancy.

How do you know if your offer is relevant to users?

The time-tested way is to examine the four Ps: product, price, promotion, and place. Place doesn’t matter so much, as we’re talking about the internet, although if you’ve got some local-centric product or service, then it’s a good idea to focus on it. Promotion is what SEOs do. They get people over the threshold.

However, two areas worth paying attention to are product and price. In order to optimize product, we need to ask some fundamental questions:

  • Does the customer want this product or service?
  • What needs does it satisfy? Is this obvious within a few seconds of viewing the page?
  • What features does it have to meet these needs? Are these explained?
  • Are there any features you’ve missed out? Have you explained all the features that meet the need?
  • Are you including costly features that the customer won’t actually use?
  • How and where will the customer use it?
  • What does it look like? How will customers experience it?
  • What size(s), color(s) should it be?
  • What is it to be called?
  • How is it branded?
  • How is it differentiated versus your competitors?
  • What is the most it can cost to provide, and still be sold sufficiently profitably?

SEOs are only going to have so much control over these aspects, especially if they’re working for a client. However, it still pays to ask these questions, regardless. If the client can’t answer them, then you may be dealing with a client who has no strategic advantage over competitors. They are likely running a me-too site. Such sites are difficult to position from scratch.

Even older sites that were at one point highly differentiated have slid into an unprofitable me-too status as large sites like Amazon & eBay offer a catalog which grows deeper by the day.

Unless you’re pretty aggressive, taking on me-too sites will make your life difficult in terms of SEO, so thinking about strategic advantage can be a good way to screen clients. If they have no underlying business advantage, ask yourself whether you really want to be doing SEO for these people.

In terms of price:

  • What is the value of the product or service to the buyer?
  • Are there established price points for products or services in this area?
  • Is the customer price sensitive? Will a small decrease in price gain you extra market share? Or will a small increase be indiscernible, and so gain you extra profit margin?
  • What discounts should be offered to trade customers, or to other specific segments of your market?
  • How will your price compare with your competitors?

Again, even if you have little or no control over these aspects, then it still pays to ask the questions. You’re looking for underlying business advantage that you can leverage.

Once we’ve optimized the offer, we then look at conversion.

2. Conversion Optimization

There’s the obvious conversion most search marketers know about. People arrive at a landing page. Some people buy what’s on offer, and some leave. So, total conversions/number of views x 100 equals the conversion rate.

However, when it comes to SEO, it’s not just about the conversion rate of a landing page. Unlike PPC, you don’t have precise control over the entry page. So, optimizing for conversion is about looking at every single page on which people enter your site, and optimizing each page as if it were an entry point.
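Applying the formula above per entry page might look like this (the numbers are made up):

```python
# Conversion rate per entry page: conversions / entries * 100.
# With organic search you don't control which page a visitor lands on,
# so every entry page gets its own rate.

entries = {"/home": 1200, "/blog/widgets": 450, "/pricing": 300}
conversions = {"/home": 24, "/blog/widgets": 9, "/pricing": 21}

rates = {
    page: round(conversions[page] / entries[page] * 100, 1)
    for page in entries
}
print(rates)  # {'/home': 2.0, '/blog/widgets': 2.0, '/pricing': 7.0}
```

The point of breaking it out per page is that the pages with the worst rates are where optimization effort pays off first.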

What do you want people to do when they land on your page?

Have a desired action in mind for every page. It might be a sign-up. It might be to encourage a bookmark. It might be to buy something. It might be to tweet. Whatever it is, we need to make the terms of engagement, for the visitor, clear for each page – with a big, yellow highlight on the term “engagement”! Remember, Google are likely looking at bounce-back rates. So, there is a conversion rate for every single page on your site, and they’re likely all different.

Think about the shopping cart process. Is a buyer, particularly a mobile buyer, going to wade through multiple forms? Or could the sale be made in as few clicks as possible? Would integrating PayPal or Amazon Payments lift your conversion rates? What’s your site speed like? The faster, the better, obviously. A lot of conversion is about streamlining things – from processes to navigation to site speed.

At this point, a lot of people will be wondering how to measure and quantify all this. How do you track conversion funnels across a big site? It’s true, it’s difficult. In many cases, it’s pretty much impossible to get adequate sample sizes.

However, that’s not a good reason to avoid conversion optimization. You can measure it in broad terms and get more granular as time goes on. Changes across pages and paths can be small and difficult to spot individually, but there is sufficient evidence that companies who employ conversion optimization enjoy significant gains, especially if they haven’t focused on these areas in the past.

While you could quantify every step of the way, and some companies certainly do, there are probably a lot of easy wins to be had merely by following these two general concepts: optimizing the offer, then optimizing (streamlining) the pages and paths that lead to that offer. If something is obscure, make it obvious. If you want the visitor to do something, make sure the desired action is writ large. If something is slow, make it faster.

Do it across every offer, page and path in your site and watch the results.


Quick Guide to Scaling Your Authorship Testing with Screaming Frog

Posted by kanejamison

Nearly all of us have used Screaming Frog to crawl websites. Many of you have probably also used Google’s Structured Data Testing Tool (formerly known as the Rich Snippet Testing Tool) to test your authorship setup and other structured data.

This is a quick tutorial on how to combine these two tools to check your entire website for structured data such as Google Authorship and Rel=”Publisher”, along with various types of Schema.org markup.


The concept:

Google’s structured data tester uses the URL you’re testing right in their own URL. Here’s an example:

We can take advantage of that URL structure to create a list of URLs we want to test for structured data markup, and process that list through Screaming Frog.
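A minimal Python sketch of that list-building step. The base URL below reflects the tool’s pattern at the time of writing, so verify it against your own browser’s address bar before relying on it:

```python
# Build a one-URL-per-line list for Screaming Frog's list mode by
# wrapping each page URL in the Structured Data Testing Tool's URL.
# TESTER_BASE is an assumption: check it against the tool in your
# browser, since Google changes these endpoints over time.
from urllib.parse import quote

TESTER_BASE = "http://www.google.com/webmasters/tools/richsnippets?q="

pages = [
    "http://example.com/post-one",
    "http://example.com/post-two?ref=home",
]

# Percent-encode the whole page URL (safe="" encodes ':' and '/' too)
# so it survives as a single query-string value.
test_urls = [TESTER_BASE + quote(page, safe="") for page in pages]

with open("tester_urls.txt", "w") as f:
    f.write("\n".join(test_urls))

print(test_urls[0])
# http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fexample.com%2Fpost-one
```

The resulting `tester_urls.txt` is exactly what you feed Screaming Frog in list mode in the steps below.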

Why this is better than simply crawling your site to detect markup:

You could certainly crawl your site and use Screaming Frog’s custom filters to detect things like rel=”author” and ?rel=author within your own code. And you should.

This approach will tell you what Google is actually recognizing, which can help you detect errors in implementation of authorship and other markup.

Disclaimer: I’ve encountered a number of times when the Structured Data Testing Tool reported a positive result for authorship implementation, but authorship snippets in search results were not functioning; upon further review, changing the implementation method resolved the issue. Also, authorship may not be granted or present for a particular Google+ user. In short, the Structured Data Tester isn’t perfect and will produce false positives, but it suits our need in this case: quickly testing a large number of URLs at once.


Getting started

You’re going to need a couple things to get started:

  1. Screaming Frog with a paid license (we’ll be using custom filters which are only available in the paid version)
  2. One of the following: Excel 2013, URL Tools for Excel, or SEO Tools for Excel (any of these three will allow us to encode URLs inside of Excel with a formula)
  3. Download this quick XLSX template: Excel Template for Screaming Frog and Snippet Tester.xlsx

The video option

This short video tutorial walks through all eight steps outlined below. If you choose to watch the video, you can skip straight to the section titled “Four ways to expand this concept.”

Steps 1, 2, and 3: Gather your list of URLs into the Excel template

You can find the full instructions inside the Excel template, but here’s the simple 1-2-3 version of how to use the Excel template (make sure URL Tools or SEO Tools is installed before you open this file or you’ll have to fix the formula):

[Screenshot: steps 1-3 in the Excel template]

Step 4: Copy all of the URLs in Column B into a .txt file

Now that Column B of your spreadsheet is filled with URLs that we’ll be crawling, copy and paste that column into a text file so that there is one URL per line. This is the .txt file that we’ll use in Screaming Frog’s list mode.

[Screenshot: pasting the URLs into a .txt file]

Step 5: Open up Screaming Frog, switch it to list mode, and upload your file

[Screenshot: switching Screaming Frog to list mode]

Step 6: Set up Screaming Frog custom filters

Before we go crawling all of these URLs, it’s important that we set up custom filters to detect specific responses from the Structured Data Testing Tool.

[Screenshot: Screaming Frog's custom filter configuration]

Since we’re testing authorship for this example, here are the exact pieces of text that I’m going to tell Screaming Frog to track:

  1. Authorship is working for this webpage.
  2. rel=author markup has successfully established authorship for this webpage.
  3. Page does not contain authorship markup.
  4. Authorship is not working for this webpage.
  5. The service you requested is currently unavailable.

Here’s what the filters look like when entered into Screaming Frog:

[Screenshot: the five filters entered in Screaming Frog]

Just to be clear, here’s the explanation for each piece of text we’re tracking:

  1. The first filter checks for text on the page confirming that authorship is set up correctly.
  2. The second filter reports the same information as filter 1. I’m adding both of them for redundancy; we should see the exact same list of pages for custom filters 1 and 2.
  3. The third filter is to detect when the Structured Data Testing Tool reports no authorship found on the page.
  4. The fourth filter is to detect when broken authorship is detected. (Typically because either the link is faulty or the Google+ user has not acknowledged the domain in the “Contributor To” section of their profile).
  5. The fifth filter contains the standard error text for the structured data tester. If we see this, we’ll know we should re-spider those URLs.
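If you later post-process the exported results in code, the same five strings can drive a simple classifier. This is a sketch of my own, not part of the original workflow:

```python
# Map the Structured Data Testing Tool's response text to a status,
# mirroring the five custom filters above. Handy when sorting the
# CSVs exported from Screaming Frog's Custom tab.

FILTER_STATUS = [
    ("Authorship is working for this webpage.", "ok"),
    ("rel=author markup has successfully established authorship", "ok"),
    ("Page does not contain authorship markup.", "missing"),
    ("Authorship is not working for this webpage.", "broken"),
    ("The service you requested is currently unavailable.", "retry"),
]

def classify(page_text):
    """Return the first matching status, or 'unknown' if none match."""
    for needle, status in FILTER_STATUS:
        if needle in page_text:
            return status
    return "unknown"

print(classify("... Page does not contain authorship markup. ..."))  # missing
```

A "retry" result means the tester errored out, so those URLs should be re-spidered rather than treated as missing markup.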

Here’s the type of text we’re detecting on the Structured Data Tester. The two arrows point to filters 3 and 4:

[Screenshot: broken authorship reported by the Structured Data Testing Tool]

Step 7: Let ‘er rip

At this point we’re ready to start crawling the URLs. Out of respect for Google’s servers and to avoid them disabling our ability to crawl URLs in this manner, you might consider adjusting your crawl rate to a slower pace, especially on large sites. You can adjust this setting in Screaming Frog by going to Configuration > Speed, and decreasing your current settings.

Step 8: Export your results in the Custom tab

Once the crawl is finished, go to the Custom tab, select each filter that you tested, and export the results.


Wrapping it up

That’s the quick and dirty guide. Once you export each CSV, you’ll want to save them according to the filters you put in place. For example, my filter 3 was testing for pages that contained the phrase “Page does not contain authorship markup.” So, I know that anything that is exported under Filter 3 did not return an authorship result in the Structured Data Testing Tool.


Four ways to expand this concept:

1: Use a proper scraper to pull data on multiple authors

Screaming Frog is an easy tool to do quick checks like the one described in this tutorial, but unfortunately it can’t handle true scraping tasks for us.

If you want to use this method to also pull data such as which author is being verified for a given page, I’d recommend redesigning this concept to work in Outwit Hub. John-Henry Scherck from SEOGadget has a great tutorial on how to use Outwit for basic scraping tasks that you should read if you haven’t used the software before.

For the more technical among us, there are plenty of other scrapers that can handle a task like this – the important part is understanding the process so you can use it in your tool of choice.

2: Compare authorship tests against ranking results and estimated search volume to find opportunities

Imagine you’re ranking 3rd for a high-volume search term, and you don’t have authorship on the page. I’m willing to bet it would be worth your time to add authorship to that page.

Use HLOOKUPs or VLOOKUPs in Excel to compare data from three tabs: rankings, estimated search volume, and whether or not authorship is present on the page. It will take some data manipulation, but in the end you should be able to create a pivot table that filters out pages that already have authorship and sorts the rest by estimated search volume and current ranking.
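If you’d rather do the joins in code than in Excel, here’s a plain-Python equivalent of that lookup. The data is made up, and this is a stand-in for the spreadsheet approach, not the article’s method:

```python
# Join rankings, estimated volume, and authorship status by URL, then
# surface pages that rank, have volume, and lack authorship markup.
# (All numbers are invented for illustration.)

rankings = {"/a": 3, "/b": 7, "/c": 12}
est_volume = {"/a": 5400, "/b": 880, "/c": 12000}
has_authorship = {"/a": False, "/b": True, "/c": False}

opportunities = sorted(
    (
        {"url": u, "rank": rankings[u], "volume": est_volume[u]}
        for u in rankings
        if not has_authorship[u]
    ),
    # Biggest volume first; break ties by best current ranking.
    key=lambda row: (-row["volume"], row["rank"]),
)

for row in opportunities:
    print(row)
# {'url': '/c', 'rank': 12, 'volume': 12000}
# {'url': '/a', 'rank': 3, 'volume': 5400}
```

The output is the same prioritized list the pivot table would give you: pages where adding authorship is most likely to be worth the effort.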

Note: I’m not suggesting you add authorship to everything; not every page should be attributed to an author (e-commerce product pages, for example).

3: Use this method to test for other structured markup besides authorship

The Structured Data Testing Tool goes far beyond just authorship. Here’s a short list of other structured markup you can test:

4: Blend this idea with Screaming Frog’s other capabilities

There are tons of ways to use Screaming Frog. Aichlee Bushnell at SEER did a great job of cataloging 55+ Ways To Use Screaming Frog. Go check out that post; I’m sure you can come up with additional ways to spin this concept into something useful.


Not to end on a dull note, but a couple comments on troubleshooting:

  1. If you’re having issues, the first thing to do is manually test the URLs you’re submitting and make sure there weren’t any issues caused during the Excel steps. You can also add “Invalid URL or page not found.” as one of your custom filters to make sure that the page is loading correctly.
  2. If you’re working with a large number of URLs, try turning down Screaming Frog’s crawl rate to something more polite, just in case you’re querying Google too much in too short a period of time.
  3. When you first open the Excel template, the formula may accidentally change depending on whether or not you have URL Tools or SEO Tools installed already. Read the instructions on the first page to find the correct formula to replace it with.

Let me know any other issues in the comments and I’ll do my best to help!


Dear Google Rivals: Do These New Proposals Solve Your Antitrust Concerns? Love, The EU

Three years into its investigation of Google over antitrust issues relating to search, the European Union took another step closer toward a likely resolution, asking Google rivals and other third-parties to review a settlement proposal. Notably, the general public wasn’t invited to…

Please visit Search Engine Land for the full article.

What You Need to Understand about Google Penguin Recovery

Anyone who runs an SEO agency has experienced the following scenario. A client comes to you and explains they ranked number one for a keyword(s) before Penguin. They got hit, having fallen into near invisibility in the Google SERPs and are desperate to get their Google traffic back and recover from Penguin. Manual vs. Algorithmic […]

The post What You Need to Understand about Google Penguin Recovery appeared first on Sugarrae.

Google’s Matt Cutts: More Pages Does Not Equal Higher Rankings

In a new video released today by Google’s head of search spam, Matt Cutts, we learn that having more pages on a web site does not necessarily mean better rankings. Matt Cutts said, “I wouldn’t assume that just because you have a large number of indexed pages that you…

Please visit Search Engine Land for the full article.