SEO 2014

We’re at the start of 2014.

SEO is finished.

Well, what we had come to know as the practical execution of “whitehat SEO” is finished. Google has defined it out of existence. Research a keyword. Write a page targeting that keyword. Build links with that keyword as the anchor text. Google cares not for this approach.

SEO, as a concept, is now an integral part of digital marketing. To do SEO in 2014 – Google-compliant, whitehat SEO – digital marketers must seamlessly integrate search strategy into other aspects of digital marketing. It’s a more complicated business than traditional SEO, but offers a more varied and interesting challenge, too.

Here are a few things to think about for 2014.

1. Focus On Brand

Big brands not only get a free pass, they can get extra promotion. By being banned. Take a look at Rap Genius. Aggressive link-building strategy leads to de-indexing. A big mea culpa follows and what happens? Not only do they get reinstated, they’ve earned themselves a wave of legitimate links.

Now that’s genius.

Google would look deficient if it didn’t show that site where visitors expect to find it – enough people know the brand that leaving it out would make Google look bad.

Expedia’s link profile was similarly outed for appearing to be at odds with Google’s published standards. Could a no-name site with an equally aggressive link profile pass a manual inspection? Unlikely.

What this shows is that if your brand is important enough that Google would look deficient by excluding it, then you will have advantages that no-name sites don’t enjoy. You are more likely to pass manual inspections, and probably more likely to get penalties lifted.

What is a brand?

In terms of search, it’s a site that visitors can find with a brand search. Just how much search volume you require is open to debate, but you don’t need to be a big brand like Apple, TripAdvisor or Microsoft – Rap Genius aren’t. Ask “would Google look deficient if this site didn’t show up?” You can usually tell by looking for evidence of search volume on the site’s name.

In advertising, brands have been used to secure a unique identity. That identity is associated with a product or service by the customer. Search used to be about matching a keyword term. But as keyword areas become saturated, and Google returns fewer purely keyword-focused pages anyway, developing a unique identity is a good way forward.

If you haven’t already, put some work into developing a cohesive, unique brand. If you have a brand, then think about generating more awareness. This may mean higher spends on brand-related advertising than you’ve allocated in previous years. The success metric is an increase in brand searches, i.e. searches for the name of the site.

2. Be Everywhere

The idea of a stand-alone site is becoming redundant. In 2014, you need to be everywhere your potential visitors reside. If your potential visitors are spending all day in Facebook, or YouTube, that’s where you need to be, too. It’s less about them coming to you, which is the traditional search metaphor, and a lot more about you going to them.

You draw visitors back to your site, of course, but look at every platform and other site as a potential extension of your own site. Pages or content you place on those platforms are yet another front door to your site, and can be found in Google searches. If you’re not where your potential visitors are, you can be sure your competitors will be, especially if they’re investing in social media strategies.

A reminder to see all channels as potential places to be found.

Mix cross-channel marketing with remarketing and consumers perceive your brand as bigger. Google itself made this point in a display ad before it broadly enabled retargeting. Retargeting then further increases that lift in brand searches.

3. Advertise Everywhere

Are you finding it difficult to get top ten in some areas? Consider advertising with AdWords and on the sites that already hold those positions. Do some brand advertising on them to raise awareness and generate some brand searches. An advert placed on a site that offers a complementary good or service might be cheaper than going to the expense and effort needed to rank. It might also help insulate you from Google’s whims.

The same goes for guest posts and content placement, although obviously you need to be a little careful, as Google can take a dim view of it. The safest way is to make sure the content you place is unique, valuable and has utility in its own right. Ask yourself whether you would be happy to host the same piece on your own site if someone else had written it. If so, it’s likely okay.

4. Valuable Content

Google does an okay job of identifying good content, but it could do better. They’ve lost their way a bit in terms of encouraging the production of good content: it’s getting harder and harder to make the numbers work to cover the production cost.

However, it remains Google’s mission to provide users with answers they deem relevant and useful. The utility of Google relies on it. Any strategy aligned with providing genuine visitor utility will align with Google’s long-term goals.

Review your content creation strategies. Content that is of low utility is unlikely to prosper. While it’s still a good idea to use keyword research as a guide to content creation, it’s a better idea to focus on topic areas and creating engagement through high utility. What utility is the user expecting from your chosen topic area? If it’s rap lyrics for song X, then only the rap lyrics for song X will do. If it is plans for a garden, then only plans for a garden will do. See being “relevant” as “providing utility”, not keyword matching.

Go back to the basic principles of classifying the search term as either Navigational, Informational, or Transactional. If the keyword is one of those types, make sure the content offers the utility expected of that type. Be careful when dealing with informational queries that Google could use in its Knowledge Graph. If your pages deal with established facts that anyone else can state, then you have no differentiation, and that type of information is more likely to end up as part of Google’s Knowledge Graph. Instead, go deep on informational queries. Expand the information. Associate it with other information. Incorporate opinion.
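To make the three-way classification concrete, here is a deliberately crude sketch. The trigger-word lists and the fallback rule are invented for illustration; Google’s actual classification methods are far more sophisticated and not public.

```python
# Crude illustrative intent classifier for search queries.
# The hint lists below are invented examples, not Google's actual signals.

NAVIGATIONAL_HINTS = {"login", "homepage", "official site", ".com"}
TRANSACTIONAL_HINTS = {"buy", "price", "cheap", "discount", "order", "coupon"}
INFORMATIONAL_HINTS = {"how", "what", "why", "guide", "tutorial", "lyrics"}

def classify_query(query: str) -> str:
    q = query.lower()
    if any(hint in q for hint in NAVIGATIONAL_HINTS):
        return "navigational"
    if any(hint in q for hint in TRANSACTIONAL_HINTS):
        return "transactional"
    # Default to informational: most ambiguous queries are research queries.
    return "informational"
```

For example, `classify_query("facebook login")` comes back navigational, while `classify_query("buy running shoes")` comes back transactional – so the page serving each query should offer the utility that intent implies.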

BTW, Bill has some interesting reading on the methods by which Google might be identifying different types of queries.

Methods, systems, and apparatus, including computer program products, for identifying navigational resources for queries. In an aspect, a candidate query in a query sequence is selected, and a revised query subsequent to the candidate query in the query sequence is selected. If a quality score for the revised query is greater than a quality score threshold and a navigation score for the revised query is greater than a navigation score threshold, then a navigational resource for the revised query is identified and associated with the candidate query. The association specifies the navigational resource as being relevant to the candidate query in a search operation.
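Read literally, the abstract above describes a simple double-threshold test. A toy sketch of that logic follows; the score values, thresholds, and query data are all invented for illustration, not taken from the patent.

```python
# Toy sketch of the threshold logic in the patent abstract:
# if the revised query's quality and navigation scores both clear their
# thresholds, its navigational resource is associated with the earlier
# candidate query. All numbers here are invented for illustration.

QUALITY_THRESHOLD = 0.8
NAVIGATION_THRESHOLD = 0.9

def associate_navigational_resource(candidate_query, revised_query,
                                    scores, resources):
    quality, navigation = scores[revised_query]
    if quality > QUALITY_THRESHOLD and navigation > NAVIGATION_THRESHOLD:
        # The resource is specified as relevant to the candidate query.
        return {candidate_query: resources[revised_query]}
    return {}

scores = {"rap genius lyrics": (0.95, 0.97)}
resources = {"rap genius lyrics": "rapgenius.com"}
print(associate_navigational_resource(
    "rap lyrics site", "rap genius lyrics", scores, resources))
```

Here the revised query clears both thresholds, so its resource is linked back to the vaguer candidate query that preceded it in the session.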

5. Solve Real Problems

This is a follow-on from “ensuring you provide content with utility”. Go beyond keyword and topical relevance. Ask “what problem is the user trying to solve?” Is it an entertainment problem? A “How To” problem? What would their ideal solution look like? What would a great solution look like?

There is no shortcut to determining what a user finds most useful. You must understand the user. This understanding can be gleaned from interviews, questionnaires, third-party research, chat sessions, and monitoring discussion forums and social channels. Forget about the keyword for the time being. Get inside a visitor’s head. What is their problem? Write a page addressing that problem by providing a solution.

6. Maximise Engagement

Google are watching for the click-back to the SERP results, an action characteristic of a visitor who clicked through to a site and didn’t deem what they found to be relevant to the search query in terms of utility. Relevance in terms of subject match is now a given.
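One crude way to approximate this click-back signal from your own analytics is a “short click” rate. The field names and the ten-second cutoff below are invented for illustration; Google’s actual engagement computation is not public.

```python
# Rough pogo-stick / short-click estimate from visit-log data.
# Field names and the 10-second cutoff are invented for illustration;
# Google's actual engagement signals are not public.

SHORT_CLICK_SECONDS = 10

def short_click_rate(visits):
    """Fraction of search visits that bounced back quickly to the SERP."""
    search_visits = [v for v in visits if v["source"] == "search"]
    if not search_visits:
        return 0.0
    short = sum(1 for v in search_visits
                if v["returned_to_serp"]
                and v["dwell_seconds"] < SHORT_CLICK_SECONDS)
    return short / len(search_visits)

visits = [
    {"source": "search", "dwell_seconds": 4,  "returned_to_serp": True},
    {"source": "search", "dwell_seconds": 95, "returned_to_serp": False},
    {"source": "direct", "dwell_seconds": 2,  "returned_to_serp": True},
]
print(short_click_rate(visits))  # 0.5 - one of two search visits pogo-sticked
```

Pages with a high short-click rate are the obvious candidates for the engagement fixes discussed next.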

Big blocks of dense text, even if relevant, can be off-putting. Add images and videos to pages that have low engagement and see if this fixes the problem. Where appropriate, make sure the user takes an action of some kind. Encourage the user to click deeper into the site following an obvious, well placed link. Perhaps they watch a video. Answer a question. Click a button. Anything that isn’t an immediate click back.

If you’ve focused on utility, and genuinely solving a user’s problem, as opposed to just matching a keyword, then your engagement metrics should be better than those of the guy who is still just chasing keywords and only matching in terms of relevance to a keyword term.

7. Think Like A PPCer

Treat every click like you were paying for it directly. Once that visitor has arrived, what is the one thing you want them to do next? Is it obvious what they have to do next? Always think about how to engage that visitor once they land. Get them to take an action, where possible.

8. Think Like A Conversion Optimizer

Conversion optimizers try to reduce the bounce rate by re-crafting the page to ensure it meets the user’s needs. They do this by split-testing different designs, phrases, copy and other elements on the page.

It’s pretty difficult to test these things in SEO, but it’s good to keep this process in mind. What pages of your site are working well and which pages aren’t? Is it anything to do with different designs or element placement? What happens if you change things around? What do the three top ranking sites in your niche look like? If their link patterns are similar to yours, what is it about those sites that might lead to higher engagement and relevancy scores?

9. Rock Solid Strategic Marketing Advantage

SEO is really hard to do on generic me-too sites. It’s hard to get links. It’s hard to get anyone to talk about them. People don’t share them with their friends. These sites don’t generate brand searches. The SEO option for these sites is typically what Google would describe as blackhat, namely link buying.

Look for a marketing angle. Find a story to tell. Find something unique and remarkable about the offering. If a site doesn’t have a clearly-articulated point of differentiation, it will be that much harder to get value from organic search while staying within the guidelines.

10. Links

There’s a reason Google hammer links: they work. Else they surely wouldn’t make such a big deal about them.

Links count. It doesn’t matter if they are no-follow, scripted, within social networks, or wherever, they are still paths down which people travel. It comes back to a clear point of differentiation, genuine utility and a coherent brand. It’s a lot easier, and safer, to link build when you’ve got all the other bases covered first.


Did @mattcutts Endorse Rap Genius Link Spam?

On TWIG Matt Cutts spoke about the importance of defunding spammers & breaking their spirits.

If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits. You want to make them frustrated and angry. There are parts of Google’s algorithms specifically designed to frustrate spammers and mystify them and make them frustrated. And some of the stuff we do gives people a hint their site is going to drop and then a week or two later their site actually does drop so they get a little bit more frustrated. And so hopefully, and we’ve seen this happen, people step away from the dark side and say “you know, that was so much pain and anguish and frustration, let’s just stay on the high road from now on” some of the stuff I like best is when people say “you know this SEO stuff is too unpredictable, I’m just going to write some apps. I’m going to go off and do something productive for society.” And that’s great because all that energy is channeled at something good.

What was less covered was that in the same video Matt Cutts made it sound like anything beyond information architecture, duplicate content cleanup & clean URLs was quickly approaching scamming – especially anything to do with links. So over time more and more behaviors get reclassified as black hat spam as Google gains greater control over the ecosystem.

there’s the kind of SEO that is better architecture, cleaner URLs, not duplicate content … that’s just like making sure your resume doesn’t have any typos on it. that’s just clever stuff. and then there’s the type of SEO that is sort of cheating. trying to get a lot of bad backlinks or scamming, and that’s more like lying on your resume. when you get caught sometime’s there’s repercussions. and it definitely helps to personalize because now anywhere you search for plumbers there’s local results and they are not the same across the world. we’ve done a diligent job of trying to crack down on black hat spam. so we had an algorithm named Penguin that launched that kind of had a really big impact. we had a more recent launch just a few months ago. and if you go and patrole the black hat SEO forums where the guys talk about the techniques that work, now its more people trying to sell other people scams rather than just trading tips. a lot of the life has gone out of those forums. and even the smaller networks that they’re trying to promote “oh buy my anglo rank or whatever” we’re in the process of tackling a lot of those link networks as well. the good part is if you want to create a real site you don’t have to worry as much about these bad guys jumping ahead of you. the playing ground is a lot more level now. panda was for low quality. penguin was for spam – actual cheating.

The Matt Cutts BDSM School of SEO

As part of the ongoing campaign to “break their spirits” we get increasing obfuscation, greater time delays between certain algorithmic updates, algorithmic features built explicitly with the goal of frustrating people, greater brand bias, and more outrageous selective enforcement of the guidelines.

Those who were hit by either Panda or Penguin in some cases took a year or more to recover. Far more common is no recovery — ever. How long do you invest in & how much do you invest in a dying project when the recovery timeline is unknown?

You Don’t Get to Fascism Without 2-Tier Enforcement

While success in and of itself may make one a “spammer” in the biased eyes of a search engineer (especially if you are neither VC-funded nor part of a large corporation), many who are considered “spammers” self-regulate in a way that makes them far more conservative than the allegedly “clean” sites.

Pretend you are Ask.com and watch yourself get slaughtered without warning.

Build a big brand & you will have advanced notification & free customer support inside the GooglePlex:

In my experience with large brand penalties, (ie, LARGE global brands) Google have reached out in advance of the ban every single time. – Martin Macdonald

Launching a Viral Linkspam Sitemap Campaign

When RapGenius was penalized, it was because they were broadly, openly and publicly offering to promote bloggers who would dump a list of keyword-rich deeplinks into their blog posts. They were basically turning boatloads of blogs into mini-sitemaps for popular new albums.

Remember reading dozens (hundreds?) of blog posts last year about how guest posts are spam & Google should kill them? Well these posts from RapGenius were like a guest post on steroids. The post “buyer” didn’t have to pay a single cent for the content, didn’t care at all about relevancy, AND a sitemap full of keyword rich deep linking spam was included in EACH AND EVERY post.

Most “spammers” would never attempt such a campaign because they would view it as being far too spammy. They would have a zero percent chance of recovery as Google effectively deletes their site from the web.

And while RG is quick to distance itself from scraper sites, for almost the entirety of their history virtually none of the lyrics posted on their site were even licensed.

In the past I’ve mentioned Google is known to time the news cycle. It comes as no surprise that on a Saturday, barely a week after the penalty, Google restored RapGenius’s rankings.

How to Gain Over 400% More Links, While Allegedly Losing

While the following graph may look scary in isolation, if you know the penalty is only a week or two then there’s virtually no downside.

Since being penalized, RapGenius has gained links from over 1,000* domains:

  • December 25th: 129
  • December 26th: 85
  • December 27th: 87
  • December 28th: 54
  • December 29th: 61
  • December 30th: 105
  • December 31st: 182
  • January 1st: 142
  • January 2nd: 112
  • January 3rd: 122

The above add up to 1,079 & RapGenius has only built a total of 11,930 unique linking domains in its lifetime. They grew about 10% in 10 days!
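A quick sanity check of the figures quoted above (the daily counts are the Ahrefs numbers from the list; nothing new is assumed):

```python
# Sanity-check the daily new-referring-domain counts quoted above.
daily_new_domains = [129, 85, 87, 54, 61, 105, 182, 142, 112, 122]
total_lifetime_domains = 11_930  # RapGenius's lifetime unique linking domains

gained = sum(daily_new_domains)
growth = gained / total_lifetime_domains
print(gained)            # 1079
print(f"{growth:.1%}")   # 9.0% - roughly 10% growth in 10 days
```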

On every single day the number of new referring domains VASTLY exceeded the number of referring domains that disappeared. And many of these new referring domains are the mainstream media and tech press sites, which are both vastly over-represented in importance/authority on the link graph. They not only gained far more links than they lost, but they also gained far higher quality links that will be nearly impossible for their (less spammy) competitors to duplicate.

They not only got links, but the press coverage acted as a branded advertising campaign for RapGenius.

Here’s some quotes from RapGenius on their quick recovery:

  • “we owe a big thanks to Google for being fair and transparent and allowing us back onto their results pages” <– Not the least bit true. RapGenius was not treated fairly, but rather they were given a free ride compared to the death hundreds of thousands of small businesses have been handed over the past couple years.
  • “On guest posts, we appended lists of song links (often tracklists of popular new albums) that were sometimes completely unrelated to the music that was the subject of the post.” <– and yet others are afraid of writing relevant, on-topic posts due to Google’s ramped-up fearmongering campaigns
  • “we compiled a list of 100 “potentially problematic domains”” <– so their initial list of domains to inspect was less than 10% the number of links they gained while being penalized
  • “Generally Google doesn’t hold you responsible for unnatural inbound links outside of your control” <– another lie
  • “of the 286 potentially problematic URLs that we manually identified, 217 (more than 75 percent!) have already had all unnatural links purged.” <– even the “all in” removal of pages was less than 25% of the number of unique linking domains generated during the penalty period

And Google allowed the above bullshit during a period when they were sending out messages telling other people WHO DID THINGS FAR LESS EGREGIOUS that they are required to remove more links & Google won’t even look at their review requests for at least a couple weeks – A TIME PERIOD GREATER THAN THE ENTIRE TIME RAPGENIUS WAS PENALIZED FOR.

Failed reconsideration requests are now coming with this email that tells site owners they must remove more links: pic.twitter.com/tiyXtPvY32 — Marie Haynes (@Marie_Haynes), January 2, 2014

In Conclusion…

If you tell people what works and why you are a spammer with no morals. But if you are VC funded, Matt Cutts has made it clear that you should spam the crap out of Google. Just make sure you hire a PR firm to trump up press coverage of the “unexpected” event & then have a faux apology saved in advance. So long as you lie to others and spread Google’s propaganda you are behaving in an ethical white hat manner.

Google & @mattcutts didn’t ACTUALLY care about Rap Genius’ link scheme, they just didn’t want to miss a propaganda opportunity. — Ben Cook (@Skitzzo), January 4, 2014

Notes

* These stats are from Ahrefs. A few of these links may have been in place before the penalty and only recently crawled. However, it is also worth mentioning that all third-party link databases are limited in size & refresh rate by optimizing their capital spend, so there are likely hundreds more links which have not yet been crawled by Ahrefs. One should also note that the story is still ongoing and they keep generating more links every day. By the time the story is done spreading they are likely to see roughly 30% growth in unique linking domains in about 6 weeks.


Gray Hat Search Engineering

Almost anyone in internet marketing who has spent a couple months in the game has seen some “shocking” case study where changing the color of a button increased sales 183% or such. In many cases such changes only happen when the original site had not had any focus on conversion at all.

Google, on the other hand, has billions of daily searches and is constantly testing ways to increase yield:

The company was considering adding another sponsored link to its search results, and they were going to do a 30-day A/B test to see what the resulting change would be. As it turns out, the change brought massive returns. Advertising revenues from those users who saw more ads doubled in the first 30 days.

By the end of the second month, 80 percent of the people in the cohort that was being served an extra ad had started using search engines other than Google as their primary search engine.

One of the reasons traditional media outlets struggle with the web is the perception that ads and content must be separated. When they had regional monopolies they could make large demands of advertisers – sort of like how Google may increase branded CPCs on AdWords by 500% if you add sitelinks. You not only pay for clicks that you were getting for free, but you also pay more for the other paid clicks you were getting cheaper in the past.

That’s how monopolies work – according to Eric Schmidt they are immune from market forces.

Search itself is the original “native ad.” The blend confuses many searchers as the background colors fade into white.

Google tests colors & can control the flow of traffic based not only on result displacement, but also the link colors.

It was reported last month that Google tested adding ads to the knowledge graph. The advertisement link is blue, while the ad disclosure is to the far right out of view & gray.

I was searching for a video game yesterday & noticed that now the entire Knowledge Graph unit itself is becoming an ad unit. Once again, gray disclosure & blue ad links.

Where Google gets paid for the link, the link is blue.

Where Google scrapes third party content & shows excerpts, the link is gray.

The primary goal of such a knowledge block is result displacement – shifting more clicks to the ads and away from the organic results.

When those blocks appear in the search results, even when Google manages to rank the Mayo Clinic highly, it’s below the fold.

What’s so bad about this practice in health?

  • Context Matters: Many issues have overlapping symptoms, where a quick glance at a few out-of-context symptoms causes a person to misdiagnose themselves. Flu-like symptoms from a few months ago turned out to be an indication of a kidney stone. That level of nuance will *never* be in the knowledge graph. Google’s remote rater documents discuss “your money or your life” (YMYL) topics & talk up the importance of knowing exactly who is behind content, but when they use gray font on the source link for their scrape job they are doing just the opposite.
  • Hidden Costs: Many of the heavily advertised solutions appearing above the knowledge graph have hidden costs yet to be discovered. You can’t find a pharmaceutical company worth tens of billions of dollars that hasn’t pleaded guilty to numerous felonies associated with deceptive marketing and/or massaging research.
  • Artificially Driving Up Prices: In-patent drugs often cost 100x as much as the associated generic drugs, and thus the affordable solutions are priced out of the ad auctions, where the price of a click can vastly exceed the profit from selling a generic prescription drug.

Where’s the business model for publishers when they have real editorial cost & must fact check and regularly update their content, their content is good enough to be featured front & center on Google, but attribution is nearly invisible (and thus traffic flow is cut off)? As the knowledge graph expands, what does that publishing business model look like in the future?

Does the knowledge graph eventually contain sponsored self-assessment medical quizzes? How far does this cancer spread?

Where do you place your chips?

Google believes it can ultimately fulfil people’s data needs by sending results directly to microchips implanted into its users’ brains.


Quickly Reversing the Fat Webmaster Curse

Long story short, -38 pounds in about 2 months or so. Felt great the entire time and felt way more focused day to day. Maybe you don’t have a lot of weight to lose but this whole approach can significantly help you cognitively.

In fact, the diet piece was originally formed for cognitive enhancements rather than weight loss.

Before I get into this post I just want to explicitly state that I am not a doctor, medical professional, medical researcher, or any type of medical/health anything. This is not advice, I am just sharing my experience.

You should consult a healthcare professional before doing any of this stuff.

Unhealthy Work Habits

The work habits associated with being an online marketer lend themselves to packing on some weight. Many of us are in front of a screen for large chunks of the day while also being able to take breaks whenever and wherever we want.

Sometimes those two things add up to a rather sedentary lifestyle with poor eating habits (I’m leaving travel out at the moment but that doesn’t help either).

In addition to the mechanics of our jobs being an issue, we also tend to work longer/odd hours, because all we really need to get a chunk of our work done is a computer, or simply access to the internet. If you take all of those things and add them to the large amount of opportunity that exists on the web, you have the perfect recipe for unhealthy, stressful work habits.

These habits tend to carry over into offline areas as well. Think about the things we touch or access every day:

  • Computers
  • Tablets
  • Smartphones
  • Search Engines
  • Online Tools
  • Email
  • Instant Messaging
  • Social Networks

What do many of these have in common? Instant “something”. Instant communication, results, gratification, and on and on. This is what we live in every day. We expect and probably prefer fast, instant, and quick. With that mindset, who has time to cook a healthy meal 3x per day on a regular basis? Some do, for sure. However, much like the office work environment this environment can be one that translates into lots of unhealthy habits and poor health.

I got to the point where I was about 40 pounds overweight with poor physicals and lackluster lipid profiles (high cholesterol, blood pressure, etc). I tried many things, many times, but what ultimately turned the corner for me were 3 different investments.

Investment #1 – Standup/Sitdown Desk

Sitting down all day is no bueno. I bought a really high quality standup desk with an electrical motor so I can periodically sit down for a few minutes in between longer periods of standing.

It has a nice, wide table component and is quite sturdy. It also allows for different height adjustments via a simple up and down control:

A couple of tips here:

  • Wear comfy shoes
  • Take periodic breaks (I do so hourly) to go walk around the house or office or yard
  • I also like to look away from the screen every 20-30 minutes or so; I sometimes get eyestrain, but glasses I bought from Gunnar have relieved those symptoms

Investment #2 – Treaddesk

The reason I didn’t buy the full-on treadmill desk is because I wanted a bigger desk with more options. I bought the Treaddesk, which is essentially the bottom part of the treadmill, and I move it around during the week based on my workflow needs:

They have packages available as well (see the above referenced link).

I have a second, much cheaper standup desk that I hacked together from IKEA:

This desk acts as a holder for my coffee stuff but also allows me to put my laptop on it (which is paired with an external keyboard and trackpad) in case I want to do some lighter work (I have a hard time doing deeper work when doing the treadmill part while working).

I move the Treaddesk back and forth sometimes, but mostly it stays with this IKEA desk. If I have a week where the work is not as deeply analytical and more administrative then I’ll walk at a lower speed on the main computer for a longer period of time.

I tend to walk about 5-7 miles a day on this thing, usually in a block of time where I do that lighter-type work (Quickbooks, cobbling reports together, email triage, very light research/writing, reading, and so on).

Investment #3 – Bulletproof Coffee and the Bulletproof Diet

I’m a big fan of Joe Rogan in general, and I enjoy his podcast. I heard about Dave Asprey on the JRE podcast so I eventually ended up on his site, bulletproofexec.com. I purchased some private coaching with the relevant products and I was off to the races.

I did my own research on some of the stuff and came away confident that “good” fat had been erroneously hated on for years. I highly encourage you to conduct your own research based on your own personal situation, again this is not advice.

I really wanted to drop about 50 pounds so I went all in with Bulletproof Intermittent Fasting. A few quick points:

  • I felt great the entire time
  • In rare moments where I was hungry at night I just had a tablespoon of organic honey
  • I certainly felt a cognitive benefit
  • I was never hungry
  • I was much more patient with things 
  • I felt way more focused

So yeah, butter in the coffee and a mostly meat/veggie diet. I cheated from time to time, certainly over the holiday. I lost 38 pounds in slightly over 60 days. Here’s a before and after:

Fat Eric

Not So Fat Eric

I kept this post kind of short and to the point because my desire is not to argue or fight about whether carbs are good or bad, whether fat is good or bad, whether X is right, or whether Y is wrong. This is what worked for me and I was amazed by it, totally amazed by the outcome.

I also do things like cycling and martial arts, but I’ve been doing those for a while, along with running, and while I’ve lost weight I’ve never had it melt away like this.

I’ve stopped the fasting portion and none of the weight has piled back on. Lipid tests have been very positive as well, best in years.

Even if you don’t have a ton of weight to lose, seriously think about the standup desk and treadmill. 

Happy New Year!


Should Venture Backed Startups Engage in Spammy SEO?

Here’s a recent video of the founders of RapGenius talking at TechCrunch disrupt.

Oops, wrong video. Here’s the right one. Same difference.

Recently a thread on Hacker News highlighted a blog post which pointed out how RapGenius was engaging in reciprocal promotional arrangements: they would promote blogs on their Facebook or Twitter accounts if those bloggers posted a laundry list of keyword-rich deeplinks to RapGenius.

Matt Cutts quickly chimed in on Hacker News “we’re investigating this now.”

A friend of mine and I were chatting yesterday about what would happen. My prediction was that absolutely nothing would happen to RapGenius: they would issue a faux apology, put no effort into cleaning up the existing links, and the apology alone would be sufficient evidence of good faith that the issue would die there.

Today RapGenius published a mea culpa where ultimately they defended their own spam by complaining about how spammy other lyrics websites are. The self-serving jackasses went so far as including this in their post: “With limited tools (Open Site Explorer), we found some suspicious backlinks to some of our competitors”

It’s one thing to complain in private about dealing in a frustrating niche, but it’s another thing to publicly throw your direct competitors under the bus with a table of link types and paint them as being black hat spammers.

Google can’t afford to penalize Rap Genius, because if they do Google Ventures will lose deal flow on the start ups Google co-invests in.

In the past some of Google’s other investments were into companies that were pretty overtly spamming. RetailMeNot held multiple giveaways where if you embedded a spammy sidebar set of deeplinks to their various pages they gave you a free t-shirt:

Google’s behavior on such arrangements has usually been to hit the smaller players while looking the other way on the bigger site on the other end of the transaction.

That free-t-shirt-for-links post was from 2010 – the same year that Google invested in RetailMeNot. They ran those promotions multiple times & long enough that they ran out of t-shirts! Now that RetailMeNot is a publicly traded billion-dollar company which Google has already endorsed by investing in it, there’s a zero percent chance of them getting penalized.

To recap, if you are VC-backed you can: spam away, wait until you are outed, when outed reply with a combined “we didn’t know” and a “our competitors are spammers” deflective response.

For the sake of clarity, let’s compare that string of events (spam, warning but no penalty, no effort needed to clean up, insincere mea culpa) to how websites are treated when not VC-backed. For smaller sites it is “shoot on sight” first and ask questions later, perhaps coupled with a friendly recommendation to start over.

Here’s a post from today highlighting a quote from Google’s John Mueller:

My personal & direct recommendation here would be to treat this site as a learning experience from a technical point of view, and then to find something that you’re absolutely passionate & knowledgeable about and create a website for that instead.

Growth hack inbound content marketing, but just don’t call it SEO.

Growth hacking = using 2005-era spam tactics. http://t.co/5ISCPmMEkp cc @samfbiddle @nitashatiku

— Max Woolf (@minimaxir) December 23, 2013

What’s worse is that with the new fearmongering disavow promotional stuff, not only are some folks being penalized for the efforts of others, but some are being penalized for links that were in place BEFORE Google even launched as a company.

Google wants me to disavow links that existed back when backrub was foreplay and not an algo. Hubris much?

— Cygnus SEO (@CygnusSEO) December 21, 2013

Given that money allegedly shouldn’t impact rankings, it’s sad to note that as everything effective gets labeled as spam, capital and connections become the key SEO innovations in the current Google ecosystem.


Beware Of SEO Truthiness

When SEO started, many people routinely used black-box testing to try and figure out which pages the search engines rewarded.

Black box testing is terminology used in IT. It’s a style of testing that doesn’t assume knowledge of the internal workings of a machine or computer program. Rather, you can only test how the system responds to inputs.

So, for many years, SEO was about trying things out and watching how the search engine responded. If rankings went up, SEOs assumed correlation meant causation, so they did a lot more of whatever it was they thought was responsible for the boost. If the trick was repeatable, they could draw some firmer conclusions about causation, at least until the search engine introduced some new algorithmic code and sent everyone back to their black-box testing again.
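The loop described above can be sketched in code. Everything here is hypothetical: a toy “hidden” ranking function stands in for the search engine, and random noise stands in for algorithm churn and all the factors the tester can’t see.

```python
import random

random.seed(0)  # deterministic for this example

def hidden_ranking_engine(links, keyword_in_anchor):
    """Stand-in for the search engine: the tester can't see inside."""
    score = links * 2 + (10 if keyword_in_anchor else 0)
    return score + random.randint(-3, 3)  # noise: churn, unknown factors

def run_trials(links, keyword_in_anchor, trials=50):
    """Probe the black box repeatedly and average the observed response."""
    return sum(hidden_ranking_engine(links, keyword_in_anchor)
               for _ in range(trials)) / trials

baseline = run_trials(10, keyword_in_anchor=False)
with_anchor = run_trials(10, keyword_in_anchor=True)

# A repeatable difference across many trials is what lets the tester
# move from correlation toward causation -- until the hidden function
# changes underneath them.
print(with_anchor > baseline)  # prints: True
```

The key point the sketch makes is the “repeatable” part: a single trial could be noise, so only a difference that survives many probes supports a causal conclusion.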

Well, it sent some people back to testing. Some SEOs don’t do much, if any, testing of their own, and so rely on the strategies articulated by other people. As a result, the SEO echo chamber can be a pretty misleading place as “truthiness” – and a lot of false information – gets repeated far and wide until it’s considered gospel. One example of truthiness is that paid placement will hurt you. Well, it may do, but not having it may hurt you more, because it all really… depends.

Another problem is that SEO testing can seldom be conclusive, because you can’t be sure of the state of the thing you’re testing. The thing you’re testing may not be constant. For example, you throw up some more links, and your rankings rise, but the rise could be due to other factors, such as a new engagement algorithm that Google implemented in the middle of your testing that you just didn’t know about.

It used to be a lot easier to conduct this testing. Updates were periodic, and between them you could reasonably assume the algorithms were static, so cause and effect were more obvious than they are today. Danny Sullivan gave a good overview of search history at Moz earlier in the year:

That history shows why SEO testing is getting harder. There are a lot more variables to isolate than there used to be. The search engines have also been clever. A good way to thwart SEO black-box testing is to keep moving the target: continuously roll out code changes and don’t tell people you’re doing it. Or send people on a wild goose chase by arm-waving about a subtle code change made over here, when the real change has been made over there.

That’s the state of play in 2013.

However….(Ranting Time :)

Some SEO punditry is bordering on the ridiculous!

I’m not going to link to one particular article I’ve seen recently, as, ironically, that would mean rewarding them for spreading FUD. Also, calling out people isn’t really the point. Suffice to say, the advice was about specifics, such as how many links you can “safely” get from one type of site, that sort of thing….

The problem comes when we can easily find evidence to the contrary. In this case, a quick look through the SERPs and you’ll find evidence of top ranking sites that have more than X links from Site Type Y, so this suggests….what? Perhaps these sites are being “unsafe”, whatever that means. A lot of SEO punditry is well meaning, and often a rewording of Google’s official recommendations, but can lead people up the garden path if evidence in the wild suggests otherwise.

If one term defined SEO in 2013, it is surely “link paranoia”.

What’s Happening In The Wild

When it comes to what actually works, there are few hard and fast rules regarding links. Look at the backlink profiles for top ranked sites across various categories and you’ll see one thing that is constant….

Nothing is constant.

Some sites have links coming from obviously automated campaigns, and it seemingly doesn’t affect their rankings. Other sites have credible link patterns, and rank nowhere. What counts? What doesn’t? What other factors are in play? We can only really get a better picture by asking questions.

Google allegedly took out a few major link networks over the weekend. Anglo Rank came in for special mention from Matt Cutts.

So, why are Google making a point of taking out link networks if link networks don’t work? Well, it’s because link networks work. How do we know? Look at the backlink profiles in any SERP area where there is a lot of money to be made and the area isn’t overly corporate (i.e. not dominated by major brands), and it won’t be long before you spot aggressive link networks, and few “legitimate” links, in the backlink profiles.

Sure, you wouldn’t want aggressive link networks pointing at brand sites, as there are better approaches brand sites can take when it comes to digital marketing, but such evidence makes a mockery of the tips some people are freely handing out. Are such tips the result of conjecture, repeating Google’s recommendations, or actual testing in the wild? Either the link networks work, or they don’t work but don’t affect rankings, or these sites shouldn’t be ranking.

There’s a good reason some of those tips are free, I guess.

Risk Management

Really, it’s a question of risk.

Could these sites get hit eventually? Maybe. However, those using a “disposable domain” approach will do anything that works as far as linking goes, as their main risk is not being ranked. Being penalised is an occupational hazard, not game-over. These sites will continue so long as Google’s algorithmic treatment rewards them with higher ranking.

If your domain is crucial to your brand, then you might choose to stay away from SEO entirely, depending on how you define “SEO”. A lot of digital marketing isn’t really SEO in the traditional sense i.e. optimizing hard against an algorithm in order to gain higher rankings, a lot of digital marketing is based on optimization for people, treating SEO as a side benefit. There’s nothing wrong with this, of course, and it’s a great approach for many sites, and something we advocate. Most sites end up somewhere along that continuum, but no matter where you are on that scale, there’s always a marketing risk to be managed, with perhaps “non-performance” being a risk that is often glossed over.

So, if there’s a take-away, it’s this: check out what actually happens in the wild, and then evaluate your risk before emulating it. When pundits suggest a rule, check to see if you can spot times it appears to work, and perhaps more interestingly, when it doesn’t. It’s in those areas of personal inquiry and testing where gems of SEO insight are found.
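That kind of fact-check can be fairly mechanical. The sketch below is purely illustrative: the profiles are made-up stand-ins for backlink data you’d export from a link tool, and the threshold is a hypothetical pundit rule of the “no more than X links from Site Type Y” variety.

```python
# Hypothetical "safe" threshold claimed by a pundit.
RULE_MAX_DIRECTORY_LINKS = 50

# Made-up backlink profiles of sites observed ranking in the top 10.
top_ranked_profiles = {
    "site-a.example": {"directory": 120, "editorial": 30},
    "site-b.example": {"directory": 10,  "editorial": 400},
    "site-c.example": {"directory": 75,  "editorial": 5},
}

# Top-ranking sites that break the rule are counter-examples: either
# the rule is wrong, or "unsafe" links aren't being punished here.
violators = [site for site, links in top_ranked_profiles.items()
             if links["directory"] > RULE_MAX_DIRECTORY_LINKS]

print(len(violators))  # prints: 2
```

Two of three top-ranked sites break the “rule” and rank anyway, which is exactly the sort of evidence in the wild that should make you evaluate a tip before emulating or avoiding it.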

SEO has always been a mix of art and science. You can test, but only so far. The art part is dealing with the unknown past the testing point. Performing that art well is to know how to pick truthiness from reality.

And that takes experience.

But mainly a little fact checking :)


SEO Discussions That Need to Die

Sometimes the SEO industry feels like one huge Groundhog Day. No matter how many times you have discussions with people on the same old topics, these issues seem to pop back into blogs and social media streams with almost regular periodicity. And every time they do, only the authors are new; the arguments and the counterarguments are all the same.

Due to this sad situation, I have decided to make a short list of such issues and discussions. If it inspires even one of you to stay out of such a debate, it was worth writing.

So here are SEO’s most annoying discussion topics, in no particular order:

Blackhat vs. Whitehat

This topic has been chewed over and over again so many times, yet people still jump into it with both feet, righteously convinced that their argument, unlike anyone else’s, is going to change someone’s mind. This discussion becomes particularly tiresome when people start claiming the moral high ground for using one over the other. Let’s face it once and for all time: there are no generally moral (white) and generally immoral (black) SEO tactics.

This is where people usually pull out the argument about harming clients’ sites, an argument which is usually moot. Firstly, there is a heated debate about what is even considered whitehat and what blackhat. The definition of these two concepts is highly fluid and changes over time, and one of the main reasons for this fluidity is Google moving the goalposts all the time. What was once considered a purely whitehat technique, highly recommended by all the SEOs (PR submissions, directories, guest posts, etc.), may as of tomorrow become “blackhat”, “immoral” and what not. Also, some people consider “blackhat” anything that dares not adhere to the Google Webmaster Guidelines, as if they were carved on stone tablets by some angry deity.

Just to illustrate how absurd this concept is, imagine some other company, say eBay, creates a list of rules, one of which prohibits anyone who wants to sell an item on their site from also trying to sell it on Gumtree or Craigslist. How many of you would voluntarily reduce the number of people your product is effectively reaching because some other commercial entity is trying to prevent competition? If you are not making money off search, Google is, and vice versa.

It is not about morals, and it is not about criminal negligence of your clients. It is about taking risks, and as long as you are being truthful with your clients and yourself, and aware of all the risks involved in undertaking this or some other activity, no one has the right to pontificate about the “morality” of a competing marketing strategy. If it is not for you, don’t do it, but you can’t both decide that the risk is too high for you and pseudo-criminalize those who are willing to take that risk.

The same goes for “blackhatters” pointing and laughing at “whitehatters”. Some people do not enjoy rebuilding their business every 2 million comment-spam links. That is OK. Maybe they will not climb the ranks as fast as your sites do, but maybe when they get there, they will stay there longer? These are two different and completely legitimate strategies. Actually, every ecosystem has representatives of these two strategies: the “r strategy” prefers quantity over quality, while the “K strategy” puts more investment in a smaller number of offspring.

You don’t see elephants calling mice immoral, do you?

Rank Checking is Useless/Wrong/Misleading

This one has been going around for years and keeps raising its ugly head every once in a while, particularly after Google forces another SaaS provider to give up part of its service because of either checking rankings themselves or buying ranking data from a third-party provider. Then we get all the holier-than-thou folks mounting their soapboxes and preaching fire and brimstone on SEOs who report rankings as the main, or even only, KPI. So firstly, again, just like with blackhat vs. whitehat: horses for courses. If you think your way of reporting to clients is the best, stick with it, preach it positively, as in “this is what I do and the clients like it”, but stop telling other people what to do!

More importantly, the vast majority of these arguments are based on a totally imaginary situation in which SEOs use rankings as their only or main KPI. In all of my 12 years in SEO, I have never seen a marketer worth their salt report “an increase in rankings for 1000s of keywords”. As far back as 2002, I remember people writing client reports with a separate chapter for keywords which had been defined as optimization targets and reached top rankings, but achieved no significant increase in traffic or conversions. Those keywords were then dropped from the marketing plan altogether.

It really isn’t a big leap to understand that ranking isn’t important if it doesn’t result in increased conversions in the end. I am not going to argue here why I do think reporting and monitoring rankings is important. The point is that if you need to make your argument against a straw man, you should probably rethink whether you have a good argument at all.

PageRank is Dead/it Doesn’t Matter

Another strawman argument. Show me a linkbuilder who today thinks that getting links based solely on toolbar PageRank is going to get them to rank, and I will show you a guy who has probably not engaged in active SEO since 2002. And no small amount of irony can be found in the fact that the same people who decry the use of PageRank, the closest thing to an actual Google ranking factor they can see, freely use proprietary metrics created by other marketing companies and treat them as a perfectly reliable proxy for esoteric concepts which even Google finds hard to define, such as relevance and authority. Furthermore, all other things being equal, show me the SEO who will pass on a PR6 link for the sake of a PR3 one.

Blogging on “How Does XXX Google Update Change Your SEO” – 5 Seconds After it is Announced

Matt hasn’t even turned off his video camera to switch his t-shirt for the next Webmaster Central video and there are already dozens of blog posts discussing, in the most intricate detail, how the new algorithm update/penalty/infrastructure change/random-monochromatic-animal will impact everyone’s daily routine and how we should all run for the hills.

Best-case scenario, these prolific writers know only the name of the update, yet they are already suggesting strategies on how to avoid being slapped or, even better, get out of the doghouse. This was painfully obvious in the early days of Panda, when people were writing up their “experiences” of recovering from the algorithm update even before the second update had rolled out, making any testimony of recovery, in the worst case, a lie or (given a massive benefit of the doubt) a misinterpretation of ranking changes (rank checking, anyone?).

Put down your feather and your ink bottle, skippy; wait for the dust to settle, and unless you have a human source who was involved in the development or implementation of the algorithm, just sit tight and observe for the first week or two. After that you can write up those observations and it will be considered legitimate, even interesting, reporting on the new algorithm. Anything earlier than that will paint you as a clueless pageview chaser, looking to ride the wave of interest with blog posts that often close with “we will probably not even know what the XXX update is all about until we give it some time to get implemented”. Captain Obvious to the rescue.

Adwords Can Help Your Organic Rankings

This one is like the mythological Hydra – you cut one head off, and two new ones spring out. This question has been answered so many times by so many people, both from within the search engines and from the SEO community, that if you are addressing it today, I suspect you are actually trying to refrain from talking about something else and are using this topic as a smoke screen. Yes, I am looking at you, Google Webmaster Central videos. Is that *really* the most interesting question you found in your pile? What, no one asked about &lt;not provided&gt;, or about social signals, or about the role authorship plays in non-personalized rankings, or whether it flows through links, or a million other questions that are much more relevant, interesting and, more importantly, still unanswered?

Infographics/Directories/Commenting/Forum Profile Links Don’t Work

This is very similar to the blackhat/whitehat argument, and it is usually supported by a statement that looks something like “do you think that Google, with hundreds of PhDs, hasn’t already discounted that in their algorithm?”. This is a typical argument from incredulity, made by a person who glorifies postgraduate degrees as a litmus test of intelligence and ingenuity. My claim is that these people have neither looked at the backlink profiles of many sites in many competitive niches, nor do they know many people doing or holding a PhD. They highly underrate the former and overrate the latter.

A link is a link is a link and the only difference is between link profiles and percentages that each type of link occupies in a specific link profile. Funnily enough, the same people who claim that X type of links don’t work are the same people who will ask for link removal from totally legitimate, authoritative sources who gave them a totally organic, earned link. Go figure.

“But Matt/John/Moultano/anyone-with a brother in law who has once visited Mountain View” said…

Hello. Did you order “not provided will be a maximum of 10% of your referral data”? Or did you have “I would be surprised if there was a PR update this year”? How about “You should never use nofollow on-site links that you don’t want crawled. But it won’t hurt you. Unless something.”?

People keep imagining that Googlers sit around all day long thinking about how they can help SEOs do their job. How can you build your business on advice given out by an entity that is actively trying to keep visitors from coming to your site? Can you imagine that happening in any other business environment? Can you imagine Nike’s marketing department going for a one-day training session at Adidas HQ, to help them sell their sneakers better?

Repeat after me THEY ARE NOT YOUR FRIENDS. Use your own head. Even better, use your own experience. Test. Believe your own eyes.

We Didn’t Need Keyword Data Anyway

This is my absolute favourite. People who were as of yesterday basing their reporting, link building, landing page optimization, ranking reports, conversion rate optimization and about every other aspect of their online campaigns on referring keywords all of a sudden feel the need to tell the world how they never thought keywords were an important metric. That’s right, buster: we are so much better off flying blind, doing iteration upon iteration of a derivation of data based on past trends, future trends, landing pages, third-party data, etc.

It is OK every once in a while to say “crap, Google has really shafted us with this one; this is seriously going to affect the way I track progress”. Nothing bad will happen if you do. You will not lose face over it. Yes, there were other metrics that were ALSO useful for different aspects of SEO, but it is not as if, when you’re driving a car and your brakes die on you, you say “pfffftt, stopping is for losers anyway; who wants to stop the car when you can enjoy the ride? I never really used those brakes in the past anyway. What really matters in a car is that your headlights are working”.

Does this mean we can’t do SEO anymore? Of course not. Adaptability is one of the top required traits of an SEO and we will adapt to this situation as we did to all the others in the past. But don’t bullshit yourself and everyone else that 100% <not provided> didn’t hurt you.

Responses to SEO is Dead Stories

It is crystal clear why the “SEO is dead” stories themselves deserve to die a slow and painful death. I am talking here about the hordes of SEOs who rise to the occasion every freaking time some fifth-rate journalist decides to poke the SEO industry through the cage bars, and who set out to convince them, nay, prove to them, that SEO is not only not dying but is alive and kicking and bigger than ever. And I am not innocent of this myself; I too have dignified this idiotic topic with a response (albeit a short one). But how many times can we rise to the same occasion and repeat the same points? What original angle can you give this story after 16 years of responding to the same old claims? And if you can’t give an original angle, how in the world are you increasing our collective knowledge by rewarming and serving the same old dish that wasn’t very good the first time it was served? Don’t you have rankings to check instead?

There is No #10.

But that’s what everyone does: writes a “Top 10 ways…” article, where they force the examples until they get to a linkbaity number. No one wants to read a “Top 13…” or a “Top 23…” article. This needs to die too. Write what you have to say, not what you think will get the most traction. Marketing is makeup, but the face needs to be pretty before you apply it. Unless you like putting lipstick on pigs.


Branko Rihtman has been optimizing sites for search engines since 2001, for clients and his own web properties in a variety of competitive niches. Over that time, Branko realized the importance of properly done research and experimentation and started publishing findings and experiments at SEO Scientist, with some additional updates at @neyne. He currently consults a number of international clients, helping them improve their organic traffic and conversions while questioning old approaches to SEO and trying some new ones.


Value Based SEO Strategy

One approach to search marketing is to treat the search traffic as a side-effect of a digital marketing strategy. I’m sure Google would love SEOs to think this way, although possibly not when it comes to PPC! Even if you’re taking a more direct, rankings-driven approach, the engagement and relevancy scores that come from delivering what the customer values should serve you well, too.

In this article, we’ll look at a content strategy based on value based marketing. Many of these concepts may be familiar, but bundled together, they provide an alternative search provider model to one based on technical quick fixes and rank. If you want to broaden the value of your SEO offering beyond that first click, and get a few ideas on talking about value, then this post is for you.

In any case, the days of being able to rank well without providing value beyond the click are numbered. Search is becoming more about providing meaning to visitors and less about providing keyword relevance to search engines.

What Is Value Based Marketing?

Value based marketing is customer, as opposed to search engine, centric. In Values Based Marketing For Bottom Line Success, the authors focus on five areas:

  • Discover and quantify your customers’ wants and needs
  • Commit to the most important things that will impact your customers
  • Create customer value that is meaningful and understandable
  • Assess how you did at creating true customer value
  • Improve your value package to keep your customers coming back

Customers compare your offer against those of competitors, and divide the benefits by the cost to arrive at value. Marketing determines and communicates that value.
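That benefits-over-cost comparison can be written down directly. The numbers below are invented purely to illustrate the arithmetic, using the gym example that follows.

```python
# Perceived value = perceived benefits / total cost (illustrative units:
# "benefits" might be a survey score, "cost" a monthly fee in dollars).
offers = {
    "our_gym":        {"benefits": 90, "cost": 150},
    "competitor_gym": {"benefits": 60, "cost": 120},
}

value = {name: o["benefits"] / o["cost"] for name, o in offers.items()}

# 90/150 = 0.6 beats 60/120 = 0.5: a pricier offer can still win on
# perceived value if the benefits rise faster than the cost.
print(value["our_gym"] > value["competitor_gym"])  # prints: True
```

The point of the sketch is that value is a ratio, not a price: marketing can raise it either by increasing perceived benefits or by lowering perceived cost.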

This is the step beyond keyword matching. When we use keyword matching, we’re trying to determine intent. We’re doing a little demographic breakdown. This next step is to find out what the customer values. If we give the customer what they value, they’re more likely to engage and less likely to click back.

What Does The Customer Value?

A key question of marketing is “which customers does this business serve”? Seems like an obvious question, but it can be difficult to answer. Does a gym serve people who want to get fit? Yes, but then all gyms do that, so how would they be differentiated?

Obviously, a gym serves people who live in a certain area. So, if our gym is in Manhattan, our customer becomes “someone who wants to get fit in Manhattan”. Perhaps our gym is upmarket and expensive. So, our customer becomes “people who want to get fit in Manhattan and be pampered and are prepared to pay more for it”. And so on, and so on. They’re really questions and statements about the value proposition as perceived by the customer, and then delivered by the business.

So, value based marketing is about delivering value to a customer. This syncs with Google’s proclaimed goal in search, which is to put users first by delivering results they deem to have value, and not just pages that match a keyword term. Keywords need to be seen in a wider context, and that context is pretty difficult to establish if you’re standing outside the search engine looking in, so thinking in terms of concepts related to the value proposition might be a good way to go.

Value Based SEO Strategy

The common SEO approach, for many years, has started with keywords. It should start with customers and the business.

The first question is “who is the target market” and then ask what they value.

Relate what they value to the business. What is the value proposition of the business? Is it aligned? What would make a customer value this business offering over those of competitors? It might be price. It might be convenience. It’s probably a mix of various things, but be sure to nail down the specific value propositions.

Then think of some customer questions around these value propositions. What would be the likely customer objections to buying this product? What would be points that need clarifying? How does this offer differ from other similar offers? What is better about this product or service? What are the perceived problems in this industry? What are the perceived problems with this product or service? What is difficult or confusing about it? What could go wrong with it? What risks are involved? What aspects have turned off previous customers? What complaints did they make?

Make a list of such questions. These are your article topics.

You can glean this information by either interviewing customers or the business owner. Each of these questions, and accompanying answer, becomes an article topic on your site, although not necessarily in Q&A format. The idea is to create a list of topics as a basis for articles that address specific points, and objections, relating to the value proposition.
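The topic-list step above can be sketched as a simple cross of value propositions with question templates. All the strings here are hypothetical examples, not a prescribed taxonomy.

```python
# Value propositions nailed down with the business owner (examples).
value_props = ["return on spend", "after-sale support", "reliability"]

# Customer-objection question templates (examples drawn from the
# questions listed above).
question_templates = [
    "What could go wrong with {}?",
    "How does our {} compare with competitors'?",
    "What have previous customers said about {}?",
]

# Each proposition x template pair becomes a candidate article topic.
article_topics = [template.format(prop)
                  for prop in value_props
                  for template in question_templates]

print(len(article_topics))  # prints: 9
```

Even three propositions and three templates yield nine candidate articles, each anchored to a specific objection rather than to a keyword.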

For example, buying SEO services is a risk. Customers want to know if the money they spend is going to give them a return. So, a valuable article might be a case study on how the company provided return on spend in the past, and the process by which it will achieve similar results in future. Another example might be a buyer concerned about the reliability of a make of car. A page dedicated to reliability comparisons, and another page outlining the customer care after-sale plan would provide value. Note how these articles aren’t keyword driven, but value driven.

Ever come across a FAQ that isn’t really a FAQ? Dreamed-up questions? They’re frustrating, and of little value if the information doesn’t directly relate to the value we seek. Information should be relevant and specific so when people land on the site, there’s more chance they will perceive value, at least in terms of addressing the questions already on their mind.

Compare this approach with generic copy around a keyword term. A page talking about “SEO” in response to the keyword term “SEO” might closely match a keyword term, so that’s a relevance match, but unless it’s tied into providing a customer the value they seek, it’s probably not of much use. Finding relevance matches is no longer a problem for users. Finding value matches often is. Even if you’re keyword focused, adding these articles provides semantic variation that may capture keyword searches that aren’t appearing in keyword tools.

Keyword relevance was a strategy devised at a time when information was less readily available and search engines weren’t as powerful. Finding something relevant was more hit and miss than it is today. These days, there are likely thousands, if not millions, of pages that will meet relevance criteria in terms of keyword matching, so the next step is to meet value criteria. Providing value is less likely to earn a click back and more likely to create engagement than mere on-topic matching.

The Value Chain

Deliver value. Once people perceive value, then we have to deliver it. Marketing, and SEO in particular, used to be about getting people over the threshold. Today, businesses have to work harder to differentiate themselves and a sound way of doing this is to deliver on promises made.

So the value is in the experience. Why do we return to Amazon? It’s likely due to the end-to-end experience in terms of delivering value. Any online e-commerce store can deliver relevance. Where competition is fierce, Google is selective.

In the long term, delivering value should drive down the cost of marketing as the site is more likely to enjoy repeat custom. As Google pushes more and more results beneath the fold, the cost of acquisition is increasing, so we need to treat each click like gold.

Monitor value. Does the firm keep delivering value? To the same level? Because people talk. They talk on Twitter and Facebook and the rest. We want them talking in a good way, but even if they talk in a negative way, it can still be useful. Their complaints can be used as topics for articles. They can be used to monitor value, refine the offer and correct problems as they arise. Those social signals, whilst not a guaranteed ranking boost, are still signals. We need to adopt strategies whereby we listen to all the signals, so as to better understand our customers, in order to provide more value, and hopefully enjoy a search traffic boost as a welcome side-effect, so long as Google is also trying to determine what users value.

Not sounding like SEO? Well, it’s not optimizing for search engines, but for people. If Google is to provide value, then it needs to ensure results are not just relevant, but offer genuine value to end users. Do Google do this? In many cases, not yet, but all their rhetoric and technical changes suggest that providing value is at the ideological heart of what they do. So the search results will most likely, in time, reflect the value people seek, and not just relevance.

In technical terms, this provides some interesting further reading:

Today, signals such as keyword co-occurrence, user behavior, and previous searches do in fact inform context around search queries, which impact the SERP landscape. Note I didn’t say the signals “impact rankings,” even though rank changes can, in some cases, be involved. That’s because there’s a difference. Google can make a change to the SERP landscape to impact 90 percent of queries and not actually cause any noticeable impact on rankings.

The way to get the context right, and get positive user behaviour signals, and align with their previous searches, is to first understand what people value.


Creating an Experience for Your Product

In a recent post I talked about the benefits of productizing your business model along with some functional ways to achieve productization.

A product, in and of itself, is really only half of what you are selling to your clients. The other half of the equation is the “experience”.

It sounds a bit “fluffy”, but in my career as a service provider, and in my purchasing history as a consumer, the experience matters. I would even go so far as to say that in some very noticeable cases the experience can outweigh the product itself (to some extent, anyway).

These halves, the product and the experience, can cut both ways.

Sometimes a product is so good that the experience can be average or even below average and the provider will still make out fine. Sometimes the experience is so fantastic that an otherwise average or above-average product is elevated to what can be priced as a premium product or service.

Let’s get a few obvious variables out of the way first. It is understood that:

  1. Experience matters more to some people than others
  2. Experience matters more in certain industries than others
  3. The actual product matters more to some
  4. The actual product matters more in some industries

If we stipulate that the 4 scenarios mentioned above are true, which they are, it still doesn’t change the basic premise that you are probably leaving revenue and growth on the table if you settle on one side or the other.

While it’s true that you can be successful even if your product to experience ratio is like a seesaw heavily weighted in one direction over the other, it is also true that you would probably be more successful if you made both the best each could be.

Defining Where Product Meets Experience

I’ll lay out a couple of examples here to help illustrate the point:

  • The “Big Four” in the link research tools space; Ahrefs, Link Research Tools, Majestic, and Open Site Explorer
  • The two more well-known “tool/reporting suites” Raven and Moz outside of much more expensive enterprise toolkits

In my experience Ahrefs has been the best combination of product and experience, especially lately. Their dataset continues to grow and recent UI changes have made it even easier to use. Exports are super fast and I’ve had quick and useful interactions with their support staff. Perhaps it isn’t a coincidence that, from groups of folks I interact with and follow online, Ahrefs continues to pop up more often in conversation than not.

To me, Majestic and Link Research Tools are examples of where the product is really, really strong (copious amounts of data across many segments) but the UI/UX is not quite as good as the others. I realize some of this is subjective but in other comparisons online this seems to be a prevailing theme.

Open Site Explorer has a fantastic UI/UX, but the data can be a bit behind the others, and getting data out (exporting) is a bit more of a chore than point, click, download. It seems like over a period of time OSE has had a rougher road to data and update growth than the other tools I mentioned.

In the case of the two more popular reporting and research suites, Moz and Raven, Raven has really caught up with (if not surpassed) Moz in terms of UI/UX. Raven pulls in data from multiple sources, including Moz, and has quite a few more (and easier to get to and cross-reference) features than Moz.

Moz may not be interested in getting into some of the other pieces of the online marketing puzzle that Raven is into but I think it’s still a valid comparison based on the very similar, basic purpose of each tool suite.

Assessing Your Current Position

When assessing or reassessing your products and offerings, a lot of it goes back to targeting the right market.

  • Is the market big enough to warrant investment into a product?
  • How many different segments of a given market do you need to appeal to?
  • Where’s the balance between feature bloat (think Zoho CRM) versus “good enough” functionality with an eye towards an incredible UX (think Highrise CRM)?

If the market isn’t big enough and you have to go outside your initial target, how will that affect the balance between the functionality of your product and the experience for your users, customers, or clients?

If you are providing SEO services your “functionality” might be how easy it is to determine the reports you provide and their relationship(s) to a client’s profitability or goals (or both). Your “experience” is likely a combination of things:

  • The graphical presentation of your documents
  • The language used in your reports and other interactions with the client
  • The consistency of your “brand” across the web
  • The consistency of your brand presentation (website, invoices, reports, etc)
  • Client ability to access reports and information quickly without having to ask you for it
  • Consistency of your information delivery (are you always on-time, late, or erratic with due dates, meetings, etc)

When you break down what you think is your “product” and “experience”, you’ll likely find that it is pretty simple to develop a plan to improve both, rather than repeating the vague “let’s do great things” company line that no one really understands but just nods at.

Example of Experience in Action

In just about every Consumer Reports survey Apple comes out on top for customer satisfaction. Apple, whether you like their products/”culture” or not, creates a fairly reliable, if not expensive, end to end experience. This is doubly true if you live near an Apple store.

If you look at laptop failure rates Apple is generally in the middle of the pack. There are other things that go into the Apple experience (using the OS and such) but part of the reason people are willing to pay that premium is due to their support options and ability to fix bugs fairly quickly.

To tie this into our industry, I think Moz is a good parallel example here. Their design is generally heralded as being quite pleasant and it’s pretty easy to use their tools; there isn’t a steep learning curve to using most of their products.

I think their product presentation is top notch, even though I generally prefer some of their competitors’ products. They are pretty active on social media and their support is generally very good.

So, in the case of Moz it’s pretty clear that people are willing to pay for less robust data or at least less features and options partly (or wholly) due to their product experience and product presentation.

Redesigning Your Experience

You might already have some of these but it’s worthwhile to revisit a very basic style guide (excluding audience development):

  • Consistent logo and colors
  • Fonts
  • Vocabulary and Language Style (the tone of your brand, is it My Brand or MyBrand or myBrand, etc)

Some Additional Resources

Here are some tools that I have found helpful during my own redefining process, and that you might want to use to help with yours:

  • Running copy through Word for readability scores (Office 2013)
  • A Windows tool that can help improve your writing (StyleWriter)
  • A Mac tool to help with graphics and charts (OmniGraffle)
  • A Windows tool to help with charts and graphics (SmartDraw)
  • A cloud-based presentation tool that helps the less artistically inclined, like me (Prezi)
  • Online proposal software (Proposable)
  • A text expander for Mac; comes in handy for consistent “messaging” (TextExpander)
  • A Windows alternative that syncs with TextExpander (Breevy)

Historical Revisionism

A stopped clock is right two times a day.

There’s some amusing historical revisionism going on in the SEO punditry world right now, which got me thinking about the history of SEO. I’d like to talk about some common themes of this historical revision, which goes along the lines of “what I predicted all those years ago came true – what a visionary I am!”. No naming names, as I don’t mean this to be anything personal – the same theme has popped up in a number of places – just making some observations :)

See if you agree….

Divided We Fall

The SEO world has never been united. There are no industry standards and qualifications like you’d find in the professions, such as doctor, lawyer, or builder. If you say you’re an SEO, then you’re an SEO.

Part of the reason for the lack of industry standard is that the search engines never came to the party. Sure, they talked at conferences, and still do. They offered webmasters helpful guidelines. They participated in search engine discussion forums. But this was mainly to do with risk management. Keep your friends close, and your enemies closer.

In all these years, you won’t find one example of a representative from a major search engine saying “Hey, let’s all get together and form an SEO standard. It will help promote and legitimize the industry!”.

No, it has always been decrees from on high. “Don’t do this, don’t do that, and here are some things we’d like you to do”. Webmasters don’t get a say in it. They either do what the search engines say, or they go against them, but make no mistake, there was never any partnership, and the search engines didn’t seek one.

This didn’t stop some SEOs seeing it as a form of quasi-partnership, however.

Hey Partner

Some SEOs chose to align themselves with search engines and do their bidding. If the search engine reps said “do this”, they did it. If the search engines said “don’t do this”, they’d wrap themselves up in convoluted rhetorical knots pretending not to do it. This still goes on, of course.

In the early 2000s, it turned, curiously, into a question of morality. There was “Ethical SEO”, although quite what it had to do with ethics remains unclear. Really, it was another way of saying “someone who follows the SEO guidelines”, presuming that whatever the search engines decree must be ethical, objectively good, and have nothing to do with self-interest. It’s strange how people kid themselves, sometimes.

What was even funnier was the search engine guidelines were kept deliberately vague and open to interpretation, which, of course, led to a lot of heated debate. Some people were “good” and some people were “bad”, even though the distinction was never clear. Sometimes it came down to where on the page someone puts a link. Or how many times someone repeats a keyword. And in what color.

It got funnier still when the search engines moved the goal posts, as they are prone to do. What was previously good – using ten keywords per page – suddenly became the height of evil, but using three was “good” and so all the arguments about who was good and who wasn’t could start afresh. It was the pot calling the kettle black, and I’m sure the search engines delighted in having the enemy warring amongst themselves over such trivial concerns. As far as the search engines were concerned, none of them were desirable, unless they became paying customers, or led paying customers to their door. Or perhaps that curious Google+ business.

It’s hard to keep up, sometimes.

Playing By The Rules

There’s nothing wrong with playing by the rules. It would have been nice to think there was a partnership, and so long as you followed the guidelines, high rankings would naturally follow, the bad actors would be relegated, and everyone would be happy.

But this has always been a fiction. A distortion of the environment SEOs were actually operating in.

Jason Calacanis, never one to miss an opportunity for controversy, fired some heat seekers at Google during his WebmasterWorld keynote address recently…..

Calacanis proceeded to describe Cutts and Google in terms like, “liar,” “evil,” and “a bad partner.” He cautioned the PubCon audience to not trust Google, and said they cooperate with partners until they learn the business and find a way to pick off the profits for themselves. The rant lasted a good five minutes….

He accused Google of doing many of the things SEOs are familiar with, like making abrupt algorithm changes without warning. They don’t consult, they just do it, and if people’s businesses get trashed as a result, then that’s just too bad. Now, if that’s a sting for someone who is already reasonably wealthy and successful like Calacanis, just imagine what it feels like for the much smaller web players who are just trying to make a living.

The search business is not a pleasant environment where all players have an input, and then standards, terms and play are generally agreed upon. It’s war. It’s characterized by a massive imbalance of power and wealth, and one party will use it to crush those who it determines stand in its way.

Of course, the ever pleasant Matt Cutts informs us it’s all about the users, and that’s a fair enough spin of the matter, too. There was, and is, a lot of junk in the SERPs, and Mahalo was not a partner of Google, so any expectation they’d have a say in what Google does is unfounded.

The take-away is that Google will set rules that work for Google, and if they happen to work for the webmaster community too, well that’s good, but only a fool would rely on it. Google care about their bottom line and their projects, not ours. If someone goes out of business due to Google’s behaviour, then that’s of no concern. Personally, I think the big technology companies do have a responsibility beyond themselves to society, because the amount of power they are now centralising means they’re not just any old company anymore, but great vortexes that can distort entire markets. For more on this idea, and where it’s all going, check out my review of “Who Owns The Future” by Jaron Lanier.

So, if you see SEO as a matter of playing by their rules, then fine, but keep in mind “those who can give you everything can also take everything away”. Those rules weren’t designed for your benefit.

Opportunity Cost

There was a massive opportunity cost by following so called ethical SEO during the 2000s.

For a long time, it was relatively easy to get high rankings by being grey. And if you got torched, you probably had many other sites with different link patterns good to go. This was against the webmaster guidelines, but given marketing could be characterized as war, one does not let the enemy define one’s tactics. Some SEOs made millions doing it. Meanwhile, a lot of content-driven sites disappeared. That was, perhaps, my own “a stopped clock is right two times a day” moment. It’s not like I’m going to point you to all the stuff I’ve been wrong about, now is it :)

These days, a lot of SEO is about content and how that content is marketed, but more specifically it’s about the stature of the site on which that content appears. That’s the bit some pundits tend to gloss over. You can have great content, but that’s no guarantee of anything. You will likely remain invisible. However, put that exact same content on a Fortune 500 site, and that content will likely prosper. Ah, the rich get richer.

So, we can say SEO is about content, but that’s only half the picture. If you’re a small player, the content needs to appear in the right place, be very tightly targeted to your audience’s needs so they don’t click back, and it should be pushed through various media channels.

Content, even from many of these “ethical SEOs”, used to be created for search engines in the hope of netting as many visitors as possible. These days, it’s probably a better strategy to get inside the audience’s heads and target it to their specific needs, as opposed to a keyword, then get that content out to wherever your audience happens to be. Unless, of course, you’re Fortune 500 or otherwise well connected, in which case you can just publish whatever you like and it will probably do well.

Fair? Not really, but no one ever said this game was fair.

Whatever Next?

Do I know what’s going to happen next? In ten years time? Nope. I could make a few guesses, and like many other pundits, some guesses will prove right, and some will be wrong, but that’s the nature of the future. It will soon make fools of us all.

Having said that, will you take a punt and tell us what you think will be the future of SEO? Does it have one? What will look like? If you’re right, then you can point back here in a few years time and say “Look, I told you so!”.

If you’re wrong, well, there’s always historical revisionism :)


Optimizing The SEO Model

SEO has always been focused on acquisition.

The marketing strategy, based on high rankings against keyword terms, is about gaining a steady flow of new visitors. If a site ranks better than competing sites, this steady stream of new visitors will advantage the top sites to the disadvantage of those sites beneath it.

The selling point of SEO is a strong one. The client gets a constant flow of new visitors and enjoys competitive advantage, just so long as they maintain rank.

A close partner of SEO is PPC. Like SEO, PPC delivers a stream of new visitors, and if you bid well, and have relevant advertisements, then you enjoy a competitive advantage. Unlike PPC, SEO does not cost per click, or, to be more accurate, it should cost a lot less per click once the SEO’s fees are taken into account, so SEO has enjoyed a stronger selling point. Also, the organic search results typically have a higher level of trust from search engine users.

“91% prefer using natural search results when looking to buy a product or service online”. [Source: Tamar Search Attitudes Report, Tamar, July 2010]

Rain On The Parade

Either by coincidence or design, Google’s algorithm shifts have made SEO less of a sure proposition.

If you rank well, the upside is still there, but because the result is less certain than it used to be, and the work more involved than ever, the risk, and costs in general, have increased. The more risky SEO becomes in terms of getting results, the more Adwords looks attractive, as at least results are assured, so long as spend is sufficient.

Adwords is a brilliant system. For Google. It’s also a brilliant system for those advertisers who can find a niche that doesn’t suffer high levels of competition. The trouble is competition levels are typically high.

Because competition is high, and Adwords is an auction model, bid prices must rise. As bid prices rise, only those companies that can achieve ROI at high costs per click will be left bidding. The higher their ROI, the higher the bid prices can conceivably go. Their competitors, if they are to keep up, will do likewise.

So, the PPC advertiser focused on customer acquisition as a means of growing the company will be passing more and more of their profits to Google in the form of higher and higher click prices. If a company wants to grow by customer acquisition, via the search channel, then they’ll face higher and higher costs. It can be difficult to maintain ROI via PPC over time, which is why SEO is appealing. It’s little wonder Google has their guns pointed at SEO.

A fundamental problem with Adwords, and SEO in general, is that basing marketing success around customer acquisition alone is a poor long term strategy.

More on that point soon….

White-Hat SEO Is Dead

It’s surprising a term such as “white hat SEO” was ever taken seriously.

Any attempt to game a search engine’s algorithm is, as far as the search engine is concerned, going to be frowned upon. What is gaming, if not reverse engineering the search engine’s ranking criteria and looking to gain a higher rank than a site would otherwise merit? Acquiring links and writing keyword-focused articles for the purpose of gaining a higher rank in a search engine is an attempt at rank manipulation. The only thing that varies is the degree.

Not that there’s anything wrong with that, as far as marketers are concerned.

The search marketing industry line has been that so long as you avoided “bad behaviour”, your site stood a high chance of ranking well. Ask people for links. Find keywords with traffic. Publish pages focused on those topics. There used to be more certainty of outcome.

If the outcome is not assured, then so long as a site is crawlable, why would you need an SEO? You just need to publish and see where Google ranks you. Unless the SEO is manipulating rank, then where is the value proposition over and above simply publishing crawlable content? Really, SEO is a polite way of saying “gaming the system”.

Those who let themselves be defined by Google can now be seen scrambling to redefine themselves. “Inbound marketers” is one term being used a lot. There’s nothing wrong with this, of course, although you’d be hard pressed to call it Search Engine Optimization. It’s PR. It’s marketing. It’s content production. The side effect of such activity might be a high ranking in the search engines (wink, wink). It’s like Fight Club. The first rule of Fight Club is……

A few years back, we predicted that the last SEOs standing would be blackhat, and that’s turned out to be true. The term SEO has been successfully co-opted and marginalized. You can still successfully game the system with disposable domains, by aggressively targeting keywords, and by buying lots of links and/or building link networks, but there’s no way that’s compliant with Google’s definitions of acceptable use. It would be very difficult to sell that to a client without full disclosure. Even with full disclosure, I’m sure it’s a hard sell.

But I digress….

Optimization In The New Environment

The blackhats will continue on as usual. They never took direction from search engines, anyway.

Many SEOs are looking to blend a number of initiatives together to take the emphasis off search. Some call it inbound. In practice, it blends marketing, content production and PR. It’s a lot less about algo hacking.

For it to work well, and to get great results in search, the SEO model needs to be turned on its head. It’s still about getting people to a site, but because the cost of getting people to a site has increased, every visitor must count. For this channel to maintain value, then more focus will go on what happens after the click.

If the offer is not right, and the path to that offer isn’t right, then it’s like having people turn up for a concert when the band hasn’t rehearsed. At the point the audience turns up, the band must deliver what the audience wants, or the audience isn’t coming back. The band’s popularity will quickly fade.

This didn’t really matter too much in the past when it was relatively cheap to position in the SERPs. If you received a lot of slightly off-topic traffic, big deal, it’s not like it cost anything. Or much. These days, because it’s growing ever more costly to position, we’re increasingly challenged by the “growth by acquisition” problem.

Consider optimizing in two areas, if you haven’t already.

1. Offer Optimization

We know that if searchers don’t find what they want, they click back. The click back presents two problems. First, you just wasted time and money getting that visitor to your site. Second, it’s likely that Google is measuring click-backs in order to help determine relevancy.

How do you know if your offer is relevant to users?

The time-tested way is to examine the 4 Ps: product, price, promotion, and place. Place doesn’t matter so much, as we’re talking about the internet, although if you’ve got some local-centric product or service, then it’s a good idea to focus on it. Promotion is what SEOs do. They get people over the threshold.

However, two areas worth paying attention to are product and price. In order to optimize product, we need to ask some fundamental questions:

  • Does the customer want this product or service?
  • What needs does it satisfy? Is this obvious within a few seconds of viewing the page?
  • What features does it have to meet these needs? Are these explained?
  • Are there any features you’ve missed out? Have you explained all the features that meet the need?
  • Are you including costly features that the customer won’t actually use?
  • How and where will the customer use it?
  • What does it look like? How will customers experience it?
  • What size(s), color(s) should it be?
  • What is it to be called?
  • How is it branded?
  • How is it differentiated versus your competitors?
  • What is the most it can cost to provide, and still be sold sufficiently profitably?

SEOs are only going to have so much control over these aspects, especially if they’re working for a client. However, it still pays to ask these questions, regardless. If the client can’t answer them, then you may be dealing with a client who has no strategic advantage over competitors. They are likely running a me-too site. Such sites are difficult to position from scratch.

Even older sites that were at one point highly differentiated have slid into an unprofitable me-too status as large sites like Amazon & eBay offer a catalog which grows deeper by the day.

Unless you’re pretty aggressive, taking on me-too sites will make your life difficult in terms of SEO, so thinking about strategic advantage can be a good way to screen clients. If they have no underlying business advantage, ask yourself if you really want to be doing SEO for these people?

In terms of price:

  • What is the value of the product or service to the buyer?
  • Are there established price points for products or services in this area?
  • Is the customer price sensitive? Will a small decrease in price gain you extra market share? Or will a small increase be indiscernible, and so gain you extra profit margin?
  • What discounts should be offered to trade customers, or to other specific segments of your market?
  • How will your price compare with your competitors?

Again, even if you have little or no control over these aspects, then it still pays to ask the questions. You’re looking for underlying business advantage that you can leverage.

Once we’ve optimized the offer, we then look at conversion.

2. Conversion Optimization

There’s the obvious conversion most search marketers know about. People arrive at a landing page. Some people buy what’s on offer, and some leave. So, total conversions/number of views x 100 equals the conversion rate.

However, when it comes to SEO, it’s not just about the conversion rate of a landing page. Unlike PPC, you don’t have precise control over the entry page. So, optimizing for conversion is about looking at every single page on which people enter your site, and optimizing each page as if it were an entry point.
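To make that per-entry-page view concrete, here is a minimal sketch of the arithmetic. All URLs and figures below are invented for illustration; real numbers would come from your analytics package.

```python
# Hypothetical entry-page stats: URL -> (entries, conversions).
# With organic search you don't control the entry page, so the
# conversion rate is computed per page, not just for one landing page.
entry_pages = {
    "/": (4200, 63),
    "/pricing": (900, 45),
    "/blog/seo-guide": (2500, 10),
}

def conversion_rate(conversions, entries):
    """Percentage of entries that completed the page's desired action."""
    return conversions / entries * 100 if entries else 0.0

# Conversion rate for every entry point, each likely different.
rates = {url: round(conversion_rate(c, e), 2)
         for url, (e, c) in entry_pages.items()}
```

In this invented example, `/pricing` converts at 5.0% while the blog post converts at 0.4% – which is the point: each entry page has its own desired action and its own rate to optimize.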

What do you want people to do when they land on your page?

Have a desired action in mind for every page. It might be a sign-up. It might be to encourage a bookmark. It might be to buy something. It might be to tweet. Whatever it is, we need to make the terms of engagement, for the visitor, clear for each page – with a big, yellow highlight on the term “engagement”! Remember, Google are likely looking at bounce-back rates. So, there is a conversion rate for every single page on your site, and they’re likely all different.

Think about the shopping cart process. Is a buyer, particularly a mobile buyer, going to wade through multiple forms? Or could the sale be made in as few clicks as possible? Would integrating Paypal or Amazon payments lift your conversion rates? What’s your site speed like? The faster, the better, obviously. A lot of conversion is about streamlining things – from processes, to navigation to site speed.

At this point, a lot of people will be wondering how to measure and quantify all this. How to track conversion funnels across a big site. It’s true, it’s difficult. In many cases, it’s pretty much impossible to get adequate sample sizes.

However, that’s not a good reason to avoid conversion optimization. You can measure it in broad terms, and get more incremental as time goes on. Changes across pages and paths may produce effects that are individually difficult to spot, but there is sufficient evidence that companies who employ conversion optimization can enjoy significant gains, especially if they haven’t focused on these areas in the past.

While you could quantify every step of the way, and some companies certainly do, there’s probably a lot of easy wins that can be gained merely by following these two general concepts – optimizing the offer and then optimizing (streamlining) the pages and paths that lead to that offer. If something is obscure, make it obvious. If you want the visitor to do something, make sure the desired action is writ-large. If something is slow, make it faster.

Do it across every offer, page and path in your site and watch the results.


How To Win In Local Internet Marketing: The Practical Guide For Small Businesses And Local Marketers

Once a training ground for novice SEOs, local search has evolved into a complex, unpredictable ecosystem dominated by Google. Corporations and mom-and-pop shops alike are fighting for their place in the sun. It’s everybody’s job to make the best of local Internet marketing, because its importance will continue to grow.

This guide is geared towards helping you deepen your understanding of the local search ecosystem, as well as local Internet marketing in general.

I hope that, after you finish reading this guide, you will be able to make sense of local Internet marketing and use it to grow your business, or help your clients do the same.

Objectives, Goals & Measurements Are Crucial

Websites exist to accomplish objectives. Regardless of company size, business model and market, your website needs to bring you closer to accomplishing one or more business objectives. These could be:

  1. Customer Acquisition
  2. Lead Generation
  3. Branding
  4. Lowering sales resistance
  5. etc.

Although not exciting, this is a crucial step in building a local Internet strategy. It will determine the way you set your goals, largely shape the functionality of your website, guide you in deciding what your budget should be and so on.

Getting Specific With Measurement

Objectives are too broad to work with. They exist on a higher level and are something company executives/leadership need to set.

This is why we need specific goals, KPIs and targets. Without getting into too many details, goals could be defined as specific strategies geared towards accomplishing an objective.

For example, if your objective is to “grow your law firm,” a good goal derived from that would be to “generate client inquiries”. Another one would be to use the website to get client referrals.

When you have all this defined, you need to set KPIs. They are simply metrics that help you understand how you are doing against your objectives. For this imaginary law firm, a good KPI would be the number of potential client leads. After you set targets for your KPIs, you have completed your measurement framework. To learn more about measurement models, you can read this post by Avinash Kaushik.
These will be the numbers that you or your client should care about on a day to day basis.
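As a rough sketch, a measurement framework can be as simple as a table of KPIs with targets. All names and numbers below are hypothetical, following the imaginary law firm example:

```python
# Hypothetical measurement framework for the imaginary law firm above.
objective = "grow the law firm"
kpis = {
    "client_leads": {"target": 30, "actual": 24},
    "referral_inquiries": {"target": 10, "actual": 12},
}

def progress(kpi):
    """Performance against target, as a percentage."""
    return round(100 * kpi["actual"] / kpi["target"], 1)

for name, kpi in kpis.items():
    print(f"{name}: {progress(kpi)}% of target")
```

Even a spreadsheet works fine for this; the point is that every KPI has a target and gets checked against it regularly.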

Lifetime Customer Value And Cost Of Customer Acquisition

Regardless of size, every local business needs to know its average lifetime customer value and its cost of customer acquisition.

You need these numbers to set your marketing budget and to recognize whether you are on the path to going out of business despite acquiring lots of customers.

Lifetime customer value (LTV) is the revenue you expect from a single customer over the lifetime of your relationship with them. If you are having trouble calculating this number for your or your client's business, use this neat calculator made by Harvard Business School.

Customer Acquisition Cost (CAC) is the amount of money you spend to acquire a single customer. The formula is simple: divide the sum of your sales, marketing and overhead costs by the number of customers you acquired in a given period.
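To make the arithmetic concrete, here is a minimal sketch of both metrics. The figures are invented for illustration, and the simple LTV model (purchase value × purchase frequency × retention) is just one common way to estimate it:

```python
# Hedged sketch of the CAC and LTV formulas; all figures are hypothetical.

def cac(sales_costs, marketing_costs, overhead, customers_acquired):
    """Customer Acquisition Cost: total acquisition spend / customers won."""
    return (sales_costs + marketing_costs + overhead) / customers_acquired

def ltv(avg_purchase_value, purchases_per_year, years_retained):
    """A simple lifetime value estimate: value x frequency x retention."""
    return avg_purchase_value * purchases_per_year * years_retained

# Example: $5,000 on sales, $3,000 on marketing, $2,000 overhead
# in a quarter, acquiring 50 customers.
print(cac(5000, 3000, 2000, 50))  # 200.0 spent per customer
print(ltv(100, 4, 5))             # 2000 expected revenue per customer
```

As long as LTV comfortably exceeds CAC, the business can afford to keep acquiring customers; when CAC approaches LTV, you are buying growth at a loss.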

LTV & CAC are the magic numbers.

You can use them to sell Internet marketing services, as well as to demonstrate the value of investing heavily in Internet marketing.

Understanding and using these metrics will put you and your clients ahead of most competitors.

Stop – It’s Budget Time

Now that you have your business objectives, customer acquisition costs and other KPIs defined, and their targets set, it's time to talk budgets. Budgets will determine what kind of local Internet marketing campaign you can run and how far it can go.

Most companies don’t have a separate Internet marketing budget. It’s usually just a part of their marketing budget which can be anywhere from 2% to 20% of sales depending on a lot of factors including, but not limited to:

  1. Business objectives
  2. Company size
  3. Profit margins
  4. Industry
  5. etc.

What does this mean to you?

If you are selling services, you will need to have as much of this data as possible.
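As a back-of-the-envelope illustration of the 2% to 20% range above (the sales figure is hypothetical):

```python
# Back-of-the-envelope budget range; the annual sales figure is made up.
def marketing_budget_range(annual_sales, low=0.02, high=0.20):
    """The 2%-20%-of-sales rule of thumb mentioned above."""
    return annual_sales * low, annual_sales * high

low, high = marketing_budget_range(500_000)
print(low, high)  # 10000.0 100000.0
```

Where a business lands within that range depends on the factors listed above: objectives, size, margins and industry.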

Planning And Executing Your Campaign

Now that you know what business objectives your local Internet marketing campaign has to accomplish, your targets, and your budget, you can start developing a campaign. It's easiest to think of this process if we break our campaign planning into small but meaningful phases:

  1. laying the groundwork,
  2. taking care of your data in the local search ecosystem,
  3. citation building,
  4. creating a great website,
  5. building links,
  6. setting up a review management system,
  7. expanding to non-organic search channels
  8. and taking care of web analytics.

Laying The Groundwork


Local search is about data. It’s about aggregation and distribution of data across different platforms and technologies. It’s also about accuracy and consistency.

This is the reason why you need to start with a NAP audit.

NAP stands for name, address and phone number. It’s the anchor business data and should remain accurate, consistent and up to date everywhere. In order to make it consistent, you first need to identify inaccurate data.

This is easier than it sounds.

You can use Yext.com or Getlisted.org to easily and quickly check the accuracy and consistency of your data in the local search ecosystem.
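If you'd rather script a quick first pass yourself, a toy NAP consistency check might look like this. The reference record, the listings and the deliberately crude normalization are all illustrative assumptions:

```python
# A toy NAP audit: compare listings gathered by hand (or via a tool)
# against a reference record. Normalization here is deliberately crude.
import re

def normalize(nap):
    """Lowercase, strip punctuation and collapse whitespace for comparison."""
    return {k: re.sub(r"[^a-z0-9]+", " ", v.lower()).strip() for k, v in nap.items()}

reference = {"name": "Joe's Pizza", "address": "12 Main St.", "phone": "(555) 010-1234"}
listings = {
    "yelp":        {"name": "Joes Pizza",  "address": "12 Main Street", "phone": "555 010 1234"},
    "yellowpages": {"name": "Joe's Pizza", "address": "12 Main St",     "phone": "(555) 010-1234"},
}

ref = normalize(reference)
for source, listing in listings.items():
    mismatches = [k for k, v in normalize(listing).items() if v != ref[k]]
    print(source, "inconsistent fields:", mismatches or "none")
```

A real audit needs smarter matching ("St" vs "Street", suite numbers, vanity numbers), which is exactly what the tools above handle for you.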

Start With Data Aggregators

Data aggregators, or compilers, are companies that build and maintain large databases of business data. In the US, the ones you should keep an eye on are Neustar/Localeze, Infogroup (formerly InfoUSA) and Acxiom.

Why are data aggregators important?
They are upstream data providers. This means that they provide baseline and sometimes enhanced data to search engines (including Google), local and industry directories. If your data is wrong in one of their databases, it will be wrong all over the place.

Usually, your business data goes bad for one or more of these reasons:

  1. You changed your phone number;
  2. You moved to another location;
  3. You used lots of tracking numbers;
  4. You made lots of IYP advertising deals targeting multiple towns/cities;
  5. etc.

If you or your client have a data inconsistency problem, the fix will start with the aggregators:

Before you embark on a data correction campaign, keep in mind that data aggregators take their data seriously. You will need access to the phone number on the listing you are trying to claim and verify, an email address on the domain of the site associated with the business, and sometimes even scans of official documents.

Remember – after you fix your data inaccuracies with the aggregators, it's still a smart idea to claim and verify listings in the major IYPs, as data moves slowly from upstream data providers to the numerous local search platforms your business is listed on.

Building Citations Is Important

Simply put, citations are mentions of your business’s name, address and phone number (full citation) or name and phone or address (partial citation).

Just like links in "general" organic search, citations are used to determine the relative importance or prominence of your business listing. If Google notices an abundance of consistent citations, it treats your business as legitimate and important, and you get rewarded with higher search visibility.

The more citations your business has, the more important it will be in Google's eyes. Oh, there is also the little matter of citation quality, as not all citations are created equal. There are also different types of citations besides full and partial.

Depending on the source, citations can come from:

  1. your website;
  2. IYPs like YellowPages.com;
  3. local business directories like Maine.com;
  4. industry websites like ThomasNet.com;
  5. event websites like Events.com;
  6. etc.

We could group citations by how structured they are. This means that a citation on YellowBook.com is structured, but a mention on your uncle’s blog is not. Google prefers the first type. The bulk of your citation building will be covered by simply making sure that your data in major data aggregators is accurate and up-to-date. However, there’s more to citations than that.

What Makes Citations Strong?

Conventional wisdom tells us that citation strength depends mostly on the algorithmic trust Google has in the source of your citation. For example, if you are a manufacturer of industrial coatings, a mention on ThomasNet.com will help you significantly more than a mention on a blog from some guy who visited your facility once.

You also want your citations to be structured, relevant and to have a link to your website for maximum benefit.

How To Build Citations?

You already started by claiming and verifying your listings with major data aggregators. Since you are very serious about local search, you will make sure to claim and verify listings with major IYPs, too.

Start with the most important ones:

  1. Yellowpages.com;
  2. Yelp.com;
  3. local.yahoo.com;
  4. SuperPages.com;
  5. Citysearch.com;
  6. Insiderpages.com;
  7. Manta.com;
  8. Yellowbook.com;
  9. Yellowbot.com;
  10. Local.com;
  11. dexknows.com;
  12. MerchantCircle.com;
  13. Hotfrog.com;
  14. Mojopages.com;
  15. Foursquare.com;
  16. etc.

You shouldn’t forget business and industry associations such as bbb.org or your local chamber of commerce. Here’s where you can find your local chamber of commerce.

Industry Directories Are An Excellent Source Of Citations

Industry directories such as Avvo.com for lawyers or ThomasNet.com for manufacturers are not just an excellent source of citations; they're also great for your organic search visibility in the Penguin apocalypse.

How do you find those?

You can use a couple of tools for this.

Want even more citations?

Then pay attention to daily deal and event sites. Don't forget charity websites, either. If you are one of those people who are obsessed with how everything about citations works, I recommend this (the one and only) book/guide about citations by Nyagoslav Zhekov.

Make Your Website Great

While it’s possible to achieve some success using just Google Places and other platforms to market a local business, it’s not possible to capture all the Web has to offer.

Your website is the only web property you will fully control. You have the freedom to track and measure anything you want, and the freedom to use your website to accomplish any business objective.

Marry Keyword And Market Research

There’s nothing more tragic nor costly than targeting the wrong keywords and trying to appeal to demographics that don’t need your services/products.

To run a successful local Internet marketing campaign, you cannot rely on quantitative data (keywords) alone; you need to conduct qualitative market research as well. This is very important, as it will reduce your risks, as well as your acquisition costs, if done right.

Let’s start with keyword research.

Getting local keyword data has always been a challenge. Google’s recent decision to withhold organic keyword data hasn’t made it any easier. However, Google itself has provided us with tools to get relatively reliable keyword data for any local search campaign.

Coupled with data from SEOBook Keyword Tool, Ubersuggest, and Bing’s Keyword Tool, you will have plenty of data to work with.

Of course, you shouldn’t forsake the market research of the equation.

You and/or your client can survey customers to discover exactly how they describe the business, its services/products and its geographic area. For example, you'll learn if there are any geographical nuances that you should be aware of, such as:

  1. DFW (Dallas/Fort Worth)
  2. PDX (Portland)
  3. OBX (Outer Banks)

Check this data against keyword research tools. If you're running AdWords, you can get an accurate idea of search volumes. To do that, click the Campaign tab, followed by the Keywords tab, then Details and then Search Terms. This data can be downloaded.

Keep in mind that the quality of data from this method depends on your use of keyword matching options. In practice, this means that if you want exact match search volumes for a certain set of keywords, you have to make sure those keywords are set as exact match.

If you’re not running AdWords, Google gives you a chance to get a good representation of your local search market using the Keyword Planning Tool as described in this post.

Content And Site Architecture

Largely, your content will depend on your business objectives, brand and the results of your keyword research. The time of local brochure type sites has long passed, at least for businesses that are serious about local Internet marketing.

Local websites are no different from corporate websites when it comes to technical aspects of SEO. Performance and crawlability are very important, as well as proper optimization of titles, headings, body text etc.

However, unlike corporate websites, local sites will benefit more from:

  1. "localization" of testimonials – it's not only important to get testimonials, it's crucial to make sure that your visitors know where those testimonials came from.
  2. "localization" of galleries, as well as "before and after" photos – similar to testimonials, you can leverage social proof the most if your website visitors can see how your services/products helped their neighbors.
  3. location pages – pages about a specific city/town where you or your client have an office or service area. Before you go on a rampage creating hundreds of these pages, don't forget that they need to add value for users, not just be copy/pasted from Wikipedia. The way to add value is to make them completely unique and useful to your visitors. For example, location pages can show specific directions to one of your offices or storefronts. You don't have the "big brand luxury" of ranking local pages that keep virtually all of their content behind a paywall.
  4. local blogging – use your blog to connect with local news organizations, charities and industry associations, as well as local bloggers. In addition, blog about your industry; this way, you will get the best of both worlds.
  5. adopting structured data – using schema markup, you can increase click-through rates from the SERPs and get a few other SEO benefits. You can use the Schema Creator to save time.
  6. adopting "mobile" – everyone knows that local search is increasingly mobile. Mobile websites are not a luxury but a necessity. Luckily, you or your clients don't have to invest a lot of resources in developing a mobile site. You can use tools such as dudamobile.com or bmobilized.com to create a fully functional mobile website in hours.
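On the structured data point above, here is a minimal sketch of LocalBusiness markup built as JSON-LD. The business details are made up, and you should validate any real markup with Google's structured data testing tools before deploying:

```python
# A minimal schema.org LocalBusiness sketch, serialized as JSON-LD.
# All business details below are hypothetical.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Joe's Pizza",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "12 Main St",
        "addressLocality": "Portland",
        "addressRegion": "OR",
        "postalCode": "97201",
    },
    "telephone": "+1-555-010-1234",
    "url": "https://www.example.com",
}

# Embed the output in your page inside:
#   <script type="application/ld+json"> ... </script>
print(json.dumps(local_business, indent=2))
```

Tools like the Schema Creator generate essentially this same markup for you; the value is in keeping the name, address and phone here identical to your NAP everywhere else.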

Link Building For Brick And Mortar Businesses

Links are still important. They are still a foundation of high organic search visibility. They still demand your resources.

But a lot has changed since Penguin. Building links has become a delicate endeavor, even for local websites. But there is a way to triumph: all you need to do is change how you view local link building.

See link building as marketing campaigns that have links as a by-product.

What does that mean? It means that you are promoting your business as if Google didn't exist. Link and citation building overlap to a certain extent; good links make great citations, especially if they're structured.

Join Business Associations

BBB.org has an enormous amount of algorithmic trust. It's also an excellent citation. As a bonus, displaying the BBB badge prominently on your website will likely boost your conversion rates. The same is true of your local chamber of commerce. Would you join those if Google was not around?

You probably would.

Join Industry Associations

Every industry has associations you or your client can join. You will get benefits similar to those you can expect from the BBB. However, being a member of trade associations adds an additional layer of value to your business in the form of education or certifications.

Charity work

Every business should give back. Sometimes you will get a link, sometimes you won't, but you will always benefit from this type of community involvement.

Industry websites

There are plenty of industry websites and directories in almost every industry. Sometimes these websites can refer significant traffic to you, but they almost always make for a good link and a solid citation.

Organize Events

Events are good for business. If you organize them, make sure that it's reflected on the web. There are plenty of websites you can submit your event to. Google is not likely to start considering organizing offline events spam any time soon.

Find Local Directories

Every state has a few good ones. It's likely that your town has an online business directory you can join. These types of links can make good citations, too, and they are usually easy to acquire.

Local Blogs

It pays to be a friend of your "local blogosphere". Try to include local bloggers in your community involvement, offer to contribute content, or offer giveaways.

Truly Integrate Link Building Into Your Marketing Operations

Whenever possible, make sure your vendors link to you:

  1. If you’re offering discounts to any organization, make sure it’s reflected on their website.
  2. If you’re attending an industry show or an event, give a testimonial and get a link.
  3. If you get press, remind the reporter to link to your website.

Review Management

In local search, customer reviews are larger than life. Consumers trust online reviews as much as personal recommendations, and a majority (52%) say that positive online reviews make them more likely to choose a local business. The influence reviews have on your local business goes well beyond social proof. Good reviews can boost your local search visibility, while bad reviews can destroy your business.

Reviews – The Big Picture

Every organization that strives to get better at what it does should use consumer reviews to improve its business operations. Customer reviews should be treated as one of the most valuable pieces of qualitative data. You should be surveying your customers daily and using their feedback to improve your services, products, customer service etc.

This holds true for corporations as well as mom-and-pop shops. It's not complicated to ask your customers about specific aspects of their experience with your business and record their answers. It's not expensive, either.

The benefits of taking reviews seriously are enormous:

  1. More search visibility;
  2. Fewer potential online reputation management issues;
  3. Increased credibility.

What can you do to win at review management?

Since you need to get highly rated, positive reviews on different websites in a way that doesn't break any guidelines and keeps you out of jail, your best bet is to use reviews as customer service survey tools.

This means that you should seek customer feedback systematically in order to improve your or your client’s business. You can ask your most ecstatic customers to share their experiences with your services/products on major local search platforms. Remember that you cannot provide any type of incentive for this behavior.

To save time, you can use a tool such as GetFiveStars.com, which will do everything described above.

Think Beyond Organic Search

Internet marketers tend to be blindly focused on organic search. It’s understandable – organic traffic is relatively cheap (in most markets) and seemingly unlimited.

It’s also a mistake.

The organic search channel is getting increasingly unstable and, with that, increasingly expensive. Since you're aware of your customer acquisition cost and have a measurement framework, it's easy to know how affordable traffic from other sources is for your business.

Paid Search Traffic

Paid search advertising works, especially if you've done a good job gearing your site for conversion. You shouldn't leave your entire PPC budget to Google, though; Bing/Yahoo! is a more affordable source of paid traffic with similar conversion rates.

If you’re planning to run a local paid campaign, don’t forget to:

  1. target geographically;
  2. use negative keywords and
  3. be fanatical about acquisition cost.

You can also read this post by PPC Hero on what you should keep in mind when running local search advertising campaigns. You can also check out this post on Search Engine Land about managing and measuring local PPC campaigns.

Internet Yellow Pages (IYPs) Sites

Sites like YellowPages.com or SuperPages.com don't have the traffic Google or even Bing gets, but they do have a significant amount of it, and it's traffic at the very end of the buying cycle. This is the reason to take IYPs seriously.

What does that mean?

It means that you should have most of the big IYP listings claimed, verified and optimized to the best of your ability, using every element of your listing to sell your products/services. In a lot of markets, it's wise to explore advertising opportunities as well.

If you want to take an extra step, or simply lack the time, you can sign up with a service such as Yext.com and control the major IYP listings from a single dashboard.

Keep in mind, though, that Yext.com doesn't come for free; you will have to pay a few hundred dollars for a year of service.

Another avenue would be to outsource this process. In this scenario, you will most likely pay a one-time fee for verification and optimization of a predetermined number of listings. However, if you want to change some of your business information down the road (such as your name or phone number), you will have to go through this process from the beginning.

Social Media

These days, social media means a lot of things to a lot of different people. Local businesses should use social media platforms to connect with the customers who love them. Empower these customers and give them an incentive to recommend you to their family and friends.

You should automate as much of your social media efforts as possible. You can use tools like HootSuite or SocialOomph.

Always try to add value in your interactions and never spam your follower base.

Classified Sites

It’s amazing how many businesses miss to build their presence on classified sites like Craigslist.org. Even though Craigslist audience the type of audience that is always on the lookout for a great deal, the buying intent is very strong.

If you’d like to get the most out of Craigslist and other classified sites, remember to make your ads count. You need:

  1. persuasive copy;
  2. targeted ads;
  3. special deals;
  4. etc.

Other sources of non-search traffic you should explore are local newspaper advertising, ads on big industry websites, local blogs and others.

Tracking And Web Analytics

If there’s only one thing local businesses should care about, it’s tracking. As we established in the beginning of this guide, everyone needs to know how much they can afford to spend in order to acquire a customer.

Proper tracking ensures that you don't make the mistake of spending too much on customer acquisition, or of spending anything on acquiring the wrong type of customer.

You can use a number of free or low cost web analytics solutions, including Clicky, KissMetrics, Woopra and Google Analytics.

If you’re like most people and don’t care if Google has access to your data, you can use Google Analytics. Take advantage of custom reporting and advanced segmentation.

In order to make the most out of the traffic you get, and to get more of the traffic that is right for your business, you should create custom reports. They will enable you to know how you're doing against your targets.

To create a custom report, click the “Customization” tab in Analytics and then click the “New Custom Report” tab.

Pick your metrics first (I recommend Unique Visitors and Conversion Rate, coupled with the geographic dimension).

Tracking Offline Conversions

This step is crucial for local businesses that want to measure performance. Fortunately, this is not as complicated as it sounds. Depending on the type of your campaign, you can use tracking phone numbers, web-only discount codes as well as campaign-specific URLs.

Avinash Kaushik has written extensively on best ways to track offline conversions. I highly recommend this post.

Tying It All Together

Focus on improving the quality of products you sell and/or services you provide. Remember that every Internet marketing campaign works better if you’re able to provide a remarkable experience for your customers.

Build your brand and make your customers fall in love with your business. That would make every aspect of your marketing, especially Internet marketing, work better.


Vedran Tomic is a member of SEOBook and founder of Local Ants LLC, a local internet marketing agency.

Productizing Your SEO Business

If you service clients, it's quite likely that you've faced some of the same pain points I have when trying to design a "product" out of your "service". The words "product" and "service" tend to be interchangeable in our industry, as our products are digital.

Pricing for SEO, or any type of digital marketing service, has been written about quite a few times, and there's never been a really clear answer as to where the sweet spot is.

I don't believe there is a clear, or even semi-clear, answer to pricing. What I do believe is that there is a clear path you can set for your company which makes many aspects of your business easier to automate and easier to manage. I refer to it here as "productizing" the business.

Where to Start

Some products can be priced more easily than others. If you are selling just your time (consulting), then you can obviously bill by the hour. I think the "future" of the SEO consultant has been here for a while anyway. Many have already evolved into broader areas of digital marketing like:

  • Technical SEO
  • CRO
  • Competitive Research
  • Analytics
  • Broader Online Marketing Strategy and Execution

There are other areas like paid search, email marketing, and so on, but the above covers a good chunk of what many of us have been doing on our own properties, and on client sites, for a while. As more and more of us service clients and perhaps start agencies, it's important to start from the beginning.

This analysis will differ if you have a much larger agency, but here we are focusing on the more common freelancer and small agency. The steps I would recommend are as follows (this is in relation to pricing/products only; I'm assuming you've already identified your market, brand messaging, etc.):

  • Determine a sustainable net profit. What do I want to earn as a baseline number?
  • Determine acceptable margins based on desired size of staff and potential cost of contractor work.
  • Determine the required gross revenue needed to achieve your net profit.

Why Do it This Way?

I do it this way because net margin is very important to me. I don't want to become the Walmart of digital marketing, where margins become paper thin as volume goes up.

Here is an example of what I mean. Consider the following scenario:

I’m leaving my job as a dairy farmer here in rural Rhode Island and I want to make $1,500,000 per year.

So, you’re going to pay a little bit more assuming you are a single member LLC versus a traditional W-2 “employee” (again, keeping it very simple) because of the self-employment tax. Your CPA can go over the different options based on your business set up and such but the base calculations are the same as far as determining the core numbers go.

If you just look at just “earnings” you are missing the bigger picture. What you should want to achieve for short, mid, and long term viability are healthy margins. Here’s an example:

Jack’s SEO Shop had a net income of $1,000,000 dollars in 2011. Their overall sales were $5,000,000. In 2012 they had $1,500,000 in net income with $10,000,000 in sales.

Jill’s SEO Shop had net income of $500,000 dollars in 2011. Their overall sales were $2,000,000. In 2012 they had $1,500,000 in net income with $4,000,000 in sales.

In this case we look at a basic calculation of profit margin (net income/gross sales) and see that:

  • Jacks’ 2011 profit margin was 20%
  • Jack’s 2012 profit margin 15%
  • Jill’s 2011 profit margin 25%
  • Jill’s 2012 profit margin 38% (same net income as Jack)

Certainly 15% on 10 million isn’t something to necessarily sneeze at but I’d much rather be Jill in the current state of web marketing. A 38% profit margin does so much more for your overall viability as a company when you take into account being able to respond to competition, algorithmic changes, increased cost of quality labor, and so on.
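The margins in the example above are easy to verify with the basic formula (net income / gross sales):

```python
# Checking the Jack vs. Jill margins from the example above.
def profit_margin(net_income, gross_sales):
    """Basic profit margin: net income divided by gross sales."""
    return net_income / gross_sales

print(profit_margin(1_000_000, 5_000_000))   # Jack 2011: 0.2
print(profit_margin(1_500_000, 10_000_000))  # Jack 2012: 0.15
print(profit_margin(500_000, 2_000_000))     # Jill 2011: 0.25
print(profit_margin(1_500_000, 4_000_000))   # Jill 2012: 0.375 (~38%)
```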

In this example, a conversation about simply "making" 1.5 million per year is quite misleading. Once we have these numbers figured out, we can begin to "design" our "products and/or services" to fit a pricing model by backdooring it via preferred margins.

Setting Up Your Products

Many folks in the industry have had exposure to, and direct experience with, a number of disciplines. At the very least, a lot of us know enough about "how" to execute a particular type of service without necessarily the specific knowledge of how to go in and "push the buttons".

There’s a tendency to do all types of service but a good way to start is to look at your core competencies and determine what makes the most sense to offer as a product. If you are just starting out you can start this from a blank slate, there’s not a big difference either way.

You will run across a couple different types of costs, direct and indirect. Let’s assume for the sake of simplicity you are a freelancer or just a solo operation. In terms of selling a service you will have 2 core types of cost:

  • Direct (utilization of outside contractors to accomplish a task)
  • Indirect (your time and any other overhead like office costs, insurance, tools, marketing costs)

There’s some debate as to whether you should include the estimated cost of your marketing as part of a per project cost to accurately determine your margins. I say why not, using it only makes it more accurate in terms of hard numbers.

Perhaps you whittled down your offerings to:

  • Technical SEO Audits
  • SEO Competitive Analysis Audits
  • Conversion Optimization
  • Content Marketing

We can assume that you might have the following tools in your toolbelt:

  • Screaming Frog SEO Spider (roughly $158 per year if you are in the US)
  • Majestic SEO subscription (roughly $588 per year for the Silver plan)
  • Ahrefs subscription (roughly $948 per year for the Pro subscription)
  • Visual Website Optimizer subscription ($588 per year for the Small Business Plan)
  • Raven SEO Tools for competitive research, content marketing strategy and execution, SEO audit work ($1,188 per year)
  • Buzzstream for outreach and additional link prospecting ($1,188 per year)

There are more tools we could add, but at a baseline level you would be able to produce quality products with these. The total cost is $4,658 per year, or $389 per month (rounded up).

The same formula (annual and monthly amounts) would be used for any other overhead you deem necessary, but for the sake of simplicity let's say you are spending $389 per month on "stuff".
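The tool budget above, itemized; the prices are the annual figures quoted in the text:

```python
# The tool budget above, itemized (annual prices as quoted in the text).
import math

annual_costs = {
    "Screaming Frog SEO Spider": 158,
    "Majestic SEO (Silver)": 588,
    "Ahrefs (Pro)": 948,
    "Visual Website Optimizer": 588,
    "Raven SEO Tools": 1188,
    "Buzzstream": 1188,
}

total = sum(annual_costs.values())
monthly = math.ceil(total / 12)  # rounded up, as in the text
print(total, monthly)  # 4658 389
```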

Knowledge + Tools = Win

Tools are only one part of a two-part equation. Tools without knowledge are useless. There are a variety of costs one could associate with knowledge acquisition:

  • Building your own test sites
  • Going to conferences
  • Participating in online membership sites

The costs of knowledge acquisition vary from person to person. You might be at a point where all three make sense, or at a level where only one or two do. I would recommend looking at these options relative to your skill set and determining, annually, the cost of what makes sense for you. Take that number and add it to the example tool costs I gave earlier.

Breaking Out a Product List

The next step is to look at each type of service you are offering and productize it. The first two areas are more likely to be your time only, versus your time plus outside contractor help. Conversion Optimization and Content Marketing will probably incur additional costs outside of your time for things like:

  • User testing
  • Content writing
  • Content design
  • Promotion help
  • Programming for interactive content

When setting up products, I use this framework:

  • GI is Gross Income
  • Tax is GI * (whatever your total tax percentage is)
  • NI is Net Income
  • GM is Gross Margin (gross profit divided by GI)
  • NM is Net Margin (NI divided by GI)

In that example, I used $150 as my hourly rate and assumed 40 hours for an audit. Now I can play around with the direct cost and the price to arrive at the margins I am looking for.

One thing to keep in mind with indirect cost is that it can usually be divided amongst your current projects.

So I might revisit my pricing table from time to time to revamp the indirect cost based on my current client list. In this example, I assume no clients are currently on board and no income from my own properties, so this audit eats up all of the indirect cost against its margins.
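As an illustrative sketch of this pricing model for a single audit: the $150 rate and 40 hours come from the text, while the 30% tax rate is a placeholder, not tax advice; check real rates with your CPA.

```python
# Sketch of the pricing model above for one audit product.
# The 30% tax rate is a hypothetical placeholder.
hourly_rate = 150
hours = 40
price = hourly_rate * hours  # gross income (GI) for the audit
direct_cost = 0              # solo operator: no outside contractors
indirect_cost = 389          # monthly overhead from the earlier tool example
tax_rate = 0.30

gross_profit = price - direct_cost
net_income = gross_profit - indirect_cost - price * tax_rate

gross_margin = gross_profit / price
net_margin = net_income / price
print(f"GM {gross_margin:.0%}, NM {net_margin:.1%}")
```

With no direct cost the gross margin is 100%, so the interesting number is the net margin; raising the price or spreading the indirect cost across more projects is how you move it toward your target.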

You can design your products however it works for you, but I usually try to find some type of baseline that works for me. In the areas I assumed earlier, I would try to make sub-products out of each section:

  • Audits based on the size and scope of the site (total pages, ecommerce, dynamic, etc.)
  • Conversion Rate Optimization based on total hours for ongoing work, plus a few different prices for the initial audit and feedback
  • Content Marketing based on the scale needed, broken out into different asset types for easier pricing (videos, interactive content, infographics, whitepapers, and so on)
  • SEO Competitive Analysis based on total hours needed for ongoing work, with different prices based on the scope of the initial research (or just a one-off overview)

There are so many variables to each service that it is impossible to list them all here, but the general ideas remain the same. Start with a market and break it out into “things” that can be sold which cover “most” of your target market.

Manage Your Workloads More Efficiently

One of the reasons I mentioned direct cost as being your hourly rate is so you can set a baseline of how many hours you want to work per month to achieve the amount you’d like to earn. Combining what you want to earn with the hours you want to work will help you work out a minimum hourly rate which you can adjust up or down, along with desired revenue, to hit your pricing sweet spot.
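Worked through with hypothetical numbers, the back-calculation reads:

```python
# Hypothetical numbers: back out the minimum hourly rate from a target
# monthly income and the hours you are willing to work.
target_monthly_income = 10_000
billable_hours_per_month = 100
min_hourly_rate = target_monthly_income / billable_hours_per_month
print(min_hourly_rate)  # 100.0 -> charge at least $100/hour
```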

Using your hourly rate in conjunction with designing specific products makes it pretty easy to assign hours required to a specific product. When you assign hours to each product you can do a few things that will help in managing your workload:

  • When a new project is being quoted you can quickly gauge whether, based on current projects in process, you have availability for the project
  • If you know ahead of time you are stretched out a bit and need to bring in outside help you can add those additional costs to your proposal and get outside help ready ahead of time
  • If you take on projects and you find your assumed hours are over or under the amount really necessary you can adjust that for future projects

Assigning your required hours to each product you sell will help you manage your workload better and give you more fluidity during peak times. Inevitably there will be peaks and valleys in demand for your services, so if you can manage the peaks in a less stressful and more profitable manner, the valleys might not be as deep for you financially.
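The availability check in the first bullet can be sketched as a simple capacity comparison. The product names, hour estimates, and monthly capacity below are all hypothetical:

```python
# Each product carries an hours estimate, so quoting a new project becomes
# a quick capacity check against projects already in process.
monthly_capacity_hours = 160
hours_per_product = {"site_audit": 40, "cro_retainer": 25, "content_campaign": 60}

def can_take_on(new_product, active_products):
    # Sum the hours already committed, then see if the new project still fits.
    committed = sum(hours_per_product[p] for p in active_products)
    return committed + hours_per_product[new_product] <= monthly_capacity_hours

print(can_take_on("site_audit", ["cro_retainer", "content_campaign"]))  # True: 85 + 40 <= 160
```

If the answer is False, that is the cue from the second bullet: price outside help into the proposal and line it up ahead of time.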

Other Areas Where Productizing Helps

Custom quoting everything that comes through the door is a pain point for me.

Post-quoting you have things like contracts that have to get signed, billing that has to get set up, and task processes that have to get accomplished.

When you have specific products you are selling, it becomes much easier to automate:

  • Proposal templates that get sent out
  • Contract documents
  • Billing setup
  • New client onboarding into a CRM/PM system
  • Tasks that need to be completed and assigned
  • Setting up classes and jobs in Quickbooks to track financials per client or per job

It can be a pretty lengthy process, but making your services into products really helps your business in a number of areas.

Time For A Content Audit

“Content is king” is one of those “truthy” things some marketers preach. However, in most businesses the bottom line is king, attention is queen, and content can be used as a means to get both, but it depends.

The problem is that content is easy to produce. Machines can produce content. They can tirelessly churn out screeds of content every second. Even if they didn’t, billions of people on the internet are perfectly capable of adding to the monolithic content pile at similar rates.

Low barriers to content production and distribution mean the internet has turned a lot of content into near worthless commodity. Getting and maintaining attention is the tricky part, and once a business has that, then the benefits can flow through to the bottom line.

Some content is valuable, of course. Producing valuable content can earn attention. The content that gets the most attention is typically something for which an audience has a strong need, yet can’t easily get elsewhere, and is published in a place they’re likely to see. Or someone they know is likely to see. An article on title tags will likely get buried. An article on the secret code to cracking Google’s Hummingbird algorithms will likely crash your server.

Up until the point everyone else has worked out how to crack them, too, of course.

What Content Does The User Want?

Content can become King if the audience bestows favor upon it. Content producers need to figure out what content the audience wants. Perversely, Google have chosen to make this task even more difficult than it was before by withholding keyword data. Between Google’s supposed “privacy” drive, Hummingbird supposedly using semantic analysis, and Penguin/Panda supposedly using engagement metrics, page level and path level optimization are worth focusing upon going forward.

If you haven’t done one for a while, now is probably a good time to take stock and undertake a content audit.

You Have Valuable Historical Information

If you’ve got historical keyword data, archive it now. It will give you an advantage over those who follow you from this point on. Going forward, it will be much more expensive to acquire this data.

Run an audit on your existing content. What content works best? What type of content is it? Video? Text? What’s the content about? What keywords did people use to find it previously? Match content against your historical keyword data.

Here’s a useful list of site and content audit tools and resources.

If keywords can no longer suggest content demand, then how do we know what the visitor wants in terms of content? We must seek to understand the audience at a deeper level. Take a more fuzzy approach.

Watch Activity Signals

Analytics can get pretty addictive and many tools let you watch what visitors do in real time. Monitor engagement levels on your pages. What is a user doing on that page? Are they reading? Contributing? Clicking back and forward looking for something else?

Ensure pages with high engagement are featured prominently in your information architecture. Relegate or fix low-engagement pages. Segment out your content so you know which is the most popular, in terms of landings, and link that information back to ranking reports. This way, you can approximate keywords and stay focused on the content users find most relevant and engaging. Segment out your audience, too. Different visitors respond to different things. Do you know which group favours what? What do older people go for? What do younger people go for? Here are a few ideas on how to segment users.

User behavior is getting increasingly complex. It takes multiple visits to purchase, from multiple channels/influences. Hence the addition of user segmentation allows us to focus on people. (For these exact reasons multi-channel funnels analysis and attribution modeling are so important!)
At the moment in web analytics solutions, people are defined by the first party cookie stored on their browser. Less than ideal, but 100x better than what we had previously. Over time, as we all expand to Universal Analytics, perhaps we will have more options to track the same person, after explicitly asking for permission, across browsers, channels and devices.

In-Site Search

If Google won’t give you keywords, build your own keyword database. Think about ways you can encourage people to use your in-site search. Watch the content they search for and consume the most. Another way of looking at site search is to provide navigation links that emphasize different keyword terms. For example, you could place these high up on your page, with each offering a different option tied to a related keyword term. Take note of which terms visitors favour over others.

In the good old days, people dutifully used site navigation at the left, right, or top of a website. But, two websites have fundamentally altered how we navigate the web: Amazon, because the site is so big, sells so many things, and is so complicated that many of us go directly to the site search box on arrival. And Google, which has trained us to show up, type what we want, and hit the search button. Now when people show up at a website, many of them ignore our lovingly crafted navigational elements and jump to the site search box. The increased use of site search as a core navigation method makes it very important to understand the data that site search generates.
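One minimal way to start that in-house keyword database is to tally site-search queries. The log entries below are made up; in practice the data would come from your analytics package or server logs:

```python
# Count the queries visitors type into your site search to build a simple
# in-house keyword database. The sample log is hypothetical.
from collections import Counter

site_search_log = [
    "seo audit", "link building", "seo audit", "content marketing",
    "seo audit", "link building",
]

keyword_counts = Counter(site_search_log)
for term, count in keyword_counts.most_common(3):
    print(term, count)
```

The most frequent terms point at the content visitors want most, which is exactly the demand signal Google stopped handing over.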

Distribution

Where does attention flow from? Social media? A mention is great, but if no attention flows over that link to your content, then it might be a misleading metric. Are people sharing your content? What topics and content gets shared the most?

Again, this comes back to understanding the audience, both what they’re talking about and what actions they take as a result. In “Digital Marketing Analytics: Making Sense Of Consumer Data”, the authors recommend creating a “learning agenda”. Rather than just looking for mentions and volume of mentions, focus on specific brand or service attributes. Think about the specific questions you want answered as if those visitors were sitting in front of you.

For example, how are consumers reacting to prices in your niche? What are their complaints? What do they wish would happen? Are people talking negatively about something? Are they talking positively about something? Who are the new competitors in this space?

Those are pretty rich signals. We can then link this back to content by addressing those issues within our content.

Google Keyword (Not Provided)

Just a follow up on the prior (not provided) post, as Google has shot the moon since our last post on this. Here’s a quick YouTube video.

The above video references the following:

Matt Cutts when secured search first rolled out:

Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com.

This Week in Google (TWIG) show 211, where Matt mentioned the inspiration for encrypted search:

we actually started doing encrypted.google.com in 2008 and one of the guys who did a lot of heavy lifting on that, his name is Evan, and he actually reports to me. And we started that after I read Little Brother, and we said “we’ve got to encrypt the web.”

The integration of organic search performance data inside AdWords.

When asked about the recent increase in (not provided), a Google representative stated the following:

We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….

The motivation here is not to drive the ads side — it’s for our search users.

What an excellent time for Google to block paid search referrals as well.

If the move is important for user safety then it should apply to the ads as well.

Design Thinking

One of the problems with analysing data is the potential to get trapped in the past, when we could be imagining the future. Past performance can be no indication of future success, especially when it comes to Google’s shifting whims.

We see problems, we devise a solution. But projecting forward by measuring the past, and coming up with “the best solution” may lead to missing some obvious opportunities.

Design Thinking

In 1972, psychologist, architect and design researcher Bryan Lawson created an empirical study to understand the difference between problem-based solvers and solution-based solvers. He took two groups of students – final year students in architecture and post-graduate science students – and asked them to create one-story structures from a set of colored blocks. The perimeter of the building was to optimize either the red or the blue color; however, there were unspecified rules governing the placement and relationship of some of the blocks.
Lawson found that:

The scientists adopted a technique of trying out a series of designs which used as many different blocks and combinations of blocks as possible as quickly as possible. Thus they tried to maximize the information available to them about the allowed combinations. If they could discover the rule governing which combinations of blocks were allowed they could then search for an arrangement which would optimize the required color around the design. By contrast, the architects selected their blocks in order to achieve the appropriately colored perimeter. If this proved not to be an acceptable combination, then the next most favorably colored block combination would be substituted and so on until an acceptable solution was discovered.

Nigel Cross concludes from Lawson’s studies that “scientific problem solving is done by analysis, while designers problem solve through synthesis”.

Design thinking tends to start with the solution, rather than the problem. A lot of problem based-thinking focuses on finding the one correct solution to a problem, whereas design thinking tends to offer a variety of solutions around a common theme. It’s a different mindset.

One of the criticisms of Google, made by Google’s former design leader Douglas Bowman, was that Google were too data centric in their decision making:

When a company is filled with engineers, it turns to engineering to solve problems. Reduce each decision to a simple logic problem. Remove all subjectivity and just look at the data…that data eventually becomes a crutch for every decision, paralyzing the company and preventing it from making any daring design decisions…

There’s nothing wrong with being data-driven, of course. It’s essential. However, if companies only think in those terms, then they may be missing opportunities. If we imagine “what could be”, rather than looking at “what was”, opportunities present themselves. Google realise this, too, which is why they have Google X, a division devoted to imagining the future.

What search terms might people use that don’t necessarily show up in keyword mining tools? What search terms will people use six months from now in our vertical? Will customers contact us more often if we target them this way, rather than that way? Does our copy connect with our customers, or just search engines? Given Google is withholding more search referral data, making it harder to target keywords, adding some design thinking to the mix, if you don’t already, might prove useful.

Tools For Design Thinking

In the book, Designing For Growth, authors Jeanne Liedtka and Tim Ogilvie outline some tools for thinking about opportunities and business in ways that aren’t data-driven. One famous proponent of the intuitive, design-led approach was, of course, Steve Jobs.

It’s really hard to design products by focus groups. A lot of times, people don’t know what they want until you show it to them.

The iPhone and iPad couldn’t have been designed by looking solely at the past. They came about largely because Jobs had an innate understanding of what people wanted. He was proven right by the resulting sales volume.

Design starts with empathy. It forces you to put yourself in the customer’s shoes. It means identifying real people with real problems.

In order to do this, we need to put past data aside and watch people, listen to people, and talk with people. The simple act of doing this is a rich source of keyword and business ideas because people often frame a problem in ways you may not expect.

For example, a lot of people see stopping smoking as a goal-setting issue, like a fitness regime, rather than a medical issue. Advertising copy based around medical terminology and keywords might not work as well as copy oriented around goal setting and achieving physical fitness. This shift in the frame of reference certainly conjures up an entirely different world of ad copy, and possibly keywords, too. That different frame might be difficult to determine from analytics and keyword trends alone, but might be relatively easy to spot simply by talking to potential customers.

Four Questions

Designing For Growth is worth a read if you’re feeling bogged down in data and looking for new ways to tackle problems and develop new opportunities. I don’t think there’s anything particularly new in it, and it can come across as “the shiny new buzzword” at times, but the fundamental ideas are strong. I think there is value in applying some of these ideas directly to current SEO issues.

Designing For Growth recommends asking the following questions.

What is?

What is the current reality? What is the problem your customers are trying to solve? Xerox solved a problem customers didn’t even know they had when it invented the fax machine. Same goes for the Polaroid camera. And the microwave oven. Customers probably couldn’t have described those things until they saw and understood them, but the problem would have been evident had someone looked closely at the problems they faced, i.e. people really wanted faster, easier ways of completing common tasks.

What do your customers most dislike about the current state of affairs? About your industry? How often do you ask them?

One way of representing this information is with a flowchart. Map the current user experience from when they have a problem, to imagining keywords, to searching, to seeing the results, to clicking on one of those results, to finding your site, to interacting with your site, to taking the desired action. Could any of those results or steps be better?

Usability tests use the same method. It’s good to watch actual customers as they do this, if possible. Conduct a few interviews. Ask questions. Listen to the language people use. We can glean some of this information from data mining, but there’s a lot more we can get by direct observation, especially when people don’t click on something, as non-activity seldom registers in a meaningful way in analytics.

What if?

What would “something better” look like?

Rather than think in terms of what is practical and the constraints that might prevent you from doing something, imagine what an ideal solution would look like if it weren’t for those practicalities and constraints.

Perhaps draw pictures. Make mock-ups. Tell a story. Anything that fires the imagination. Use emotion. Intuition. Feeling. Just going through such a process will lead to making connections that are difficult to make by staring at a spreadsheet.

A lot of usability testers create personas. These are fictional characters based on real or potential customers, used to try to gain an understanding of what they might search for, what problems they are trying to solve, and what they expect to see on our site. Is this persona a busy person? Well educated? Do they use the internet a lot? Are they buying for themselves, or on behalf of others? Do they tend to react emotionally, or are they logical? What incentives would this persona respond to?

Personas tend to work best when they’re based on actual people. Watch and observe. Read up on relevant case studies. Trawl back through your emails from customers. Make use of story-boards to capture their potential actions and thoughts. Stories are great ways to understand motivations and thoughts.

What are those things your competition does, and how could they be better? What would those things look like in the best possible world, a world free of constraints?

What wows?

“What wows” is especially important for social media and SEO going forward.

Consider Matt Cutts’ statement about frogs:

Those other sites are not bringing additional value. While they’re not duplicates they bring nothing new to the table. It’s not that there’s anything wrong with what these people have done, but they should not expect this type of content to rank.
Google would seek to detect that there is no real differentiation between these results and show only one of them so we could offer users different types of sites in the other search results

Cutts talks about the creation of new value. If one site is saying pretty much the same as another site, then those sites may not be duplicates, but one is not adding much in the way of value, either. The new site may be relegated simply for being “too samey”.

It’s the opposite of the Zynga path:

“I don’t fucking want innovation,” an anonymous ex-employee recalls Pincus saying in 2010, according to the SF Weekly. “You’re not smarter than your competitor. Just copy what they do and do it until you get their numbers.”

Generally speaking, up-and-coming sites should focus on wowing their audience with added depth and/or a new perspective. This, in turn, means having something worth remarking upon, which then attracts mentions across social media, and generates more links.

Is this certain to happen? Nothing is certain as far as Google is concerned. They could still bury you on a whim, but wowing an audience is a better bet than simply imitating long-established players using similar content and link structures. At some point, those long-established players had to wow their audience to get the attention and rankings they enjoy today. They did something remarkably different at some point. Instead of digging the same hole deeper, dig a new hole.

In SEO, change tends to be experimental. It’s iterative. We’re not quite sure what works ahead of time, and no amount of measuring the past tells us all we want to know, but we try a few things and see what works. If a site is not ranking well, we try something else, until it does.

Which leads us to….

What works?

Do searchers go for it? Do they do that thing we want them to do, which is click on an ad, or sign up, or buy something?

SEOs are pretty accomplished at this step. Experimentation in areas that are difficult to quantify – the algorithms – has been an intrinsic part of SEO.

The tricky part is that not all things work the same everywhere and, much like modern health pathologies, Google has clever delays in its algorithms:

Many modern public health pathologies – obesity, substance abuse, smoking – share a common trait: the people affected by them are failing to manage something whose cause and effect are separated by a huge amount of time and space. If every drag on a cigarette brought up a tumour, it would be much harder to start smoking and much easier to quit.

One site’s rankings are more stable because another person can’t get around the sandbox or their links get them penalized. The same strategy and those same links might work great for another site.

Changes in user behavior are more directly & immediately measurable than SEO.

Consider using change experiments as an opportunity to open up a conversation with potential users. “Do you like our changes? Tell us.” Perhaps use a prompt asking people to initiate a chat, or to participate in a poll. That sort of engagement has many benefits. It will likely prevent a fast click back, you get to see the words people use and how they frame their problems, and you learn more about them. You become more responsive and sympathetic to their needs.

Beyond Design Thinking

There’s more detail to design thinking, but, really, it’s mostly just common sense. Another framework to add, especially if you feel you’re getting stuck in faceless data.

Design thinking is not a panacea. It is a process, just as Six Sigma is a process. Both have their place in the modern enterprise. The quest for efficiency hasn’t gone away and in fact, in our economically straitened times, it’s sensible to search for ever more rigorous savings anywhere you can.

What’s best about it, I feel, is that this type of thinking helps break strategy and data problems down and gives them a human face.

In this world, designers can continue to create extraordinary value. They are the people who have, or could have, the laterality needed to solve problems, the sensing skills needed to hear what the world wants, and the databases required to build for the long haul and the big trajectories. Designers can be definers, making the world more intelligible, more habitable.

Jim Boykin Interview

Jim Boykin has been a longtime friend & was one of the early SEOs who were ahead of the game back in the day. While many people have come and gone, Jim remains as relevant as ever today. We interviewed him about SEO, including scaling his company, disavows & how Google has changed the landscape over the past couple of years.

Aaron: How did you get into the field of SEO?

Jim: In 1999 I started We Build Pages as a one man show designing and marketing websites…I never really became much of a designer, but luckily I had much more success in the marketing side. Somehow that little one man show grew to about 100 ninjas, and includes some communities and forums I grew up on (WebmasterWorld, SEOChat, Cre8asiteForums), and I get to work with people like Kris Jones, Ann Smarty, Chris Boggs, Joe Hall, Kim Krause Berg, and so many others at Ninjas who aren’t as famous but are just as valuable to me, and Ninjas has really become a family over the years. I still wonder at times how this all happened, but I feel lucky with where we’re at.

Aaron: When I got started in SEO some folks considered all link building to be spam. I looked at what worked, and it appeared to be link building. Whenever I thought I came up with a clever new way to hunt for links & hunted around, most of the time it seemed you got there first. Who were some of the people you looked to for ideas when you first got into SEO?

Jim: Well, I remember going to my first SEO conference in 2002 and meeting people like Danny Sullivan, Jill Whalen, and Bruce Clay. I also remember Bob Massa being the first person “dinged” by google for selling links…that was back in 2002 I think…I grew up on Webmasterworld and I learned a ton from the people in there like: Tedster, Todd Friesen, Greg Boser, Brett Tabke, Shak, Bill, Rae Hoffman, Roger Montti, and so many others in there over the years…they were some of my first influencers….I also used to hang around with Morgan Carey, and Patrick Gavin a lot too. Then this guy selling an SEO Book kept showing up on all my high PR pages where I was getting my links….hehe…

Aaron: One of the phrases in search that engineers may use is “in an ideal world…”. There is always some amount of gap between what is advocated & what actually works. With all the algorithmic changes that have happened in the past few years, how would you describe that “gap” between what works & what is advocated?

Jim: I feel there’s really been a tipping point with the Google Penguin updates. Maybe it should be “What works best short term” and “What works best long term”….anything that is not natural may work great in the short term, but your odds of getting zinged by Google go way up. If you’re doing “natural things” to get citations and links, then it may tend to take a bit longer to see results (in conjunction with all you’re doing), but at least you can sleep at night doing natural things (and not worrying about Google penalties).  It’s not like years ago when getting exact targeted anchor text for the phrases you want to rank on was the way to go if you wanted to compete for search rankings. Today it’s much more involved to send natural signals to a client’s website.  To send natural signals you must do things like work up the brand signals, trusted citations, return visitors, good user experience, community, authors, social, yada yada….SEO is becoming less a “link thing” and more a “great signals from many trusted people” thing, as well as a branding game now. I really like how SEO is evolving….for years Google used to say things like “Think of the users” when talking about the algorithm, but we all laughed and said “Yea, yea, we all know it’s all about the backlinks”….but today, I think Google has crossed a tipping point where yes, to do great SEO, you must focus on the users, and not the links….the best SEO is getting more citations and trusted signals to your site than your competitors have…and there’s a lot of trusted signals which we, as internet marketers, can be working on….it’s more complicated, and some SEOs won’t survive this game…they’ll continue to aim for short term gains on short tail keyword phrases…and they’ll do things in bulk….and their network will be filtered, and possibly penalized.

Every website owner has to measure the risks, and the time involved, and the expected ROI….it’s not a cheap game any more….doing real marketing involves brains and not buttons…if you can’t invest in really building something “special” (ideally many special things), on your site to get signals (links/social), then you’re going to find it pretty hard to get links that look natural and don’t run a risk of getting penalized.  The SEO game has really matured, the other option is to take a high risk of penalization.

Aaron: In terms of disavow, how deep does one have to cut there?

Jim: as deep as it needs to be to remove every unnatural link. If you have 1000 backlinks and 900 are on pages that were created for unnatural purposes (to give links), then all 900 have to be disavowed…if you have 1000 backlinks, and only 100 are not “natural”, then only 100 need to be disavowed… what percent has to be disavowed to untrip an algorithmic filter? I’m not sure…but almost always the links which I disavow have zero value (in my opinion) anyways.  Rip the band-aid off, get over it, take your marketing department and start doing real things to attract attention, and to keep it.

Aaron: In terms of recoveries, are most penalized sites “recoverable”? What does the typical recovery period look like in terms of duration & restoration?

Jim: oh…this is a bee’s nest you’re asking me….. are sites recoverable….yes, most….if a site has 1000 domains that link to it, and 900 of those are artificial and I disavow them, there might not be much of a recovery depending on what that 100 links left are….ie, if I disavow all link text of “green widgets” that goes to your site, and you used to rank #1 for “green widgets” prior to being hit by a Penguin update, then I wouldn’t expect to “recover” on the first page for that phrase….. where you recover seems to depend on “what do you have for natural links that are left after the disavow?”….the time period….well…. we’ve seen some partial recoveries in as soon as 1 month, and some 3 months after the disavow…and some we’re still waiting on….

To explain, Google says that when you add links to the disavow document, the way it works is that the next time Google crawls any page that links to you, they will assign a “nofollow” to the link at that time…..so you have to wait until enough of the links have been recrawled, and now assigned the nofollow, to untrip the filter….but one of the big problems I see is that many of the pages Google shows as linking to you, well, they’re not cached in Google!….I see some really spammy pages where Google was there (they record your link), but it’s like Google has tossed the page out of the index even though they show the page as linking to you…so I have to ask myself, when will Google return to those pages?…will Google ever return to those pages???  It looks like if you had a ton of backlinks on pages that were so bad in the eyes of Google that they don’t even show those pages in their index anymore…we might be waiting a long long time for Google to return to those pages to crawl them again….unless you do something to get Google to go back to those pages sooner (I won’t elaborate on that one).

Aaron: I notice you launched a link disavow tool & earlier tonight you were showing me a few other cool private tools you have for working on disavow analysis, are you going to make any of those other tools live to the public?

Jim: Well, we have about 12 internal private disavow analysis tools, and only 1 public disavow tool….we are looking to have a few more public tools for analyzing links for disavow analysis in the coming weeks, and in a few months we’ll release our Ultimate Disavow Tool…but for the moment, they’re not ready for the public, some of those are fairly expensive to run and very database intensive…but I’m pretty sure I’m looking at more link patterns than anyone else in the world when I’m analyzing backlinks for doing disavows. When I’m tired of doing disavows maybe I’ll sell access to some of these.

Aaron: Do you see Google folding in the aggregate disavow data at some point? How might they use it?

Jim: um…..I guess if 50,000 disavow documents have spammywebsite.com listed in their disavows, then Google could consider that spammywebsite.com might be a spammy website…..but then again, with people disavowing links who don’t know what they’re doing, I’m sure there’s a ton of great sites getting listed in disavow documents in Webmaster Tools.
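The aggregation Jim speculates about is easy to picture. Here is a toy Python sketch; the parsing follows Google’s published disavow file format (`domain:` prefixes, `#` comments, bare URLs), but the counting and the threshold are purely hypothetical, not anything Google has confirmed doing:

```python
from collections import Counter

def domains_from_disavow(text):
    """Extract the set of disavowed domains from one disavow document.

    Handles both 'domain:example.com' entries and full-URL entries;
    lines starting with '#' are comments per the disavow file format.
    """
    domains = set()
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):].strip())
        else:
            # Full URL entry: keep just the host part.
            host = line.split("//")[-1].split("/")[0]
            domains.add(host)
    return domains

def widely_disavowed(documents, threshold):
    """Domains listed in at least `threshold` distinct disavow documents."""
    counts = Counter()
    for doc in documents:
        counts.update(domains_from_disavow(doc))
    return {d for d, n in counts.items() if n >= threshold}

# Three hypothetical webmasters' disavow files:
docs = [
    "domain:spammysite.com\nhttp://goodblog.com/nice-post/",
    "# cleanup after penalty\ndomain:spammysite.com",
    "domain:spammysite.com\ndomain:otherspam.net",
]
print(widely_disavowed(docs, 3))  # only spammysite.com appears in all 3
```

Note how goodblog.com, disavowed once by a webmaster who “didn’t know what they’re doing,” never crosses the threshold, which is exactly the noise problem Jim raises.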

Aaron: When approaching link building after recovering from a penalty, how does the approach differ from link building for a site that has never been penalized?

Jim: it doesn’t really matter….if you were getting unnatural/artificial links or things in bulk in the past, then, yes, you have to stop doing that now…that game is over if you’ve been hit…that game is over even if you haven’t been hit….stop doing the artificial link building stuff. Get real citations from real people (often “by accident”) and you should be ok.

Aaron: You mentioned “natural” links. Recently Google has hinted that infographics, press releases & other sorts of links should use nofollow by default. Does Google aim to take some “natural” link sources off the table after they are widely used? Or are those links they never really wanted to count anyhow (and perhaps sometimes didn’t) & they are just now reflecting that.

Jim: I think most of these didn’t count for years anyways….but it’s been impossible for Google to nail every directory, or every article syndication site, or every press release site, or everything that people can do in bulk…and it’s harder to catch all occurrences of widgets and mentions of infographics…so it’s probably just a “Google scare”…ie, Google says, “Don’t do it, nofollow them” (and I think they say that because it often works), and the less of a pattern there is, the harder it is for Google to catch it (ie, widgets and infographics)…I think too much of any 1 thing (be it a “type of link”) can be a bad signal….as well as things like “too many links from pages that get no traffic”, or “no clicks from links to your site”. In most cases, because of keyword abuse, Google doesn’t want to count them…links like this may be fine (and ok to follow) in moderation…but if you have 1000 widget links, and they all have commercial keywords as link text, then you’re treading on what could certainly turn into a negative signal, and so then you might want to consider nofollowing those.

Aaron: There is a bit of a paradox in terms of scaling effective quality SEO services for clients while doing things that are not seen as scalable (and thus future friendly & effective). Can you discuss some of the biggest challenges you faced when scaling IMN? How were you able to scale to your current size without watering things down the way that most larger SEO companies do?

Jim: Scaling while keeping quality has certainly been a challenge in the past. I know that scaling content was an issue for us for a while….how can you scale quality content?….Well, we’ve found the answer is connecting real people…the real writers, the people with real social influence…to the brands we work with…..these real people then become “Brand Evangelists”…and by getting these real people who know what they’re talking about to write for our clients, we found that we could scale the content issue. We can scale things like link building by merging it with the other “mentions”, and specifically targeting industries and people and working on building up associations and relationships with others has helped us scale…plus we’re always building tools to help us scale while keeping quality. It’s always a challenge, but we’ve been pretty good at solving many of those issues.

I think we’ve been really good at scaling in-house….many content marketers are now more like community managers and content managers….we’ve been close to 100 employees for a few years now…so it’s more “how can we do more with the existing people we have?”…and we’ve been able to do that by connecting real people to the clients so we can actually have better content and better marketing around that content….I’m really happy that the # of employees has been roughly the same for the past few years, but we’re doing more business, and the quality keeps getting better….there aren’t as many content marketers today as there were a few years ago, but there are many more people working on helping authors build up their authorship value and produce more “great marketing” campaigns where, as a by-product, we happen to get some links and social citations.

Aaron: One of the things I noticed with your site over the past couple years is that the sales copy has promoted the fusion of branding and SEO. I looked at your old site in Archive.org over the years & have seen quite an amazing shift in terms of sales approach. Has Google squeezed out most of the smaller players for good, & does effective sustainable SEO typically require working for larger trusted entities? When I first got into SEO, about 80%+ of the hands in the audiences at conferences were smaller independent players. At the last conference I was at, it seemed that about 80% of the hands in the audience worked for big companies (or provided services to big companies). Is this shift in the market irreversible? How would you compare/contrast your approach in working with smaller & larger clients?

Jim: Today it’s down to “Who can really afford to invest in their brand?” and “Who can do real things to get real citations from the web?”….and who can think way beyond “links”…if you can’t do those things, then you can’t have an effective sustainable online marketing program…. we were a “link building company” for many, many years…. but for the past 3 years we’ve moved into full service, offering way more than “link building services”…. yea, SEO was about “links” for years, and it still is to a large degree….but unless you want to get penalized, you have to take the “it’s way more than links” approach… in order for SEO to work (w/o fear of getting penalized) today, you have to look at sending in natural signals…so you must do “natural” things…things that will get others “talking” about it, and about you….SEO has evolved a lot over the years….Google used to recommend 1 thing (create a great site and create great things), but for years we all knew that SEO was about links and anchor text….today, I think Google has caught up (to some degree) with the user, and with “real signals”…yesterday it was “gaming” the system….today it’s about doing real things…real marketing…and getting your name out to the community by creating great things that spread, and that get people to come back to your site….those SEOs and businesses who don’t realize that the game has changed will probably be doing a lot of disavowing at some point in the future, and many SEOs will be out of business if they think it’s a game where you can do “fake things” to “get links” in bulk….in a few years we’ll see which internet marketing companies are still around…those who are still around will be those who do real marketing, using real people and promoting to other real people…the link game itself has changed…in the past we looked at link graphs…today we look at people graphs….who is talking about you, what are they saying….it’s way more than “who links to me, and how do they link to me”….Google is turning it into “everyone gets a vote” and “everyone has a value”…and in order to rank, you’ll need real people of value talking about your site…and you’ll need a great user experience when they get there, and you’ll need loyal people who continue to return to your site, and you’ll need to continue to do great things that get mentions….

SEO is no longer a game of some linking algorithm, it’s now really a game of “how can you create a great user experience and get a buzz around your pages and brand”.

Aaron: With as much as SEO has changed over the years, it is easy to get tripped up at some point, particularly if one is primarily focused on the short term. One of the more impressive bits about you is that I don’t think I’ve ever seen you unhappy. The “I’m feeling lucky” bit seems to be more than just a motto. How do you manage to maintain that worldview no matter what’s changing & how things are going?

Jim: Well, I don’t always feel lucky…I know in 2008, when Google hit a few of our clients because we were buying links for them, I didn’t feel lucky (though the day before, when they ranked #1, I felt lucky)….but I’m in this industry for the long term…I’ve been doing this for almost 15 years….and yes, we’ve had to constantly change over the years, and continue to grow, and growing isn’t always easy…but it is exciting to me, and I do feel lucky for what I have…I have a job I love, I get to work with people whom I love, in an industry I love, I get to travel around the world and meet wonderful people and see cool places…and employ 100 people and win “Best Places to Work” awards, and I’m able to give back to the community and to society, and to the earth…those things make me feel lucky…SEO has always been like a fun game of chess to me…I’m trying to do the best I can with any move, but I’m also trying to think a few steps ahead, and trying to think what Google is thinking on the other side of the table…..ok…yea, I do feel lucky….maybe it’s the old hippy in me…I always see the glass half full, and I’m always dreaming of a better tomorrow….

If I can have lots of happy clients, and happy employees, and do things to make the world a little better along the way, then I’m happy…sometimes I’m a little stressed, but that comes with life….in the end, there’s nothing I’d rather be doing than what I currently do….and I always have big dreams of tomorrow that always make the trials of today seem worth it for the goals of what I want to achieve for tomorrow.

Aaron: Thanks Jim!


Jim Boykin is the CEO of Internet Marketing Ninjas, and a blogger and public speaker. You can find Jim on Twitter, Facebook, and Google Plus.