Retailers Promote Over 1 Billion Products With Google PLAs, Holiday Surge Expected [Marin]

Today, Marin Software released a new study finding that Google Product Listing Ads (PLAs) continue to see increased adoption among retailers. According to the study, over one billion products were being promoted through PLAs as of May 2013. PLA impression…

Global Issues Shift Global SEO & SEM Budgets Even Faster To Asia

It’s always interesting to contemplate how global events affect us, even in our own back yards — and turbulent times have definitely impacted online marketing budgets. An analysis of several hundred clients with online marketing initiatives spanning over 100 regions around the world…

Please visit Search Engine Land for the full article.

9 Things We Should Never Stop Doing in Link Building

A couple of months ago, I went on a little rant. (It happens sometimes; I’m looking into it.) I was overwhelmed by the responses that little column generated, with others picking up my rally cry to eliminate super shady link building practices. But commenters were also quick to point out that…


Live @ SMX East: What Are The Most Important Search Ranking Factors?

Google tells us they make as many as 600 changes to their ranking algorithms every year. Some are major, typically named after disarmingly cute animals that can nonetheless severely damage or even destroy hard-won search result placement. Other changes…

Google Dwarfs Bing & Yahoo As Traffic Source For Major News Sites

Google is said to have about 65-70 percent market share of searches in the U.S., but for many publishers, Google’s share of incoming search traffic is much higher. That’s certainly the case with major news sites like Reuters, Mashable, Dallas Morning News, The Next Web and others that…


SEO in 2013: Do’s and Don’t Do’s

Don’t be stuck in your ways when it comes to SEO tactics. Read this post to learn which tactics you should drop, and which you should be embracing.

Post on State of Search: SEO in 2013: Do’s and Don’t Do’s

What SEO Bloggers Should and Should Not Share

There are too many of us out here blogging about search engine optimization, social media optimization, conversion rate optimization, optimization optimization, and cute gerbils on hang gliders. It’s become a huge problem because, frankly, people just do not pay attention to what is happening. Here is some tough love for all you SEO bloggers.

If You Share the Emails You Send Out…

I just read a great “how to” blog post that provides explicit details on the kinds of outreach emails that work and when to use them. There were not that many comments on the article, but they were all pretty much in the “great post! can’t wait to destroy the value in this idea” category. You know, if your agency has developed a formula for reaching out to bloggers for links, the dumbest thing you can do is tell the rest of the SEO world what works and what doesn’t. Your “loyal readers”, the people whom you are trying to impress, will take your email templates, use them, and burn them, so that in a few months everyone who blogs about anything from knitting needles to cat pictures will be sick and tired of seeing these […]

The Benefits Of Thinking Like Google

Shadows are the color of the sky.

It’s one of those truths that is difficult to see, until you look closely at what’s really there.

To see something as it really is, we should try to identify our own bias, and then let it go.

“Unnatural” Links

This article tries to make sense of Google’s latest moves regarding links.

It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines.” I wrote on this topic, too.

Those with a vested interest in the link building industry – which is pretty much all of us – might spot the problem.

Google’s negative emphasis, of late, has been on links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: “any link you build for the purpose of manipulating rank is outside the guidelines.” Google have never encouraged activity that could manipulate rankings, which is precisely what those building links for the purpose of SEO attempt to do. Building links for the purpose of higher rank AND staying within Google’s guidelines will not be easy.

Some SEOs may kid themselves that they are link building “for the traffic”, but if that were the case, they’d have no problem insisting those links were scripted so they could monitor traffic statistics, or at the very least no-followed, so there could be no confusion about intent.

How many do?

Think Like Google

Ralph Tegtmeier: In response to Eric’s assertion “I applaud Google for being more and more transparent with their guidelines”, Ralph writes: “Man, Eric: isn’t the whole point of your piece that this is exactly what they’re NOT doing, becoming ‘more transparent’?”

Indeed.

In order to understand what Google is doing, it can be useful to downplay any SEO bias (i.e., what we may like to see from an SEO standpoint) and instead try to look at the world from Google’s point of view.

I ask myself “if I were Google, what would I do?”

Clearly I’m not Google, so these are just my guesses, but if I were Google, I’d see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can’t be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.

I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.

Do I (Google) need content? Yes, I do. Do I need any content? No, I don’t. If anything, there is too much content, and a lot of it is junk. In fact, I’m getting more and more selective about the content I do show. So selective, in fact, that a lot of what I show above the fold is controlled and “published”, in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.

It is useful to put ourselves in someone else’s position to understand their truth. If you do, you’ll soon realise that Google aren’t the webmaster’s friend if your aim, as a webmaster, is to do anything that “artificially” enhances your rank.

So why are so many SEOs listening to Google’s directives?

Rewind

A year or two ago, it would have been madness to suggest webmasters would pay to remove links, but that’s exactly what’s happening. Not only that, webmasters are doing Google’s link quality control. For free. They’re pointing out the links they see as being “bad” – links Google’s algorithms may have missed.

Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.

It is understandable that site owners don’t spend much time removing links. First, from a site owner’s perspective, taking links down involves a time cost with no benefit to them, especially if they receive numerous requests. Second, taking down links may be perceived as an admission of guilt. Why would a webmaster admit their links are “bad”?

The answer to this problem, from Google’s John Mueller is telling.

A shrug of the shoulders.

It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.

Does anyone really think their rankings are going to return once they’ve been flagged?

Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.

If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters’ lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is that you can’t just put up a list of links you want to disavow.

Say what?

No, you need to show you’ve made some effort to remove them.

Why?

If I were Google, I’d see this information supplied by webmasters as being potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it’s probably not telling Google much Google doesn’t already know. There’s been no manual link review.
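For context, the disavow file Google eventually accepts is just a plain text file, one entry per line. A minimal sketch (all domains and URLs below are made up):

```text
# Contacted owner of spamdomain1.example on 2013-07-01, no reply
domain:spamdomain1.example

# Individual URLs the site owner refused to remove
http://spamdomain2.example/links/page1.html
http://spamdomain2.example/links/page2.html
```

Lines starting with # are comments, a domain: line disavows every link from that domain, and a bare URL disavows only that page. Which underlines the point: a raw dump like this, without the comments documenting outreach, tells Google little it doesn’t already know.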

So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?

It doesn’t.

It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?

A Better Use Of Your Time

Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?

Maybe.

But I think it’s the wrong question.

It’s the wrong question because it’s just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place for the purpose of achieving higher rank are going to need to be no-followed in order to be clear about intent. Extreme? What’s going to be the emphasis in six months’ time? Next year? How do you know what you’re doing now is not going to be frowned upon, then need to be undone, next year?

A couple of years ago it would have been unthinkable that webmasters would report and remove their own links, even pay to have them removed, but that’s exactly what’s happening. So, what is next year’s unthinkable scenario?

You could re-examine the relationship and decide that what you do on your site is absolutely none of Google’s business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don’t have authority over you unless you allow it. Can they ban your site because you’re not compliant with their guidelines? Sure, they can. It’s their index. That is the risk. How do you choose to manage this risk?

It strikes me you can lose your rankings at any time whether you follow the current guidelines or not, especially when the goal-posts keep moving. So the risk of not following the guidelines and the risk of following the guidelines but not ranking well are pretty much the same – no traffic. Do you have a plan to address the “no traffic from Google” risk, however that may come about?

Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does – great – until next time, that is.

What’s your long term “traffic from Google” strategy?

If all you do is “follow Google’s Guidelines”, I’d say that’s now a high risk SEO strategy.


Relationships between Search Entities

When I talk about, or write about entities, it’s normally in the context of specific people, places, or things. Google was granted a patent recently which discusses a different type of entity, in a more narrow manner. These entities are referred to as “search entities”, and the patent uses them to predict probabilities and understand […]

The post Relationships between Search Entities appeared first on SEO by the Sea.

Comparing the Google+ and Google Places Page Management Interfaces

Posted by David Mihm

Caveat: I am definitely not a professional interface designer; this task I leave largely to the experts on our UX & Design team. My goal behind this post, however, is to increase usage of Places for Business and to raise the visibility of that destination among the small-business-focused marketing community.

Setting aside the difficulty that Google had integrating Zagat into its product mix, its own branding difficulties in the Local space have been well-chronicled. Following the zigzag from Local to Maps to Places to Places-with-Hotspot, back to just Places, then to Plus-Local, and (finally?) plain ol’ Plus has been like observing a misguided exercise in corporate alligator escapism.

Although the result of this hodgepodge of brands appears largely the same to consumers, who probably weren’t all that keyed into the evolution anyway, Google’s ill-defined brand in Local has almost certainly been a contributing factor to its deficit in business owner engagement relative to Facebook.

It’s just not clear to the average brick-and-mortar business owner, let alone the average SEO, where she should go to get started at Google. While Google’s “first responders” in the support forums have been darned consistent in their mantra of using Places for Business to manage this presence, this destination gets very little love in Google’s mainstream advertising — or even AdWords. It’s impossible to get to from Google’s primary business-oriented pages, and a number of searches (including “Google Plus Local Page“) return this answer.

Which is a shame, because the Plus management interface offers a vastly inferior experience for business owners. Although I recommended it last year, here’s why I no longer encourage business owners (or SEOs) to use it, and why I’ve come around to places.google.com.

The deficiencies of the Google+ page management interface

1. No UI hierarchy

This interface is a jumble of Pinterest-like modules, with none more or less important than the others. If I were to answer my own question (“What am I supposed to look at?”), my natural inclination would seem to be the big green box in the middle — “Start a video call with your followers.” Hardly something the average business owner is going to have time for or get any value out of.

Meanwhile, attributes that are core to a business’s success (categories, hours, location information) are hidden behind a white-on-white button, and my natural primary activity (posting as my business rather than as myself) is easy to miss when juxtaposed alongside the “green monster.” It’s no wonder that even LinkedIn beats Google+ for social sharing.

2. Mis-targeting the average SMB

The eager-beaver SMBs who explore the navigation beyond the first page are likely to find themselves pretty lost. They’re asked to install plugins and buttons, and even to connect to the Google APIs console (while being consoled that it’s only a 3-step process). Something like 50% of this audience doesn’t even have a website, and 90% don’t have a mobile website, for goodness’ sake.

3. Slightly misleading insights

The Places dashboard hasn’t exactly been a paragon of useful information, but my main complaint with this tab is presentation rather than data. There’s actually quite a bit of useful information here, but unfortunately it’s hidden in the default view. “Actions” and “Views” are presented flatly, where a view of a post is treated with the same importance as a click for driving directions or a click through to the business’s website. So a business is likely to miss out on what are actually some pretty important metrics, or at least see some inflated numbers.

4. No help

The only way to get help with this far-from-simple product is to click first into settings, and then into “Learn More” on the section that you’re interested in.

The strengths of the Places management interface

1. Extremely clear messaging

Strong calls to action pop right off the page here: the green-backgrounded “Complete your business information,” the blue-backgrounded “Edit information,” and even the boringness of the grayed-out “Add photo” area all point directly to what Google and the SMB are both trying to accomplish with this product.

2. Perfect targeting of the average SMB

It’s evident that the designers of the Places Dashboard have spent plenty of time watching business owners using their product. Clicking the question mark just once brings up tooltips alongside all the major sections of the tool. Not only does this decrease the number of questions Google is likely to receive from business owners, but it answers those questions in a clear, friendly tone that gives less-sophisticated owners a great first impression of Google’s products.

3. Clear(er) insights

This simplistic interface is very transparent about the data it’s showing (number of times this listing appeared in a local search result), and presents a much more representative view of a business’s presence at Google (my page only has 3 actions) without overcomplicating the situation for the business owner.

4. Terrific tooltips and inline help text

Here’s where the experience of the Places team really shines through: They don’t take any pre-existing knowledge of how business listings work for granted, walking the business owner through every step of the page-creation process.

5. Phone support (!)

And of course, if a business owner isn’t able to figure things out on their own, there are plenty of relevant links directly to the most-commonly asked questions, and the process highlights Google’s revolutionary option of phone support.

Conclusion

Given how much effort has been put into the Local Business Center / Places for Business Dashboard over the last several years — and the extremely polished result those efforts have yielded — I’m surprised Google continues to throw any energy into promoting the Plus management option to small businesses, let alone developing and maintaining it.

Any business owner who visits Plus should be sent right over to the Places for Business Dashboard. It seems to be much more empathetic to the typical business owner’s level of sophistication, and solves their most important needs more directly than Plus.


Winning Strategies to Lose Money With Infographics

Google is getting a bit absurd in suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed a negative ROI on each investment – until your company goes under.

Some will characterize such advice as taking a “sustainable” and “low-risk” approach, but such strategies are only “sustainable” and “low-risk” so long as ROI doesn’t matter & you are spending someone else’s money.

The advice on infographics in the above video suggests that embed code by default should include nofollow links.
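For illustration, an embed snippet following that advice would look something like this (the domain, paths, and dimensions are placeholders):

```html
<!-- Embed code offered to syndicating sites: the credit link
     back to the original source carries rel="nofollow" -->
<a href="http://example.com/infographic" rel="nofollow">
  <img src="http://example.com/images/infographic.png"
       alt="Example infographic" width="600" />
</a>
```

That nofollow on the credit link is exactly what’s at issue below: the syndicating site gets the content, while the creator forgoes the link equity.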

Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.

If you run smaller & lesser known websites, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.

So if your own featured premium content – content you dropped 4 or 5 figures on – gets counted as duplicate content AND you don’t get links out of it, how exactly does the investment ever have any chance of backing out?

Sales?

Not a snowball’s chance in hell.

An infographic created around “the 10 best ways you can give me your money” won’t spread. And if it does spread, it will be people laughing at you.

I also find the claim that people putting something 20,000 pixels tall on their site are not actively vouching for it a bit disingenuous. If something were crap and people still felt like burning 20,000 pixels syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.

Many dullards in the SEO industry give Google a free pass on any & all of their advice, as though it is always reasonable & should never be questioned. And each time it goes unquestioned, the ability to exist in the ecosystem as an independent player diminishes as the entire industry moves toward being classified as some form of spam & getting hit or not depends far more on who than what.

Does Google’s recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.

No-cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.

If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!

Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.

It was revealed in the WSJ that in 2009 Twitter’s internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 “spam science” programmers on staff in 2012.

With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.

A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link – and they can do so with zero risk.

@phillian Like all empires, ultimately Google will be the root of its own demise. — Cygnus SEO (@CygnusSEO) August 13, 2013

I just linked to Twitter twice in the above embed. If those links pointed directly at Cygnus, it might have been presumed that either he or I was a spammer; but put the content on Twitter – with 143,199 Tweets in a second – and those links are legit & clean. Meanwhile, fake Twitter accounts have grown to such a scale that even Twitter is now buying them to try to stop them.

Typically there is no presumed intent to spam so long as the links are going into a large site (sure, there are a handful of token counter-examples shills can point at). By and large it is only when the links flow out to smaller players that they are spam. And when they do, they are presumed to be spam even if they point to featured content that cost thousands of dollars. You better use nofollow, just to play it safe!

That duality is what makes blind, unquestioning adherence to Google scripture so unpalatable. A number of people are getting disgusted enough by it that they can’t help but comment on it: David Naylor, Martin Macdonald & many others, as DennisG highlighted.

Oh, and here’s an infographic for your pleasurings.
