Google’s Matt Cutts: Our Algorithms Try To Break Black Hat SEOs’ Spirits
A couple of weeks ago, Google’s Matt Cutts was on This Week in Google (TWiG), and on episode 227 Matt had some interesting things to say. He said that Google specifically tries to break the spirits of black hat SEOs…
Big Data Marketing’s Next Frontier: Paid Search
PLAs and other paid search offer an excellent opportunity to capture more customers. Make all your time and money investments pay off by using a data-driven approach to deliver a premium onsite experience that matches the intent of your prospects.
Stolen Moose Heads: An Interview with innocent
In the last few weeks I’ve been putting together a new guide for our Distilled audience: Finding Your Brand’s Voice. It offers practical and detailed advice on how a company can shape its own tone of voice in order to project a unique and consistent persona.
Report: Google PLAs Deliver 4X Revenue Lift For Retailers In Early Holiday Season
It’s probably not too surprising to hear that retailers have doubled their year-over-year spend on Google product listing ads this holiday season thus far. After all, the paid format of Google Shopping had just rolled out in the fall of 2012. What is stunning, however, is the four-fold…
Please visit Search Engine Land for the full article.
A Look Back On One Crazy Year Of Link Building
If 2012 was the year of Google Algorithm updates — Moz counted 37 big ones compared to the 15 in 2013 and 21 in 2011 — 2013 was the year that link building suffered from a serious identity crisis. It was sidelined, stretched, swindled and spit back out again more times than your average…
Improving URL removals on third-party sites
Webmaster level: all. Content on the Internet changes or disappears, and occasionally it’s helpful to have search results for it updated quickly. Today we launched our improved public URL removal tool to make it easier to request updates based on changes…
Hit By Panda and Confused About Low-Quality Content? Run This Google Analytics Report Now
Getting hit by Google Panda can be confusing for many webmasters. But one important Google Analytics report can help Panda victims get on the right track, and quickly. The post includes detailed instructions for creating and exporting the report.
4 Trends Marketers Must Address In 2014: Audience, Relevance, Social & Mobile
Upon revisiting last year’s post on this topic, a couple of our bold predictions flopped and a couple turned out to be more reality than fantasy. Though we didn’t see Apple monetize the search results served up by Siri, we did see ad formats evolve to increase relevance and user…
Task Management Tools for Digital Marketers
5 of the best online task management tools for digital marketers
Post from Matt Beswick on State of Digital
Google Glass XE12 Adds iOS, Wink & Breaks Many GDK Apps
XE12 is out and with that comes support for the whole other side of the mobile world, iPhones. Google released the MyGlass App for iOS and quickly pulled it. Why…
Google Zeitgeist Live With Inspiring Video
I like to look back at things, when I have time. Google has published their 2013 Zeitgeist at google.com/zeitgeist.
It is obviously a nice way to look at all the keywords you may have missed out on. But obviously, you should not be targeting keyword…
8 Tips for Building Your Internal Content Marketing Strategy
It can be difficult to keep your own content marketing goals on track when competing with other business goals. But having a strategy helps. Here’s how to create a content strategy that will power you toward where you need to go.
How to build and run an SEO Company: @kaiserthesage asks @SEO_Hacker
I’ve known Sean Si ever since I started blogging, and he’s one of my closest friends in the industry. We used to work together for a few campaigns (I was part of the SEO-Hacker team when it was just starting).
We’ve also been exchanging a lot of ideas about the practice as well as the business end of SEO for the past 3 years now. So I’m also certain that you guys will learn a lot from him too.
The post How to build and run an SEO Company: @kaiserthesage asks @SEO_Hacker appeared first on Kaiserthesage.
The Pros and Cons of Big Data Democratization
Challenges surrounding data democratization abound. Business leaders need to carefully weigh these arguments for and against conservative and liberal data democratization to determine which approach benefits their organization the most.
Everything you type is recorded. Even if you don’t post.

We collected data from 3.9 million users over 17 days and associate self-censorship behavior with features describing users, their social graph, and the interactions between them. Our results indicate that 71% of users exhibited some level of last-minute self-censorship…
The above is an excerpt from a paper titled “Self Censorship on Facebook” written by Sauvik Das from Carnegie Mellon University and Adam Kramer from Facebook.…
The post Everything you type is recorded. Even if you don’t post. appeared first on DEJAN SEO.
Are Google’s changes improving the search experience?
With so many tweaks, as well as major updates, is Google improving the search experience for the user?
Julia Logan, Irish Wonder:
I think it has become clearer than ever before that Google’s ‘improving the user search experience’ mantra is nothing more than PR talk.
Google is a commercial entity, and every step it takes now shows clearly that all it cares about is finding more ways to monetise.
Dr Pete Meyers, Marketing Scientist at Moz:
I think too many of the changes have been reactionary in 2013, and many have been driven by fear of losing revenue.
That’s not necessarily to say that those changes are bad for users, but the rapid pace of change hasn’t always left time to evaluate or understand how back-to-back changes interact for everyday users.
Will Critchlow, Founder and CMO at Distilled:
I think it’s a bit of both. Things that benefit regular users in my opinion:
- Knowledge graph / cards / one-boxes. Almost all are great for users.
- Improved query understanding benefits most regular users (though power users sometimes miss ‘verbatim as default’).
- Most social results have improved.
- Most fresh result and news-based results are useful.
Things that confuse and / or damage UX for regular users in my opinion:
- The rapidly-disappearing ad labelling. The background colour has been getting lighter and lighter and now it looks like it will be replaced by a small icon. I predict fewer users will be aware when they are clicking on an ad.

- The same applies to paid inclusion in product search. Not knowing if you are clicking on an ad or not is a bad UX.
Although I understand why Google is doing it, I believe that some of the penalties we have seen this year have made the search results worse for regular users. When big brands and sites that should be the right answer manipulate the results and get downgraded as a result, that hurts UX, a trade-off Google appears happy to make in the short term.
Kevin Gibbons, UK MD at Blueglass:
Google always likes to keep us on our toes! It will always be looking to improve the search experience for the user, and that can’t be a bad thing.
I think one of the biggest shifts in mindset in 2013 has been that marketers are aligning their strategies much closer with customers and less with search engines.
Andrew Girdwood, Media Innovations Director at LBi:
Google is improving the search experience. The search engine is better than ever before. Search results are better and the search experience feels more appropriate for a wider number of devices.
I’m not a Google slave. My default search engine in my web browser is Bing (ever since Google Reader was shut down; what a great prompt for me to try the competition).
I enjoy the GUI improvements made this year and appreciate Google-as-a-destination when I’m searching for information on my smartphone.
Teddie Cowell, Director of SEO, Mediacom:
There is a lot of change going on at the moment, it’s true, but search is undergoing a fairly abrupt metamorphosis: from something relatively basic in function, dishing up lists of links, into something quite amazing that can give individuals contextually relevant information, in more useful forms and via more natural interactions than simply typing requests into a web browser.
I see Google as leading the charge with the innovation and experimentation around this transformation from a caterpillar to a butterfly.
Yes, some of the ideas may prove flawed and some of the experiments may fail, but it’s a critical stage in the evolution of search, so we’ll have to live with the changes for a while yet.
Richard Baxter, CEO at SEOGadget:
Google has got to fiddle. What I like about Google is it genuinely bases the entire process of search updates on what’s best for the user. This classic insight from Danny Sullivan demonstrated just how much each change was scrutinised by the group, with the core success metrics being around improvements to user experience.
That’s just organic search though, and I can’t help but feel that when the other guys get involved (local, paid, video) that something’s not quite right.
Case in point: the search results for car insurance are pretty much entirely paid above the fold, with that huge sponsored quote feature (which isn’t awesome – I’ve used it!).
It’s the same with flights – do the search and you get this box with a list of cities that have routes from London: “272 cities with non-stop flights” – it takes up so much space.
How Google decided this was good for the user I have no idea.

Jimmy McCann, Head of SEO at Search Laboratory:
If you’re doing things the right way you’re going to welcome the fiddling rather than condemn it. Updates and tweaks are the only way for Google to get better and the stuff it’s doing with Schema and other projects is making the search results better.
I certainly don’t agree with the doom and gloom merchant outlook that Google is moving everything toward paid.
2014: Are you Winning an Award in Iceland?
EU Search Awards – Will you be Winning in Iceland?
Post from Bas van den Beld on State of Digital
The Biggest SERP Flux Since Penguin 2.0
Algoroo has measured the highest recorded level of SERP flux in Google since Penguin 2.0. The jump happened on the 17th of December, following a week of intermittent fluctuation in results.
Our team has already investigated the cause and there doesn’t seem to be a reason to believe this was a layout change or a technical error.…
The post The Biggest SERP Flux Since Penguin 2.0 appeared first on DEJAN SEO.
11 Untapped Content Promotion Strategies
Looking for more traffic and inbound links from your content marketing? Then it’s time to promote it. Here are 11 untapped strategies that you can use to see better results from every blog post, video and guide that you publish.
Easing the Pain of Keyword Not Provided: 5 Tactics for Reclaiming Your Data
Posted by timresnik
October 18th, 2011, the day Google announced “Secure Search,” was a dark day for many search marketers. We had hope, though; we were told only a small fraction of search referrals from Google would apply. This was proven false in just a few weeks as (not provided) quickly hit 10+% for many sites. Then, a year later, seemingly out of the blue, Google started to encrypt almost all searches. Today, we are approaching the dreaded extinction of Google organic keyword data:

Oh keywords, how I will miss thee.
Knowing the keywords that send us traffic from Google Search has always been a major pillar on which search marketers execute and measure the effectiveness of an SEO strategy. With Google “Secure Search” and keywords being stripped from the referral string, it’s starting to look more like a crutch—or worse, a crutch that will very soon no longer exist at all. Here are five ideas and two bonus resources to help nurse keyword targeting and search ROI back to health. Will they solve all your problems? No. Will they inform a direction for future “provided” solutions? Maybe. Are they better than nothing? Most definitely.
1. Use custom variables to tag content with categories/topics
Most web analytics software allows site owners to pass custom variables through. In Google Analytics, a custom variable can be inserted into your code, and as the name implies, you can pass custom name/value pairs of your choice. It’s one of the most useful analytics tools for web traffic segmentation, with many different applications. Mix this functionality with categories, topics or tags from a page on your site and you can now analyze your organic web traffic based on those variables. If you are disciplined and creative in understanding and tagging your content, you will gain insight into which topics are sending you traffic.
If you have some programming chops and can extract these variables from your CMS yourself and append them to your tracking code, more power to you! If not, and you are a WordPress user, I have some good news: There is a free plugin from our friends at Yoast. Install it and then simply select the following:

Once it is in GA there are several ways to get at the data. One is to simply go to Acquisition > Channels > Organic Search, then select the primary dimension of “landing page” and the secondary dimension with your custom variable. You now have a list of your landing pages that received organic traffic and the categories/tags related to each. Valuable stuff.
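Once you’ve exported that landing page/custom variable report, aggregating it outside of GA is trivial. Here’s a minimal sketch, assuming a hypothetical CSV export with columns named landing_page, topic, and sessions (adjust to match your actual export):

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical GA export: landing page, custom-variable topic, organic sessions.
SAMPLE = """landing_page,topic,sessions
/blog/panda-recovery,seo,120
/blog/pla-guide,ppc,80
/blog/link-audit,seo,45
"""

def sessions_by_topic(rows):
    """Sum organic sessions per custom-variable topic."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["topic"]] += int(row["sessions"])
    return dict(totals)

rows = csv.DictReader(StringIO(SAMPLE))
print(sessions_by_topic(rows))  # {'seo': 165, 'ppc': 80}
```

The same grouping works on any tag or category field you pass through, which is the whole point of the custom-variable approach: you analyze topics, not keywords.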

If you want some ideas of what tags you should be using, there are several auto-tag generator plugins for WordPress, Zemanta being one.
Requirements:
- Programming chops or WordPress and Google Analytics
- Being disciplined about entering tags and categories
Watch out:
- It’s human-powered, for better or for worse, and your data is only as good as the humanoid at the controls of your CMS
- Doesn’t help for long-tail targeting and reporting
2. Combining rank data with landing pages from Google Analytics
We can recapture some Google keywords by joining our rankings and analytics data. Download your rankings data from your favorite rankings tool; the more data you have the better. In Google Analytics, go to Channels > Organic Search > Source = Google and add the secondary dimension of “Landing Page.” View the maximum number of rows and download the data into a CSV. Put your data in two separate tabs in a spreadsheet. Now, all you need to do is join the keywords from the rankings tab with the keywords from the analytics tab. This can be done using VLOOKUP. While you’re at it, add the ranking data to the analytics tab. The end result will look like this:

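If spreadsheets aren’t your thing, the same join can be scripted. A sketch under assumed column names (keyword, landing_page, rank in the rankings export; landing_page, sessions in the GA export — rename to match your own files):

```python
import csv
from io import StringIO

# Hypothetical rankings-tool export.
RANKINGS = """keyword,landing_page,rank
panda recovery,/blog/panda-recovery,4
link audit,/blog/link-audit,9
"""

# Hypothetical GA organic landing-page export.
ANALYTICS = """landing_page,sessions
/blog/panda-recovery,120
/blog/link-audit,45
"""

def join_on_landing_page(rankings_csv, analytics_csv):
    """Attach each ranking keyword to the organic sessions of its landing page."""
    sessions = {r["landing_page"]: int(r["sessions"])
                for r in csv.DictReader(StringIO(analytics_csv))}
    joined = []
    for r in csv.DictReader(StringIO(rankings_csv)):
        joined.append({
            "landing_page": r["landing_page"],
            "keyword": r["keyword"],
            "rank": int(r["rank"]),
            "sessions": sessions.get(r["landing_page"], 0),
        })
    return joined
```

Unlike VLOOKUP, a scripted join naturally keeps every keyword that ranks for a page rather than just the first match.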
Requirements:
- Rankings data
- Google Analytics data
- Basic Excel or Google Spreadsheet skills
Watch out:
- Using the method above with VLOOKUP will only return one keyword per landing page. With some crafty Excel work, you can figure out how to get all the keywords for that page
3. Site search: what users are searching for on your site
If you get enough people using the search feature of your site, it can be a gold mine for keyword data. After all, this keyword data will always be “provided.” Configuring Google Analytics to capture your internal search traffic is pretty straightforward. Once you have done so, you will be able to see the top keywords people are searching for on your site.
Step 1: Open the Google Analytics profile you want to set up Site Search for
Step 2: Navigate to Admin > Settings and scroll to the bottom for “Site Search Settings.” Enter the parameter that is designated for a search query on your site; for example, /search_results.php?q=keyword. If you use a POST-based method and do not pass a parameter through in the URL, you can either configure your application to append one, or trigger a virtual pageview in your Google Analytics snippet, such as:
analytics.js: ga('send', 'pageview', '/search_results.php?q=keyword')
The category option allows you to look for an additional query parameter that can later be used to group the site search data. For example, if you had search on your site in different sections that you wanted to keep separate: help, content, documentation, etc.

Step 3: Let GA collect some data for a day or so and check out your results. Navigate to Behavior > Site Search > Search Terms to see a complete list of what users search for on your site. To dig deeper, add the secondary dimension of “destination page” to see where the user landed after seeing the search results. Then, be sure to check out the secondary dimension of “search refinement” to see which keywords your users searched for after they searched for the original term. This can clue you in to gaps between what people are looking for and what they are actually finding on your site.
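If you’d rather tally terms from raw logs or an export yourself, extracting the query parameter is a one-liner with the standard library. A sketch, assuming the parameter name “q” matches your Site Search setting:

```python
from urllib.parse import urlparse, parse_qs
from collections import Counter

# Hypothetical internal-search page paths (from logs or a GA export);
# the "q" parameter name must match your Site Search configuration.
PAGES = [
    "/search_results.php?q=keyword+research",
    "/search_results.php?q=link+building",
    "/search_results.php?q=keyword+research",
]

def count_search_terms(paths, param="q"):
    """Tally internal search terms by extracting the query parameter."""
    terms = Counter()
    for path in paths:
        qs = parse_qs(urlparse(path).query)
        for term in qs.get(param, []):
            terms[term] += 1
    return terms

print(count_search_terms(PAGES).most_common(1))  # [('keyword research', 2)]
```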

Requirements:
- A search box on your site
- Google Analytics
Watch out:
- It’s a limited data set (on Moz, only about half of one percent of visits end up using our search)
4. Google (and Bing) Webmaster Tools
Google has created the headache with “Not Provided,” but it has also given us a bit of medicine in the form of Webmaster Tools. Released a few years back within Webmaster Tools, “Search Queries” provides webmasters with some basic information about their keywords, including average position, impressions, number of clicks, and click-through rate (CTR).

This data should be used, but has a few major limitations. First, only a small, Google-selected subset of the keywords is represented. There is no transparency about how or why they select the keywords, so using it to measure results of specific content optimization efforts can be inaccurate and even misleading.
Second, the data is limited to 90 days. If you ranked for a query 91 days ago, you’ll never know. Webmaster Tools also has an API, but unfortunately the “search queries” data isn’t available through it yet. According to Mr. Cutts, that is imminent. If you want to store your data for longer than 90 days and know how to program, you can use this PHP library or this Python library.
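Even without the API, you can beat the 90-day window by archiving the CSV downloads on a schedule. A stdlib-only sketch, assuming a hypothetical download layout of query, impressions, clicks:

```python
import csv
import sqlite3
from io import StringIO

# Hypothetical Webmaster Tools "Search Queries" CSV download.
DOWNLOAD = """query,impressions,clicks
link building,1000,50
panda recovery,400,12
"""

def archive(conn, snapshot_date, csv_text):
    """Append one dated snapshot so history survives past the 90-day window."""
    conn.execute("""CREATE TABLE IF NOT EXISTS search_queries
                    (snapshot_date TEXT, query TEXT,
                     impressions INTEGER, clicks INTEGER)""")
    for row in csv.DictReader(StringIO(csv_text)):
        conn.execute("INSERT INTO search_queries VALUES (?, ?, ?, ?)",
                     (snapshot_date, row["query"],
                      int(row["impressions"]), int(row["clicks"])))
    conn.commit()

conn = sqlite3.connect(":memory:")
archive(conn, "2013-12-20", DOWNLOAD)
```

Run it weekly (the file here uses an in-memory database for illustration; point it at a file on disk in practice) and you build the longitudinal query history Webmaster Tools won’t keep for you.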
Finally, there is a limitation in how you can use Webmaster Tools data in Google Analytics. The good news is that you can integrate this data into Google Analytics with some basic authentication between the services. The bad news is that you can only segment the data in Google Analytics with 2 dimensions: country and Google property. Joining this data with behavior, demographics, goals, etc. would be extremely valuable.

Requirement:
- A Google Webmaster Tools account
Watch out:
- The limitations noted above: a Google-selected keyword subset, the 90-day window, and limited segmentation in Google Analytics
5. Deeper topical analysis
Avinash Kaushik, one of my favorite speakers at MozCon this year, wrote about understanding the “personality” of the page as a future solution for “not provided”. He says:
“I wonder if someone can create a tool that will crawl our site and tell us what the personality of each page represents. Some of this is manifested today as keyword density analysis (which is value-deficient, especially because search engines got over “density” nine hundred years ago). By personality, I mean what does the page stand for, what is the adjacent cluster of meaning that is around the page’s purpose? Based on the words used, what attitude does the page reflect, and based on how others are talking about this page, what other meaning is being implied on a page?”
I think this could be accomplished by performing topical analysis on the body content of pages as they are published and then passing the results through to Google Analytics with custom variables, similar to what I described above with categories. This could be done using DBpedia and one of the open-source annotation applications built on it, such as DBpedia Spotlight. Spotlight detects mentions of terms in your content and scores the relevance of those mentions against structured data created from Wikipedia. Once the topics of the page are “extracted” and passed to your web analytics platform, you’ll be able to use them as a dimension against organic search referrals to landing pages. (Thanks to Jay Leary for walking me through Spotlight.)
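To make that concrete, here’s a sketch of the extraction half: turning a Spotlight-style annotation response into readable topic labels you could pass on as custom variables. The response below is a trimmed, hypothetical example; the field names (“Resources”, “@URI”, “@similarityScore”) follow Spotlight’s documented JSON output, but verify them against the version you deploy:

```python
import json

# Trimmed, hypothetical DBpedia Spotlight annotate response.
RESPONSE = json.loads("""{
  "Resources": [
    {"@URI": "http://dbpedia.org/resource/Search_engine_optimization",
     "@similarityScore": "0.93"},
    {"@URI": "http://dbpedia.org/resource/Web_analytics",
     "@similarityScore": "0.88"}
  ]
}""")

def extract_topics(response, min_score=0.5):
    """Turn Spotlight resource URIs into readable topic labels,
    keeping only confident matches."""
    topics = []
    for res in response.get("Resources", []):
        if float(res["@similarityScore"]) >= min_score:
            label = res["@URI"].rsplit("/", 1)[-1].replace("_", " ")
            topics.append(label)
    return topics

print(extract_topics(RESPONSE))  # ['Search engine optimization', 'Web analytics']
```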
Bonus: some other “not provided” resources
Mike King is not too worried about “Not Provided.” His deck argues we should be focusing on segmenting our data by personas and affinity groups, and paying more attention to “implicit” rather than “explicit” intent. Good stuff.
Ten industry experts, including two Mozzers, weigh in here and answer a series of questions on the “Not Provided” landscape, including tools and techniques that they use, and even a few “Top Tips for 2014.”
Conclusion
Keyword data from Google organic search is owned and controlled by Google and can never be replaced. Secure Search is here to stay and nearing 100%. There is no cure-all solution. That being said, search marketers are a GSD and generous group, and will continue to hack away at the problem and share solutions. What are some of the data sources and hacks you are using to deal with “not provided?” Are there future algorithmic solutions to this problem, or are we doomed to have to take our Google medicine and be happy with what they decide to provide in Webmaster Tools?
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
