Google Mobilepocalypse Update
A day after the alleged major update, I thought it would make sense to highlight where we are at in the cycle.
Yesterday Google suggested their fear messaging caused 4.7% of webmasters to move over to mobile friendly design since the update was origin…
Consensus Bias
The Truth About Subjective Truths
A few months ago there was an article in New Scientist about Google’s research paper on potentially ranking sites based on how factual their content is. The idea is generally and genuinely absurd.
- You can’t copyright facts, which means that if this were a primary ranking signal & people focused on it then they would be optimizing their site to be scraped-n-displaced into the knowledge graph. Some people may sugar coat the knowledge graph and rich answers as opportunity, but it is Google outsourcing the cost of editorial labor while reaping the rewards.
.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f— dan barker (@danbarker) February 27, 2014
- If Google is going to scrape, displace & monetize data sets, then the only ways to really profit are:
- focus on creating the types of content which can’t be easily scraped-n-displaced, or
- create proprietary metrics of your own, such that if they scrape them (and don’t cheat by hiding the source) they are marketing you
- In some areas (especially religion and politics) certain facts are verboten & people prefer things which provide confirmation bias of their pre-existing beliefs. End user usage data creates a “relevancy” signal out of comfortable false facts and personalization reinforces it.
- In some areas well known “facts” are sponsored falsehoods. In other areas some things slip through the cracks.
- In some areas Google changes what is considered fact based on where you are located.
How Google keeps everyone happy pic.twitter.com/KmBBzpfzdf— Amazing Maps (@Amazing_Maps) March 31, 2015
- Those who have enough money can create their own facts. It might be painting the perception of a landscape, hiring thousands of low-waged workers to manipulate public perception on key issues and new technologies, or more sophisticated forms of social network analysis and manipulation used to shape public opinion.
- The previously mentioned links were governmental efforts. However such strategies are more common in the commercial market. Consider how Google has sponsored academic conferences while explicitly telling the people who put them on to hide the sponsorship as part of their lobbying efforts.
- Then there is the blurry area where government and commerce fuse, like when Google put about a half-dozen home team players in key governmental positions during the FTC investigation of Google. Google claimed lobbying was disgusting until they experienced the ROI firsthand.
- In some areas “facts” are backward looking views of the market which are framed, distorted & intentionally incomplete. There was a significant gap between internal voices and external messaging in the run up to the recent financial crisis. Even large & generally trustworthy organizations have some serious skeletons in their closets.
@mattcutts I wonder, what sort of impact does http://t.co/vdg3ARGSz2 have on their E-A-T? expertise +1, authority +1, trustworthiness -_?— aaron wall (@aaronwall) April 6, 2015
- In other areas the inconvenient facts get washed away over time by money.
For a search engine to be driven primarily by group think (see unity100’s posts here) is the death of diversity.
Less Diversity, More Consolidation
The problem is rarely attributed to Google, but as ecosystem diversity has declined (and entire segments of the ecosystem are unprofitable to service), more people are writing things like: “The market for helping small businesses maintain a home online isn’t one with growing profits – or, for the most part, any profits. It’s one that’s heading for a bloody period of consolidation.”
As companies grow in power the power gets monetized. If you can manipulate people without appearing to do so you can make a lot of money.
If you don’t think Google wants to disrupt you out of a job, you’ve been asleep at the wheel for the past decade— Michael Gray (@graywolf) March 13, 2015
We Just Listen to the Data (Ish)
As Google sucks up more data, aggregates intent, and scrapes-n-displaces the ecosystem, they get air cover for some of their gray area behaviors by claiming things are driven by the data & putting the user first.
Those “data” and altruism claims from Google recently fell flat on their face when the Wall Street Journal published a number of articles about a leaked FTC document.
- How Google Skewed Search Results
- Inside the U.S. Antitrust Probe of Google
- Key quotes from the document from the WSJ & more from Danny Sullivan
- The PDF document is located here.
That PDF has all sorts of goodies in it about things like blocking competition, signing a low margin deal with AOL to keep monopoly marketshare (while also noting the general philosophy outside of a few key deals was to squeeze down on partners), scraping content and ratings from competing sites, Google force inserting itself in certain verticals anytime select competitors ranked in the organic result set, etc.
As damning as the above evidence is, more will soon be brought to light as the EU ramps up their formal statement of objections, as Google is less politically connected in Europe than they are in the United States:
“On Nov. 6, 2012, the night of Mr. Obama’s re-election, Mr. Schmidt was personally overseeing a voter-turnout software system for Mr. Obama. A few weeks later, Ms. Shelton and a senior antitrust lawyer at Google went to the White House to meet with one of Mr. Obama’s technology advisers. … By the end of the month, the FTC had decided not to file an antitrust lawsuit against the company, according to the agency’s internal emails.”
What is wild about the above leaked FTC document is it goes to great lengths to show an anti-competitive pattern of conduct toward the larger players in the ecosystem. Even if you ignore the distasteful political aspects of the FTC non-decision, the other potential out was:
“The distinction between harm to competitors and harm to competition is an important one: according to the modern interpretation of antitrust law, even if a business hurts individual competitors, it isn’t seen as breaking antitrust law unless it has also hurt the competitive process—that is, that it has taken actions that, for instance, raised prices or reduced choices, over all, for consumers.” – Vauhini Vara
Part of the reason the data set was incomplete on that front is that, for the most part, only larger ecosystem players were consulted. Google engineers have gone on record stating they aim to break people’s spirits in a game of psychological warfare. If that doesn’t hinder consumer choice, what does?
@aaronwall rofl. Feed the dragon Honestly these G investigations need solid long term SEOs to testify as well as brands.— Rishi Lakhani (@rishil) April 2, 2015
When the EU published their statement of objections Google’s response showed charts with the growth of Amazon and eBay as proof of a healthy ecosystem.
The market has been consolidated down into a few big winners which are still growing, but that in and of itself does not indicate a healthy or neutral overall ecosystem.
The long tail of smaller e-commerce sites which have been scrubbed from the search results is nowhere to be seen in such charts / graphs / metrics.
The other obvious “untruth” hidden in the above Google chart is that product searches on Google.com are not included in Google’s aggregate metrics. They are only counting the subset of searches which click through to a second vertical ad type, while ignoring Google’s broader impact via the combination of PLAs, text-based AdWords ads and the knowledge graph, or even the recently rolled out rich product answer results.
Who could look at the following search result (during anti-trust competitive review no less) and say “yeah, that looks totally reasonable?”
Google has allegedly spent the last couple years removing “visual clutter” from the search results & yet they manage to produce SERPs looking like that – so long as the eye candy leads to clicks monetized directly by Google or other Google hosted pages.
The Search Results Become a Closed App Store
Search was an integral piece of the web which (in the past) put small companies on a level playing field with larger players.
It no longer is.
WOW. RT @aimclear: 89% of domains that ranked over the last 7 years are now invisible, #SEO extinction. SRSLY, @marcustober #SEJSummit— Jonah Stein (@Jonahstein) April 15, 2015
“What kind of a system do you have when existing, large players are given a head start and other advantages over insurgents? I don’t know. But I do know it’s not the Internet.” – Dave Pell
The above quote was about app stores, but it certainly parallels a rater system which enforces the broken window fallacy against smaller players while looking the other way on larger players, unless they are in a specific vertical Google itself decides to enter.
“That actually proves my point that they use Raters to rate search results. aka: it *is* operated manually in many (how high?) cases. There is a growing body of consensus that a major portion of Googles current “algo” consists of thousands of raters that score results for ranking purposes. The “algorithm” by machine, on the majority of results seen by a high percentage of people, is almost non-existent.” … “what is being implied by the FTC is that Googles criteria was: GoogleBot +10 all Yelp content (strip mine all Yelp reviews to build their database). GoogleSerps -10 all yelp content (downgrade them in the rankings and claim they aren’t showing serps in serps). That is anticompetitive criteria that was manually set.” – Brett Tabke
The remote rater guides were even more explicitly anti-competitive than what was detailed in the FTC report. For instance, they required hotel affiliate sites to be rated as spam even if they were helpful, for no reason other than being affiliate sites.
Is Brand the Answer?
About 3 years ago I wrote a blog post about how branding plays into SEO & why it might peak. As much as I have been accused of having a cynical view, the biggest problem with my post was that it was naively optimistic. I presumed Google’s consolidation of markets would end up leading Google to alter their ranking approach when they were unable to overcome the established consensus bias which was subsidizing their competitors. The problem with my presumption is that Google’s reliance on “data” was a chimera. When convenient (and profitable), data is discarded on an as-needed basis.
Or, put another way, the visual layout of the search result page trumps the underlying ranking algorithms.
Google has still heavily disintermediated brand value, but they did it via vertical search, larger AdWords ad units & allowing competitive bidding on trademark terms.
If Not Illegal, then Scraping is Certainly Morally Deplorable…
As Google scraped Yelp & TripAdvisor reviews & gave them an ultimatum, Google was also scraping Amazon sales rank data and using it to power Google Shopping product rankings.
Around this same time Google pushed through a black PR smear job against Bing for committing a similar but lesser offense against Google on rare, made-up longtail searches which were not used by the general public.
While Google was outright stealing third party content and putting it front & center on core keyword searches, they had to use “about 100 “synthetic queries”—queries that you would never expect a user to type” to smear Bing, & even many of those queries did not show the alleged signal.
Here are some representative views of that incident:
- “We look forward to competing with genuinely new search algorithms out there—algorithms built on core innovation, and not on recycled search results from a competitor. So to all the users out there looking for the most authentic, relevant search results, we encourage you to come directly to Google. And to those who have asked what we want out of all this, the answer is simple: we’d like for this practice to stop.” – Google’s Amit Singhal
- “It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work. I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.” Amit Singhal, more explicitly.
- “One comment that I’ve heard is that “it’s whiny for Google to complain about this.” I agree that’s a risk, but at the same time I think it’s important to go on the record about this.” – Matt Cutts
- “I’ve got some sympathy for Google’s view that Bing is doing something it shouldn’t.” – Danny Sullivan
What is so crazy about the above quotes is Google engineers knew at the time what Google was doing with Google’s scraping. I mentioned that contrast shortly after the above PR fiasco happened:
when popular vertical websites (that have invested a decade and millions of Dollars into building a community) complain about Google disintermediating them by scraping their reviews, Google responds by telling those webmasters to go pound sand & that if they don’t want Google scraping them then they should just block Googlebot & kill their search rankings
Learning the Rules of the Road
If you get a sense “the rules” are arbitrary, hypocritical & selectively enforced – you may be on to something:
- “The bizrate/nextag/epinions pages are decently good results. They are usually well-format[t]ed, rarely broken, load quickly and usually on-topic. Raters tend to like them” … which is why … “Google repeatedly changed the instructions for raters until raters assessed Google’s services favorably”
- and while clamping down on those services (“business models to avoid“) … “Google elected to show its product search OneBox “regardless of the quality” of that result and despite “pretty terribly embarrassing failures” ”
- and since Google knew their offerings were vastly inferior, “most of us on geo [Google Local] think we won’t win unless we can inject a lot more of local directly into google results” … thus they added “a ‘concurring sites’ signal to bias ourselves toward triggering [display of a Google local service] when a local-oriented aggregator site (i.e. Citysearch) shows up in the web results””
Google’s justification for not being transparent is that “spammers” would take advantage of transparency to put inferior results front and center – the exact same thing Google does when it benefits the bottom line!
Around the same time Google hard-coded the self-promotion of their own vertical offerings, they also chose to ban competing business models through “quality” score updates and other similar changes:
The following types of websites are likely to merit low landing page quality scores and may be difficult to advertise affordably. In addition, it’s important for advertisers of these types of websites to adhere to our landing page quality guidelines regarding unique content.
- eBook sites that show frequent ads
- ‘Get rich quick’ sites
- Comparison shopping sites
- Travel aggregators
- Affiliates that don’t comply with our affiliate guidelines
The anti-competitive conspiracy theory is no longer conspiracy, nor theory.
Key points highlighted by the European Commission:
- Google systematically positions and prominently displays its comparison shopping service in its general search results pages, irrespective of its merits. This conduct started in 2008.
- Google does not apply to its own comparison shopping service the system of penalties, which it applies to other comparison shopping services on the basis of defined parameters, and which can lead to the lowering of the rank in which they appear in Google’s general search results pages.
- Froogle, Google’s first comparison shopping service, did not benefit from any favourable treatment, and performed poorly.
- As a result of Google’s systematic favouring of its subsequent comparison shopping services “Google Product Search” and “Google Shopping”, both experienced higher rates of growth, to the detriment of rival comparison shopping services.
- Google’s conduct has a negative impact on consumers and innovation. It means that users do not necessarily see the most relevant comparison shopping results in response to their queries, and that incentives to innovate from rivals are lowered as they know that however good their product, they will not benefit from the same prominence as Google’s product.
Overcoming Consensus Bias
Consensus bias is set to an absurdly high level to block out competition, slow innovation, and make the search ecosystem easier to police. This acts as a tax on newer and lesser-known players and a subsidy toward larger players.
Eventually that subsidy would be a problem for Google if the algorithm were the only thing that mattered; however, if the entire result set itself can be displaced, then the subsidy doesn’t really matter, as it can be retracted overnight.
Whenever Google has a competing offering ready, they put it up top even if they are embarrassed by it and 100% certain it is a vastly inferior option to other options in the marketplace.
That is how Google reinforces, then manages to overcome consensus bias.
How do you overcome consensus bias?
Designing for Privacy
Information is a commodity. Corporations are passing around consumer behavioral profiles like brokers with stocks, and the vast majority of the American public is none the wiser about this market’s scope. Very few people actually check the permissions portion of the Google Play store page before downloading a new app, and who has time to pore over the tedious 48-page monstrosity that is the iTunes terms and conditions contract?
With the advent of wearables, ubiquitous computing, and widespread mobile usage, the individual’s market share of their own information is shrinking at an alarming rate. In response, a growing (and vocal) group of consumers is voicing its concerns about the effective end of privacy online. And guess what? It’s up to designers to address those concerns in meaningful ways and assuage consumer fears.
But how can such a Sisyphean feat be managed? In a world that demands personalized service at the cost of privacy, how can you create and manage a product that strikes the right balance between the two?
That’s a million dollar question, so let’s break it into more affordable chunks.
Transparency
The big problem with informed consent is the information. It’s your responsibility to be up front with your users as to what exactly they’re trading you in return for your product/service. Not just the cash flow, but the data stream as well. Where’s it going? What’s it being used for?
99.99% of all smartphone apps ask for permission to modify and delete the contents of a phone’s data storage. 99.9999% of the time that doesn’t mean it’s going to copy and paste contact info, photos, or personal correspondences. But that .0001% is mighty worrisome.
Let your users know exactly what you’re asking from them, and what you’ll do with their data. Advertise the fact that you’re not sharing it with corporate interests to line your pockets. And if you are, well, stop that. It’s annoying and you’re ruining the future.
How can you advertise the key points of your privacy policies? Well, you could take a cue from noted online retailer Zappos.com. Their “PROTECTING YOUR PERSONAL INFORMATION” page serves as a decent template for transparency.
They have clearly defined policies about what they will and won’t do to safeguard shopper information. For one, they promise never to “rent, sell or share” user data with anyone, and immediately below, they link to their privacy policy, which weighs in a bit heavy at over 2,500 words but is still dwarfed by other, more convoluted policies.
They also describe their efforts to safeguard user data from malicious hacking threats through the use of SSL tech and firewalls. Then they have an FAQ addressing commonly expressed security concerns. Finally, they have a 24/7 contact line to assure users of personal attention to their privacy queries.
Now it should be noted that this is a template for good transparency practices, and not precisely a great example of them. The content and intention are there, so what’s missing?
Good UX.
The fine print is indeed a little too fine, the text is a bit too dense (at least where the actual privacy policy is concerned), and the page itself is buried within the fat footer on the main page.
So who does a better job?
CodePen has actually produced an attractively progressive solution.
As you can see, CodePen has taken the time to produce two different versions of their ToS: a typical, lengthy bit of legalese on the left, and an easily readable layman’s version on the right. Providing these side by side shows appreciation for users and an emphasis on providing a positive UX.
This is all well and good for the traditional web browsing environment, but most of the problems with privacy these days stem from mobile usage. Let’s take a look at how mobile applications are taking advantage of the lag between common knowledge and current technology to make a profit off of private data.
Mobile Permissions
In the mobile space, the Google Play store does a decent job of letting users know what permissions they’re granting whenever they download an app, via its “Permission details” tab:
As you can see, Instagram is awfully nosy, but that’s no surprise. Instagram has come under fire for their privacy policies before. What’s perhaps more surprising is the unbelievable ubiquity with which invasive data gathering is practiced in the mobile space. Compare Instagram’s permissions to another popular application you might have added to your smartphone’s repertoire:
Why, pray tell, does a flashlight have any need for your location, photos/media/files, device ID and/or call information? I’ll give you a clue: it doesn’t.
“Brightest Flashlight Free” scoops up personal data and sells it to advertisers. The developer was actually sued in 2013 for having a poorly written privacy policy, one that did not disclose the app’s intention to sell user data.
Now the policy is up to date, but the insidious data gathering and selling continues. Unfortunately, it isn’t the only flashlight application to engage in the same sort of dirty data tactics. The fact is, you have to do a surprising amount of research to find any application that doesn’t grab a bit more data than advertised, especially when the global market for mobile-user data approaches $10 billion.
For your peace of mind, there is at least one example of an aptly named flashlight application which doesn’t sell your personal info to the highest bidder.
But don’t get too enthusiastic just yet. This is just one application. How many do you have downloaded on your smartphone? Chances are pretty good that you’re harboring a corporate spy on your mobile device.
Hell, even the Holy Bible takes your data:
Is nothing sacred? To the app developer’s credit, they’ve expressed publicly that they’ll never sell user data to third party interests, but it’s still a wakeup call.
Privacy and UX
What, then, are some UX friendly solutions? Designers are forced to strike a balance. Apps need data to run more efficiently and to better serve users, yet users aren’t accustomed to the concerns associated with the wholesale data permissions most applications require. What kind of design patterns can be implemented to bring in a bit of harmony?
First and foremost, it’s important to be utilitarian in your data gathering. Obtaining informed consent is important (letting your users know what permissions they’re granting and why), but doing so within smooth, performant user flows is paramount.
For example, iOS has at least one up on Android with their “dynamic permissions.” This means iOS users have the option of switching up their permissions in-app, rather than having to decide all or nothing upon installation as with Android apps.
http://techcrunch.com/2014/04/04/the-right-way-to-ask-users-for-ios-permissions/
Note how the Cluster application prompts the user to give access to their photos as they’re interacting with the application, and reassures them of exactly what the app will do. The user is fully informed, and offers their consent as a result of being asked for a certain level of trust.
All of this is accomplished while they’re aiming to achieve a goal within the app. This effectively pushes permission grant rates toward 100%, because the developers have created a sense of comfort with the application’s inner workings. That’s what designing for privacy is all about: slowly introducing a user to the concept of shared data, and never taking undue advantage of an uninformed user.
Of course, this is just one facet of the privacy/UX conversation. Informing a user of what they’re allowing is important, but reassuring them that their data is secure is even more so.
Safeguarding User Data
Asking a user to trust your brand is essential to a modern business model; you’re trying to engender a trust-based relationship with all of your visitors, after all. The real trick, however, is convincing users that their data is safe in your hands: in other words, that it won’t be sold to or stolen by third parties, be they legitimate corporations or malicious hackers.
We touched on this earlier with the Zappos example. Zappos reassures its shoppers with SSL, firewalls, and a personalized promise never to share or sell data. All of which should be adopted as industry standards and blatantly advertised to assuage privacy concerns.
Building these safeguards into your service/application/website/what-have-you is extremely important. To gain consumer trust is to first provide transparency in your own practices, and then to protect your users from the wolves at the gate.
Fortunately, data protection is a booming business, with a myriad of effective cloud-based solutions currently in play.
Whatever security solutions you choose, the priorities remain the same. Build trust, and more importantly: actually deserve whatever trust you build.
It hardly needs to be stated, but the real key to a future where personal privacy still exists is to actually be better people. The kind that can be trusted to hold sensitive data.
Is such a future still possible? Let us know what you think in the comment section.
Kyle Sanders is a member of SEOBook and founder of Complete Web Resources, an Austin-based SEO and digital marketing agency.
Google Mobile Search Result Highlights
Google recently added highlights at the bottom of various sections of their mobile search results. The highlights appear on ads, organic results, and other various vertical search insertion types. The colors vary arbitrarily by section and are pattern…
Responsive Design for Mobile SEO
Why Is Mobile So Important?
If you look just at your revenue numbers as a publisher, it is easy to believe mobile is of limited importance. In our last post I mentioned an AdAge article highlighting how the New York Times was generating over half their traffic from mobile, with mobile accounting for about 10% of their online ad revenues.
Large ad networks (Google, Bing, Facebook, Twitter, Yahoo!, etc.) can monetize mobile *much* better than other publishers can because they aggressively blend the ads right into the social stream or search results, causing them to have a much higher CTR than ads on the desktop. Bing recently confirmed the same trend RKG has highlighted about Google’s mobile ad clicks:
While mobile continues to be an area of rapid and steady growth, we are pleased to report that the Yahoo Bing Network’s search and click volumes from smart phone users have more than doubled year-over-year. Click volumes have generally outpaced growth in search volume
Those ad networks want other publishers to make their sites mobile friendly for a couple reasons…
- If the downstream sites are mobile friendly, then users are more likely to keep returning to the central ad / search / social networks & be more willing to click out on the ads from them.
- If mobile is emphasized in importance, then those who are critical of the value of the channel may eat some of the blame for relatively poor performance, particularly if they haven’t spent resources optimizing user experience on the channel.
Further Elevating the Importance of Mobile
Modern Love, by Banksy. @SachinKalbag pic.twitter.com/Xzcxnkmmnx— Anand Ranganathan (@ARangarajan1972) November 29, 2014
Google has hinted at the importance of having a mobile friendly design, labeling friendly sites, testing labeling slow sites & offering tools to test how mobile friendly a site design is.
Today Google put out an APB warning they are going to increase the importance of mobile friendly website design:
Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.
In the past Google would hint that they were working to clean up link spam or content farms or website hacking and so on. In some cases such efforts were announced to try to discourage investment in the associated strategies, but it is quite rare for Google to pre-announce an algorithmic shift which they state will be significant & to put an exact date on it.
I wouldn’t recommend waiting until the last day to implement the design changes, as it will take Google time to re-crawl your site & recognize if the design is mobile friendly.
Those who ignore the warning might be in for significant pain.
Some sites which were hit by Panda saw a devastating 50% to 70% decline in search traffic, but given how small mobile phone screen sizes are, even ranking just a couple spots lower could cause an 80% or 90% decline in mobile search traffic.
Another related issue referenced in the above post was tying in-app content to mobile search personalization:
Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.
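For reference, App Indexing at the time was implemented by adding a deep link to the corresponding app screen in the head of each web page. A rough sketch is below; the package name and paths are placeholder values rather than markup for any real app:
<!-- deep link telling Google which Android app screen mirrors this web page -->
<link rel="alternate" href="android-app://com.example.android/http/example.com/some-page" />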
Google also announced today they are extending AdWords-styled ads to their Google Play search results, so they now have a direct economic incentive to allow app activity to bleed into their organic ranking factors.
m. Versus Responsive Design
Some sites have a separate m. version for mobile visitors, while other sites keep consistent URLs & employ responsive design. The m. version works like this: on the regular version of the site (say www.seobook.com) a webmaster adds an alternate reference to the mobile version in the head section of the document:
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.seobook.com/" >
…and then on the mobile version, they would rel=canonical it back to the desktop version, like so…
<link rel="canonical" href="http://www.seobook.com/" >
With the above sort of code in place, Google would rank the full version of the site on desktop searches & the m. version in mobile search results.
3 or 4 years ago it was a toss up as to which of these 2 options would win, but over time it appears the responsive design option is more likely to win out.
Here are a couple reasons responsive is likely to win out as a better solution:
- If people share a mobile-friendly URL on Twitter, Facebook or other social networks & the URL differs from the desktop version, then when someone on a desktop computer clicks the shared m. version of the page (which has fewer ad units & less content), the publisher is providing a worse user experience & losing out on the incremental monetization the additional ad units would have delivered.
- While some search engines and social networks might be good at consolidating the performance of the same piece of content across multiple URL versions, some of them will periodically mess it up. That in turn will lead in some cases to lower rankings in search results or lower virality of content on social networks.
- Over time there is an increasing blur between phones and tablets with phablets. Some high pixel density screens on crossover devices may appear large in terms of pixel count, but still not have particularly large screens, making it easy for users to misclick on the interface.
- When Bing gave their best practices for mobile, they stated: “Ideally, there shouldn’t be a difference between the “mobile-friendly” URL and the “desktop” URL: the site would automatically adjust to the device — content, layout, and all.” In that post Bing shows some examples of m. versions of sites ranking in their mobile search results; however, for smaller & lesser known sites they may not rank the m. version the way they do for Yelp or Wikipedia, which means that even if you optimize the m. version of the site to a great degree, that isn’t the version all mobile searchers will see. Back in 2012 Bing also stated their preference for a single version of a URL, in part based on lowering crawl traffic & consolidating ranking signals.
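For contrast with the m. markup above, responsive design keeps a single URL & a single set of HTML, with the viewport meta tag plus CSS media queries handling layout. Here is a bare-bones sketch; the class name & breakpoint are arbitrary placeholders:
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
/* desktop: the sidebar floats alongside the main content */
.sidebar { float: right; width: 300px; }
/* narrow screens: stack the sidebar under the content at full width */
@media only screen and (max-width: 640px) {
.sidebar { float: none; width: 100%; }
}
</style>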
In addition to responsive web design & separate mobile friendly URLs, a third configuration option is dynamic serving, which uses the Vary HTTP header to detect the user-agent & use that to drive the layout.
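Under dynamic serving the URL likewise stays constant, but the server inspects the User-Agent request header & returns different HTML to phones than to desktops. A rough sketch of the relevant response headers is below; the key line is Vary: User-Agent, which tells crawlers & caches that the markup differs by device:
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Vary: User-Agent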
Solutions for Quickly Implementing Responsive Design
New Theme / Design
If your site hasn’t been updated in years you might be surprised at what you find available on sites like ThemeForest for quite reasonable prices. Many of the options are responsive out of the gate & look good with a day or two of customization. Theme subscription services like WooThemes and Elegant Themes also have responsive options.
Child Themes
Some of the default Wordpress themes are responsive. Creating a child theme is quite easy. The popular Thesis and Studiopress frameworks also offer responsive skins.
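As a quick sketch, the minimum a child theme needs is a style.css whose header comment names the parent theme’s folder. The theme names below are placeholders, and enqueueing the parent stylesheet from functions.php is generally recommended over the @import shortcut shown here:
/*
Theme Name: My Responsive Child Theme
Template: twentyfifteen
*/
/* pull in the parent theme's styles, then add responsive overrides below */
@import url("../twentyfifteen/style.css");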
PSD to HTML / HTML to Responsive HTML
Eeek! … 11% Of Americans Think #HTML Is A Sexually Transmitted Disease http://t.co/np0irmI1DW via @broderick— L2Code HTML (@L2CodeHTML) January 10, 2015
Some of the PSD to HTML conversion services like PSD 2 HTML, HTML Slicemate & XHTML Chop offer responsive design conversion of existing HTML sites in as little as a day or two, though you will likely need to make at least a few minor changes when you put the designs live to compensate for things like third party ad units.
If you have an existing Wordpress theme, you might want to see if you can zip it up and send it to them, or else they may build your new theme as a child theme of a default theme like Twenty Fifteen. If you are struggling to get them to convert your Wordpress theme over (e.g. they insist on first converting it to a child theme of a default theme), another option would be to have them do a static HTML file conversion (instead of a Wordpress conversion) and then feed that through a theme creation tool like Themespress.
Other Things to Look Out For
Third Party Plug-ins & Ad Code Gotchas
Google allows webmasters to alter the ad calls on their mobile responsive AdSense ad units to show different sized ad units to different screen sizes & skip showing some ad units on smaller screens. An AdSense code example is included in an expandable section at the bottom of this page.
<style type="text/css">
.adslot_1 { display:inline-block; width: 320px; height: 50px; }
@media (max-width: 400px) { .adslot_1 { display: none; } }
@media (min-width:500px) { .adslot_1 { width: 468px; height: 60px; } }
@media (min-width:800px) { .adslot_1 { width: 728px; height: 90px; } }
</style>
<ins class="adsbygoogle adslot_1"
data-ad-client="ca-pub-1234"
data-ad-slot="5678"></ins>
<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
<script>(adsbygoogle = window.adsbygoogle || []).push({});</script>
For other ads which perhaps don’t have a “mobile friendly” option, you could use CSS either to set the ad unit to display none or to let the ad unit overflow, using code like either of the following:
hide it:
@media only screen and (max-width: ___px) {
.bannerad {
display: none;
}
}
overflow it:
@media only screen and (max-width: ___px) {
.ad-unit {
max-width: ___px;
overflow: scroll;
}
}
Before Putting Your New Responsive Site Live…
Back up your old site before putting the new site live.
For static HTML sites or sites with PHP or SHTML includes & such…
- Download a copy of your existing site to local.
- Rename that folder to something like sitename.com-OLDVERSION
- Upload the sitename.com-OLDVERSION folder to your server. If anything goes drastically wrong during your conversion process you can rename the new site design to something like sitename.com-HOSED then set the sitename.com-OLDVERSION folder to sitename.com to quickly restore the site.
- Download your site to local again.
- Ensure your new site design is using a different CSS folder or CSS filename such that the old and new versions of the design can be live at the same time while you are editing the site.
- Create a test file with the responsive design on your site & test that page until things work well enough.
- Once that page works well enough, test changing your homepage over to the new design & then save and upload it to verify it works properly. In addition to using your cell phone you could see how it looks on a variety of devices via the mobile browser testing emulation tool in Chrome, or a wide array of third party tools like: MobileTest.me, iPadPeek, Mobile Phone Emulator, Browshot, Matt Kersley’s responsive web design testing tool, BrowserStack, Cross Browser Testing, & the W3C mobileOK Checker. Paid services like Keynote offer manual testing rather than emulation on a wide variety of devices. There is also paid downloadable desktop emulation software like Multi-browser view.
- Once you have the general “what needs changed in each file” down, use find & replace to bulk edit the remaining files to make them responsive.
- Use a tool like FileZilla to quickly bulk upload the files.
- Look through key pages and if there are only a few minor errors then fix them and re-upload them. If things are majorly screwed up then revert to the old design being live and schedule a do over on the upgrade.
- If you have a decently high traffic site, it might make sense to schedule the above process for late Friday night or an off hour on the weekend, such that if anything goes astray you have fewer people seeing the errors while you frantically rush to fix them. :)
If you have little faith in the above test-it-live “methodology” & would prefer a slower & lower stress approach, you could create a test site on another domain name for testing purposes. Just be sure to include a noindex directive in the robots.txt file or password protect access to the site while testing. When you get things worked out on it, make sure your internal links are referencing the correct domain name, and that you have removed any block via robots.txt or password protection.
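For example, a blanket crawl block on the temporary test domain only takes a couple of lines in robots.txt. A minimal sketch is below; just remember to remove it when the design moves to the real domain:
# robots.txt on the temporary test domain only
User-agent: *
Disallow: /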
For a site with a CMS the above process is basically the same, except you may need to create the backup differently. If you are uploading a WordPress or Drupal theme, change the theme name at least slightly so you can keep the old and new designs installed at the same time and quickly switch back to the old design if you need to.
If you have a mixed site with WordPress & static files or such, then it might make sense to test changing the static files first, get those working well & then create a WordPress theme after that.
You Can’t Copyright Facts
The Facts of Life
When Google introduced the knowledge graph one of their underlying messages behind it was “you can’t copyright facts.”
Facts are like domain names or links or pictures or anything else in terms of being a layer of information which can be highly valued or devalued through commoditization.
When you search for love quotes, Google pulls one into their site & then provides another “try again” link.
Since quotes mostly come from third parties they are not owned by BrainyQuote and other similar sites. But here is the thing: if the sites which pay to organize and verify such collections have their economics sufficiently undermined, they go away & then Google isn’t able to pull the quotes into the search results either.
The same is true with song lyrics. If you are one of the few sites paying to license the lyrics & then Google puts lyrics above the search results, then the economics which justified the investment in licensing might not back out & you will likely go bankrupt. That bankruptcy wouldn’t be the result of being a spammer trying to work an angle, but rather because you had a higher cost structure from trying to do things the right way.
Never trust a corporation to do a librarian’s job.
What’s Behind Door Number One?
Google has also done the above quote-like “action item” style of onebox listings in other areas, like software downloads.
Where there are multiple versions of the software available, Google is arbitrarily selecting the download page, even though a software publisher might have a parallel SaaS option or other complex funnels based on a person’s location or status as a student or such.
Mix in Google allowing advertisers to advertise bundled adware, and it becomes quite easy for Google to gum up the sales process and undermine existing brand equity by sending users to the wrong location. Here’s a blog post from Malwarebytes referencing:
- their software being advertised against their brand term in Google via AdWords ads which infringed their trademark & bundled the software with adware
- numerous user complaints they received about the bundleware
- required legal actions they took to take the bundler offline
Brands are forced to buy their own brand equity before, during & after the purchase, or Google partners with parasites to monetize the brand equity:
The company used this cash to build more business, spending more than $1 million through at least seven separate advertising accounts with Google.
…
The ads themselves said things like “McAfee Support – Call +1-855-[redacted US phone number]” and pointed to domains like mcafee-support.pccare247.com.
…
One PCCare247 ad account with Google produced 71.7 million impressions; another generated 12.4 million more. According to records obtained by the FTC, these combined campaigns generated 1.5 million clicks
Since Google requires Chrome extensions be installed from their own website it makes it hard (for anyone other than Google) to monetize them, which in turn makes it appealing for people to sell the add-ons to malware bundlers. Android apps in the Google Play store are yet another “open” malware ecosystem.
FACT: This Isn’t About Facts
Google started the knowledge graph & onebox listings on utterly banal topics which were easy for a computer to get right & low-risk to get wrong, though their ambitions vastly exceed that starting point.
When Google’s evolving search technology was recently covered on Medium by Steven Levy he shared that today the Knowledge Graph appears on roughly 25% of search queries and that…
Google is also trying to figure out how to deliver more complex results — to go beyond quick facts and deliver more subjective, fuzzier associations. “People aren’t interested in just facts,” she says. “They are interested in subjective things like whether or not the television shows are well-written. Things that could really help take the Knowledge Graph to the next level.”
Even as the people who routinely shill for Google parrot the “you can’t copyright facts” mantra, Google is telling you they have every intent of expanding far beyond it. “I see search as the interface to all computing,” says Singhal.
Even if You Have Copyright…
What makes the “you can’t copyright facts” line so particularly disingenuous was Google’s support of piracy when they purchased YouTube:
cofounder Jawed Karim favored “lax” copyright policy to make YouTube “huge” and hence “an excellent acquisition target.” YouTube at one point added a “report copyrighted content” button to let users report infringements, but removed the button when it realized how many users were reporting unauthorized videos. Meanwhile, YouTube managers intentionally retained infringing videos they knew were on the site, remarking “we should KEEP …. comedy clips (Conan, Leno, etc.) [and] music videos” despite having licenses for none of these. (In an email rebuke, cofounder Steve Chen admonished: “Jawed, please stop putting stolen videos on the site. We’re going to have a tough time defending the fact that we’re not liable for the copyrighted material on the site because we didn’t put it up when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.”)
To some, the separation of branding makes YouTube distinct and separate from Google search, but that wasn’t so much the case when many sites lost their video thumbnails and YouTube saw larger thumbnails on many of their listings in Google. In the above Steven Levy article he wrote: “one of the highest ranked general categories was a desire to know “how to” perform certain tasks. So Google made it easier to surface how-to videos from YouTube and other sources, featuring them more prominently in search.”
Altruism vs Disruption for the Sake of it
Whenever Google implements a new feature they can choose not to monetize it so as to claim they are benevolent and doing it for users without commercial interest. But that same unmonetized, for-the-users claim was also used for their shopping search vertical until one day it went paid. Google claimed paid inclusion was evil right up until the day it claimed paid inclusion was a necessity to improve user experience.
There was literally no transition period.
Many of the “informational” knowledge block listings contain affiliate links pointing into Google Play or other sites. Those affiliate ads were only labeled as advertisements after the FTC complained about inconsistent ad labeling in search results.
Health is Wealth
Google recently went big on the knowledge graph by jumping head first into the health vertical:
starting in the next few days, when you ask Google about common health conditions, you’ll start getting relevant medical facts right up front from the Knowledge Graph. We’ll show you typical symptoms and treatments, as well as details on how common the condition is—whether it’s critical, if it’s contagious, what ages it affects, and more. For some conditions you’ll also see high-quality illustrations from licensed medical illustrators. Once you get this basic info from Google, you should find it easier to do more research on other sites around the web, or know what questions to ask your doctor.
Google’s links to the Mayo Clinic in their knowledge graph are, once again, in a light gray font.
In case you didn’t find enough background in Google’s announcement article, Greg Sterling shared more of Google’s views here. A couple notable quotes from Greg…
Cynics might say that Google is moving into yet another vertical content area and usurping third-party publishers. I don’t believe this is the case. Google isn’t going to be monetizing these queries; it appears to be genuinely motivated by a desire to show higher-quality health information and educate users accordingly.
- Google doesn’t need to directly monetize it to impact the economics of the industry. If they shift a greater share of clicks through AdWords then that will increase competition and ad prices in that category while lowering investment in SEO.
- If this were done out of benevolence, it would appear *above* the AdWords ads in the search results, unlike almost every other type of onebox or knowledge graph result Google offers.
- If it is fair for him to label everyone who disagrees with his thesis as a cynic then it is of course fair for those “cynics” to label Greg Sterling as a shill.
Google told me that it hopes this initiative will help motivate the improvement of health content across the internet.
By defunding and displacing something they don’t improve its quality. Rather they force the associated entities to cut their costs to try to make the numbers work.
If their traffic drops and they don’t do more with less, then…
- their margins will fall
- growth slows (or they may even shrink)
- their stock price will tank
- management will get fired & replaced and/or they will get taken private by private equity investors and/or they will need to do some “bet the company” moves to find growth elsewhere (and hope Google doesn’t enter that parallel area anytime soon)
When the numbers don’t work, publishers need to cut back or cut corners.
Things get monetized directly, monetized indirectly, or they disappear.
Some of the more hated aspects of online publishing (headline bait, idiotic correlations out of context, pagination, slideshows, popups, fly-in ad units, auto-play videos, full page ad wraps, huge ads eating most of the above-the-fold real estate, integration of terrible native ad units promoting junk offers with shocking headline bait, content scraping answer farms, blending unvetted user generated content with house editorial, partnering with content farms to create subdomains on trusted blue chip sites, using Narrative Science or Automated Insights to auto-generate content, etc.) are not done because online publishers want to be jackasses, but because it is hard to make the numbers work in a competitive environment.
Publishers who were facing an “oh crap” moment when seeing print Dollars turn into digital dimes are having round number 2 when they see those digital dimes turn into mobile pennies:
At The New York Times, for instance, more than half its digital audience comes from mobile, yet just 10% of its digital-ad revenue is attributed to these devices.
If we lose some diversity in news it isn’t great, though it isn’t the end of the world. But what makes health such an important area is it is literally a matter of life & death.
Its importance & the amount of money flowing through the market ensures there is heavy investment in misinforming the general population. The corruption is so bad some people (who should know better) instead fault science.
@johnandrews @aaronwall it must be nice to say, you know what we’re keeping that traffic for ourselves, and nobody says a damn thing— Michael Gray (@graywolf) February 10, 2015
… and, only those who hate free speech, democracy & the country could possibly have anything negative to say about it. :D
Not to worry though. Any user trust built through the health knowledge graph can be monetized through a variety of other fantastic benevolent offers.
Once again, Google puts the user first.
Mozilla Firefox Dumps Google in Favor of Yahoo! Search
Firefox users conduct over 100 billion searches per year & starting in December Yahoo! will be the default search choice in the US, under a new 5 year agreement.
Google has been the Firefox global search default since 2004. Our agreement came up for renewal this year, and we took this as an opportunity to review our competitive strategy and explore our options.
In evaluating our search partnerships, our primary consideration was to ensure our strategy aligned with our values of choice and independence, and positions us to innovate and advance our mission in ways that best serve our users and the Web. In the end, each of the partnership options available to us had strong, improved economic terms reflecting the significant value that Firefox brings to the ecosystem. But one strategy stood out from the rest.
In Russia they’ll default to Yandex & in China they’ll default to Baidu.
One weird thing about that announcement is there is no mention of Europe, where Google’s dominance is far greater. I wonder if there was a quiet deal with Google in Europe, if Mozilla still doesn’t have their Europe strategy in place, or what their strategy is.
Google paid Firefox roughly $300 million per year for the default search placement. Yahoo!’s annual search revenue is on the order of $1.8 billion per year, so if they came close to paying $300 million a year, then Yahoo! has to presume they are going to get at least a few percentage points of search marketshare lift for this to pay for itself.
It also makes sense that Yahoo! would be a more natural partner fit for Mozilla than Bing would. If Mozilla partnered with Bing they would risk developer blowback from pent up rage about anti-competitive Internet Explorer business practices from 10 or 15 years ago.
It is also worth mentioning our recent post about how Yahoo! boosts search RPM by doing about a half dozen different tricks to preference paid search results while blending in the organic results.
| Yahoo Ads | Yahoo Organic Results
Placement | top of the page | below the ads |
Background color | none / totally blended | none |
Ad label | small gray text to right of advertiser URL | n/a |
Sitelinks | often 5 or 6 | usually none, unless branded query |
Extensions | star ratings, etc. | typically none |
Keyword bolding | on for title, description, URL & sitelinks | off |
Underlines | ad title & sitelinks, URL on scroll over | off |
Click target | entire background of ad area is clickable | only the listing title is clickable |
The revenue-juicing tricks listed above weren’t present in the screenshot Mozilla shared of the clean new search layout Yahoo! will offer Firefox users.
It shows red ad labels to the left of the ads and keyword bolding on both the ads & the organic results.
Here is Marissa Mayer’s take:
At Yahoo, we believe deeply in search – it’s an area of investment and opportunity for us. It’s also a key growth area for us – we’ve now seen 11 consecutive quarters of growth in our search revenue on an ex-TAC basis. This partnership helps to expand our reach in search and gives us an opportunity to work even more closely with Mozilla to find ways to innovate in search, communications, and digital content. I’m also excited about the long-term framework we developed with Mozilla for future product integrations and expansion into international markets.
Our teams worked closely with Mozilla to build a clean, modern, and immersive search experience that will launch first to Firefox’s U.S. users in December and then to all Yahoo users in early 2015.
Even if Microsoft is only getting a slice of the revenues, this makes the Bing organic & ad ecosystem stronger while hurting Google (unless, of course, this is step 1 before Marissa finds a way to nix the Bing deal and partner back up with Google on search). Yahoo! already has a partnership to run Google contextual ads, and a potential Yahoo! Google search partnership was blocked back in 2008. Yahoo! also syndicates Bing search ads in a contextual format to other sites through Media.net, and their Gemini Stream Ads product, which powers some of their search ads on mobile devices and on content sites, is a native ad alternative to Outbrain and Taboola. When they syndicate the native ads to other sites, the ads are called Yahoo! Recommends.
Both Amazon and eBay have recently defected (at least partially) from the Google ad ecosystem. Amazon has also been pushing to extend their ad network out to other sites.
Greg Sterling worries this might be a revenue risk for Firefox: “there may be some monetary risk for Firefox in leaving Google.” Missing from that perspective:
- How much less Google paid Mozilla before a competitive bid from Microsoft lifted the price of the most recent contract
- If Bing goes away, Google will drastically cut the revenue share offered to other search partners.
- Google takes 45% from YouTube publishers
- Google took over a half-decade (and a lawsuit) to even share what their AdSense revenue share was
- look at eHow’s stock performance
- While Google’s search ad revenue has grown about 20% per year, their partner ad network revenues have stagnated as their traffic acquisition costs as a percent of revenue have dropped
The good thing about all the Google defections is that the more networks there are, the more opportunities there are to find one which works well and is a good fit for whatever you are selling, particularly as Google adds various force-purchased junk to their ad network, be it mobile “Enhanced” campaigns or the destruction of exact match keyword targeting.
Peak Google? Not Even Close
Search vs Native Ads
Google owns search, but are they a one trick pony?
A couple weeks ago Ben Thompson published an interesting article suggesting Google may follow IBM and Microsoft in peaking, perhaps with native ads becoming more dominant than online search ads.
According to Forrester, in a couple years digital ad spend will overtake TV ad spend. In spite of the rise of sponsored content, native isn’t even broken out as a category.
Part of the issue with native advertising is that some of it is hard to break out cleanly. Some of it is obvious, but falls into multiple categories, like video ads on YouTube. Some of it is obvious, but relatively new & thus lacking in scale. Amazon is extending their payment services & Prime shipping deals to third party sites of brands like AllSaints & listing inventory from those sites on Amazon.com, selling them traffic on a CPC basis. Does that count as native advertising? What about a ticket broker or hotel booking site syndicating their inventory to a meta search site?
And while native is not broken out, Google already offers native ad management features in DoubleClick and has partnered with some of the more well known players like BuzzFeed.
The Penny Gap’s Impact on Search
Time intends to test paywalls on all of its major titles next year & they are working with third parties to integrate affiliate ads on sites like People.com.
The second link in the above sentence goes to an article which is behind a paywall. On Twitter I often link to WSJ articles which are behind a paywall. Any important information behind a paywall may quickly spread beyond it, but typically a competing free site which (re)reports on whatever is behind the paywall is shared more, spreads further on social, generates more additional coverage on forums and discussion sites like Hacker News, gets highlighted on aggregators like TechMeme, gets more links, ranks higher, and becomes the default/canonical source of the story.
Part of the rub of the penny gap is the cost of the friction vastly exceeds the financial cost. Those who can flow attention around the payment can typically make more by tracking and monetizing user behavior than they could by charging users incrementally a cent here and a nickel there.
Well known franchises are forced to offer a free version or they eventually cede their market position.
There are sites which do roll up subscriptions to a variety of sites at once, but some of them which had stub articles requiring payment to access, like Highbeam Research, got torched by Panda. If the barrier to entry to get to the content is too high the engagement metrics are likely to be terrible & a penalty ensues. Even a general registration wall is too high of a barrier to entry for some sites. Google demands whatever content is shown to them be visible to end users & if there is a mismatch that is considered cloaking, unless the mismatch is due to monetizing by using Google’s content-locking consumer surveys.
Who gets to the scale needed to have enough consumer demand to be able to charge an ongoing subscription for access to a variety of third party content? There are a handful of players in music (Apple, Spotify, Pandora, etc.) & a handful in video (Netflix, Hulu, Amazon Prime), but outside of those most paid subscription sites are about finance or niche topics with small scale. And whatever goes behind the paywalls gets seen by almost nobody when compared with the broader public market at the free price point.
Even if you are in a broad niche industry where a subscription-based model works, it still may be brutally tough to compete against Google. Google’s chief business officer joined the board of Spotify, which means Spotify should be safe from Google risk, except…
- In spite of billions of dollars of aggregate royalty payouts by Spotify, Taylor Swift pulled her catalog from Spotify
- shortly after Taylor Swift pulled her catalog from Spotify, YouTube announced their subscription service, which will include Taylor Swift’s catalog & will offer a free 6-month trial
Google’s Impact on Premium Content
I’ve long argued Google has leveraged piracy to secure favorable media deals (see the second bullet point at the bottom of this infographic). Some might have perceived my take as being cynical, but when Google mentioned their “continued progress on fighting piracy” the first thing they mentioned was more ad units.
There are free options, paid options & the blurry lines in between which Google & YouTube ride while they experiment with business models and monetize the flow of traffic to the paid options.
“Tech companies don’t believe in the unique value of premium content over the long term.” – Jessica Lessin
There is a massive misalignment of values which causes many players to have to refine their strategy over and over again. The gray area is never static.
Many businesses only have a 10% or 15% profit margin. An online publishing company which sees 20% of its traffic disappear might thus go from sustainable to bleeding cash overnight. A company which can arbitrarily shift / redirect a large amount of traffic online might describe itself as a “kingmaker.”
In Germany some publishers wanted to be paid to be in the Google index. As a result Google stopped showing snippets near their listings. Google also refined their news search integration into the regular search results to include a broader selection of sources including social sites like Reddit. As a result Axel Springer quickly found itself begging for things to go back to the way they were before as their Google search traffic declined 40% and their Google News traffic declined 80%. Axel Springer got their descriptions back, but the “in the news” change remains.
Google’s Impact on Weaker Players
If Google could have that dramatic of an impact on Axel Springer, imagine what sort of influence they have on millions of other smaller and weaker online businesses.
One of the craziest examples is Demand Media.
Demand Media’s market cap peaked above $1.9 billion. They spun out the domain name portion of the business into a company named Rightside Group, but the content portion of the business is valued at essentially nothing. They have about $40 million in cash & equivalents. Earlier this year they acquired Saatchi Art for $17 million & last year they acquired ecommerce marketplace Society6 for $94 million. After their last quarterly result their stock closed down 16.83% & Thursday they were down another 6.32%, giving them a market capitalization of $102 million.
On their most recent conference call, here are some of the things Demand Media executives stated:
- By the end of 2014, we anticipate more than 50,000 articles will be substantially improved by rewrites made rich with great visuals.
- We are well underway with this push for quality and will remove 1.8 million underperforming articles in Q4
- as we strive to create the best experience we can we have removed two ad units with the third unit to be removed completely by January 1st
- (on the above 2 changes) These changes are expected to have a negative impact on revenues and adjusted EBITDA of approximately $15 million on an annualized basis.
- Through Q3 we have invested $1.1 million in content renovation costs and expect approximately another $1 million in Q4 and $2 million to $4 million in the first half of next year.
- if you look at visits or you know the mobile mix is growing which has lower CPM associated with it and then also on desktop we’re seeing compression on the pricing line as well.
- we know that sites that have ad density that’s too high, not only are they offending audiences in near term, you are penalized with respect to where you show up in search indexes as well.
Google torched eHow in April of 2011. In spite of over 3 years of pain, Demand Media is still letting Google drive their strategy, in some cases spending millions of dollars to undo past investments.
Yet when you look at Google’s search results page, they are doing the opposite of the above strategy: more scraping of third party content coupled with more & larger ad units.
“@CyrusShepard: Google’s got you covered: pic.twitter.com/ZU2AOl5EGn” Google copies entire web page?— john andrews (@searchsleuth999) October 30, 2014
Originally the source links in the scrape-n-displace program were light gray. They only turned blue after a journalist started working on a story about 10 blue links.
The Blend
The search results can be designed to have some aspects blend in while others stand out. Colors can change periodically to alter response rates in desirable ways.
The banner ad got a bad rap as publishers fought declining CPMs by adding more advertisements to their pages. When it works, Google’s infrastructure still delivers (and tracks) billions of daily banner ads.
Search ads have never had the performance decline banner ads have.
The closest thing Google ever faced on that front was when AdBlock Plus became popular. Since it was blocking search ads, Google banned them & then restored them as they eventually negotiated a deal to pay them to display ads on Google while they continued to block ads on other third party sites.
Search itself *is* the ultimate native advertising platform.
Google is doing away with the local carousel in favor of a 3 pack local listing in categories like hotels. Once a person clicks on one of the hotel listings, Google then inserts an inline knowledge graph listing for that hotel with booking affiliate links in the search results, displacing the organic result set below the fold.
Notice in the above graphic how the “website” link uses generic text, is aligned toward the right, and is right next to an image so that it looks like an ad. It is engineered to feel like an ad and be ignored. The actual ads are left aligned and look like regular text links. They have an ad label, but that label is a couple lines up from them & there are multiple horizontal lines between the label and the actual ad units.
Not only does Google have the ability to shift the layout in such a drastic format, but then with whatever remains they also get to determine who they act against & who they don’t. While the SEO industry debates the “ethics” of various marketing techniques Google has their eye on the prize & is displacing the entire ecosystem wholesale.
Users were generally unable to distinguish between ads and organic listings *before* Google started mixing the two in their knowledge graph. That is a big part of the reason search ads have never seen the engagement declines banner ads have seen.
Mobile has been redesigned with the same thinking in mind. Google action items (which can eventually be monetized) up top & everything else pushed down the page.
The blurring of “knowledge” and ads allows Google to test entering category after category (like doctor calls from the search results) & forcing advertisers to pay for the various tests while Google collects data.
And as Google is scraping information from third party sites, they can choose to show less information on their own site if doing so creates additional monetization opportunities. As far back as 2009 Google stripped phone numbers off of non-sponsored map listings. And what happened with the recent 3 pack? While 100% of the above the fold results are monetized, …
“Go back to an original search that turns up the 3 PAC. Its completely devoid of logical information that a searcher would want:
- No phone number
- No address
- No map
- NO LINK to the restaurant website.
Anything that most users would want is deliberately hidden and/or requires more clicks.” – Dave Oremland
Google justifies their scrape-n-displace programs by claiming they put users first. Then they hide some of the information to drive incremental monetization opportunities. Google may eventually re-add some of those basic features which are now hidden, but as part of sponsored local listings.
After all – ads are the solution to everything.
Do branded banner ads in the search results have a low response rate? Or are advertisers unwilling to pay a significant sum for them? If so, Google can end the test and instead shift to include a product carousel in the search results, driving traffic to Google Shopping.
“I see this as yet another money grab by Google. Our clients typically pay 400-500% more for PLA clicks than for clicks on their PPC Brand ads. We will implement exact match brand negatives in Shopping campaigns.” – Barb Young
That money grab stuff has virtually no limit.
The Click Casino
From the start, keywords defaulted to broad match. Then campaigns went “enhanced” so advertisers were forced to eat lower quality clicks on mobile devices.
Then there was the blurring of exact match targeting into something else, forcing advertisers to buy lower quality variations of searches & making them add tons of negative keywords (and keep eating new garbage re-interpretations of words) in order to run a fine tuned campaign specifically targeted against a term.
In the past some PPC folks cheered obfuscation of organic search, thinking “this will never happen to me.”
Oops.
And of course Google not only wants to be the ad auction, but they also want to be your SEM platform managing your spend & they are suggesting you can leverage the “power” of automated auction-time bidding.
Advertisers RAVE about the success of Google’s automatic bidding features: “It received one click. That click cost $362.63.”
The only thing better than that is banner ads in free mobile tap games targeted at children.
Adding Friction
Above I mentioned how Google arbitrarily banned the AdBlock Plus extension from the Play store. They also repeatedly banned Disconnect Mobile. If you depend on mobile phones for distribution it is hard to get around Google. What’s more, they also collect performance data, internally launch competing apps, and invest in third party apps. And they control the prices various apps pay for advertising across their broad network.
So maybe you say “ok, I’ll skip search & mobile, I’ll try to leverage email” but this gets back to the same issue again. In Gmail social or promotional emails get quarantined into a ghetto where they are rarely seen:
“online stores, if they get big enough, can act as chokepoints. And so can Google. … Google unilaterally misfiled my daily blog into the promotions folder they created, and I have no recourse and no way (other than this post) to explain the error to them” – Seth Godin
Those friction adders have real world consequences. A year ago Groupon blamed Gmail’s tabs for causing them to have poor quarterly results. The filtering impact on a start up can be even more extreme. A small shift in exposure can lower the K factor (the average number of new users each existing user brings in) to something below 1 & require the startup to buy exposure rather than generating it virally.
In addition to those other tabs, there are a host of other risks like being labeled as spam or having a security warning. Few sites promote Google’s view as often as people like Greg Sterling do on Search Engine Land, yet even their newsletter was showing a warning in Gmail.
Google can also add friction to
- websites using search rankings, vertical search result displacement, hiding local business information (as referenced above), search query suggestions, and/or leveraging their web browser to redirect consumer intent
- video on YouTube by counting ad views as organic views, changing the relevancy metrics, and investing in competing channels & giving them preferential exposure as part of the deal. YouTube gets over half their views on mobile devices with small screens, so any shift on Google’s rank preference is going to have a major shift in click distributions.
- mobile apps using default bundling agreements which require manufacturers to set Google’s apps as defaults
- other business models by banning bundling-based business models too similar to their own (bundling is wrong UNLESS it is Google doing it)
- etc.
The launch of Keyword (not provided), which hid organic search keyword data, was friction for the sake of it in organic search. When Google announced HTTPS as a ranking signal, Netmeg wrote: “It’s about ad targeting, and who gets to profile you, and who doesn’t. Mark my words.”
Facebook announced their relaunch of Atlas and Google immediately started cracking down on data access:
In the conversations this week, with companies like Krux, BlueKai and Lotame, Google told data management platform players that they could not use pixels in certain ads. The pixels—embedded within digital ads—help marketers target and understand how many times a given user has seen their messages online.
“Google is only allowing data management platforms to fire pixels on creative assets that they’re serving, on impressions they bought, through the Google Display Network,” said Mike Moreau, chief solutions officer at Krux. “So they’re starting with a very narrow scope.”
Around the same time Google was cracking down on data sharing, they began offering features targeting consumers across devices & announced custom affinity audiences which allow advertisers to target audiences who visit any particular website.
Google’s special role is not only as an organizer (and obfuscator) of information; they also get to be the source measuring how well marketing works via their analytics, which can regularly launch new reports that may casually over-represent their own contribution while under-representing other channels, profiting from activity bias. The industry default of last click attribution driving search ad spending is one of the key issues which has driven down display ad values over the years.
Investing in Competition
Google not only ranks the ecosystem, but they actively invest in it.
Google tried to buy Yelp. When Facebook took off Google invested in Zynga to get access to data, in spite of a sketchy background. When Google’s $6 billion offer for Groupon didn’t close the deal, Google quickly partnered with over a dozen Groupon competitors & created new offer ad units in the search results.
Inside of the YouTube ecosystem Google also holds equity stakes in leading publishers like Machinima and Vevo.
There have been a few examples of investments getting special treatment, getting benefit of the doubt, or access to non-public information.
The scary scenario for publishers might sound something like this: “in Baidu Maps you can find a hotel, check room availability, and make a booking, all inside the app.” There’s no need to leave the search engine.
Take a closer look & that scary version might already be here. Google’s same day delivery boss moved to Uber and Google added Uber pickups and price estimates to their mobile Maps app.
Google, of course, also invested in Uber. It would be hard to argue that Uber is anything but successful. Though it is also worth mentioning winning at any cost often creates losses elsewhere:
- their insurance situation sounds fuzzy
- the founder doesn’t sound like a nice guy
- they’ve repeatedly engaged in shady endeavors like flooding a competing network with bogus requests & trying to screw with a competitor’s ability to raise funds
- as subprime auto loans have become widespread, Uber has pushed an extreme version of them onto their drivers while cutting their rates – an echo of the utopia of years gone by.
Google invests in disruption as though disruption is its own virtue & they leverage their internal data to drive the investments:
“If you can’t measure and quantify it, how can you hope to start working on a solution?” said Bill Maris, managing partner of Google Ventures. “We have access to the world’s largest data sets you can imagine, our cloud computer infrastructure is the biggest ever. It would be foolish to just go out and make gut investments.”
Combining usage data from their search engine, web browser, app store & mobile OS gives them unparalleled insights into almost any business.
Google is one of the few companies which can make multi-billion dollar investments in low margin areas, just for the data:
Google executives are prodding their engineers to make its public cloud as fast and powerful as the data centers that run its own apps. That push, along with other sales and technology initiatives, aren’t just about grabbing a share of growing cloud revenue. Executives increasingly believe such services could give Google insights about what to build next, what companies to buy and other consumer preferences
Google committed to spending as much as a half billion dollars promoting their shopping express delivery service.
Google’s fiber push now includes offering business internet services. Elon Musk is looking into offering satellite internet services – with an ex-Googler.
The End Game
Google now spends more than any other company on federal lobbying in the US. A steady stream of Google executives have filled US government roles like deputy chief technology officer, chief technology officer, and head of the patent and trademark office. A Google software engineer went so far as to suggest President Obama should:
- Retire all government employees with full pensions.
- Transfer administrative authority to the tech industry.
- Appoint Eric Schmidt CEO of America.
That Googler may be crazy or a troll, but even if we don’t get their nightmare scenario, if the regulators come from a particular company, that company is unlikely to end up hurt by regulations.
President Obama has stated the importance of an open internet: “We cannot allow Internet service providers to restrict the best access or to pick winners and losers in the online marketplace for services and ideas.”
If there are relevant complaints about Google, who will hear them when Googlers head key government roles?
Larry Page was recently labeled businessperson of the year by Fortune:
It’s a powerful example of how Page pushes the world around him into his vision of the future. “The breadth of things that he is taking on is staggering,” says Ben Horowitz, of Andreessen Horowitz. “We have not seen that kind of business leader since Thomas Edison at GE or David Packard at HP.”
A recent interview of Larry Page in the Financial Times echoes the theme of limitless ambition:
- “the world’s most powerful internet company is ready to trade the cash from its search engine monopoly for a slice of the next century’s technological bonanza.” … “As Page sees it, it all comes down to ambition – a commodity of which the world simply doesn’t have a large enough supply.”
- “I think people see the disruption but they don’t really see the positive,” says Page. “They don’t see it as a life-changing kind of thing . . . I think the problem has been people don’t feel they are participating in it.”
- “Even if there’s going to be a disruption on people’s jobs, in the short term that’s likely to be made up by the decreasing cost of things we need, which I think is really important and not being talked about.”
- “in a capitalist system, he suggests, the elimination of inefficiency through technology has to be pursued to its logical conclusion.”
There are some dark layers which are apparently “incidental side effects” of the techno-utopian desires.
Mental flaws could be reinforced & monetized by hooking people on prescription pharmaceuticals:
It takes very little imagination to foresee how the kitchen mood wall could lead to advertisements for antidepressants that follow you around the Web, or trigger an alert to your employer, or show up on your Facebook page because, according to Robert Scoble and Shel Israel in Age of Context: Mobile, Sensors, Data and the Future of Privacy, Facebook “wants to build a system that anticipates your needs.”
Or perhaps…
Those business savings are crucial to Rifkin’s vision of the Third Industrial Revolution, not simply because they have the potential to bring down the price of consumer goods, but because, for the first time, a central tenet of capitalism—that increased productivity requires increased human labor—will no longer hold. And once productivity is unmoored from labor, he argues, capitalism will not be able to support itself, either ideologically or practically.
That is not to say “all will fail” due to technology. Some will succeed wildly.
Michelle Phan has been able to leverage her popularity on YouTube to launch a makeup subscription service which is at an $84 million per year revenue run rate.
Those at the top of the hierarchy will get an additional boost. Such edge case success stories will be marketed offline to pull more people onto the platform.
Google is promoting Japanese Youtube creators’ personal brands in heavy TV, online, and print ads. Trying to recruit more maybe?— Patrick McKenzie (@patio11) November 1, 2014
While a “star based” compensation system makes a few people well off, most people publishing on those platforms won’t see any financial benefit from their efforts. Worse yet, a lot of the “viral” success stories are driven by a large ad budget.
Category after category gets commoditized, platform after platform gets funded by Google, and ultimately employees working on them will long for the days where their wages were held down by illegal collusion rather than the crowdsourcing fate they face:
Workers, in turn, have more mobility and a semblance of greater control over their working lives. But is any of it worth it when we can’t afford health insurance or don’t know how much the next gig might pay, or when it might come? When an app’s terms of service agreement is the closest thing we have to an employment contract? When work orders come through a smartphone and we know that if we don’t respond immediately, we might not get such an opportunity again? When we can’t even talk to another human being about the task at hand and we must work nonstop just to make minimum wage?
Just as people get commoditized, so do other layers of value:
- those who failed to see the value of & invest in domain names cheered as names lost value
- Google philosophically does not believe in customer service, but they feel they should grade others on their customer service and remove businesses which don’t respond well to culturally unaware outsourced third world labor.
- Google gets people to upload pirated content to YouTube & then uses the existence of the piracy to sell ads. They can also use YouTube to report on the quality of various ISPs, while also blocking competing search engines and apps from being able to access YouTube.
- PageRank extracts the value of editorial votes, allowing Google to leverage the work of editors to refine their search rankings without requiring people to visit the sites. Mix in the couple years Google spent fearmongering about links and it is not surprising to see the Yahoo Directory shut down.
- Remember the time a Google “contractor” did a manual scrape of Mocality in Kenya, then called up the businesses and lied to them to try to get them onto Google? I’d link to the original Mocality blog post, but Mocality Kenya has shut down.
- Google also scraped reviews (sometimes without attribution) & only failed at it when players large enough to be heard by Government regulators complained. This is why government scrutiny of Google is so important. It is one of the few things they actually respond to.
In SEO for a number of years many people have painted brand as the solution to everything. But consider the hotel search results which are 100% monetized above the fold – even if you have a brand, you still must pay to play. Or consider the Google Shopping ads which are now being tested on branded navigational searches.
Google even obtained a patent for targeting ads aimed at monetizing named entities.
You paid to build the brand. Then you pay Google again – “or else.”
One could choose to opt out of Google ad products so as not to pay to arbitrage themselves, but Google is willing to harm their own relevancy to extract revenues.
A search in the UK for the trademark term [cheapflights] is converted into the generic search [cheap flights]. The official site is ranking #2 organically and is the 20th clickable link in the left rail of the search results.
As much as brand is an asset, it also becomes a liability if you have to pay again for every time someone looks for your brand.
Mobile apps may be a way around Google, but again it is worth noting Google owns the operating system and guarantees themselves default placement across a wide array of verticals through bundling contracts with manufacturers. Another thing worth considering with mobile is new notification features tied to the operating systems are unbundling apps & Google has apps like Google Now which tie into many verticals.
As SEOs for a long time we had value in promoting the adoption of Google’s ecosystem. As Google attempts to capture more value than they create we may no longer gain by promoting the adoption of their ecosystem, but given their…
- cash hoard
- lobbyists
- ex-employees in key government roles
- control over video, mobile, apps, maps, email, analytics (along with search)
- broad portfolio of investments
… it is hard to think they’ve come anywhere close to peaking.
How Does Yahoo! Increase Search Ad Clicks?
One wonders how Yahoo Search revenues keep growing even as Yahoo’s search marketshare is in perpetual decline.
Then one looks at a Yahoo SERP and quickly understands what is going on.
Here’s a Yahoo SERP test I saw this morning
I sometimes play a “spot the difference” game with my wife. She’s far better at it than I am, but even to a blind man like me there are about a half-dozen enhancements to the above search results to juice ad clicks. Some of them are hard to notice unless you interact with the page, but here are a few of the ones I noticed…
| Yahoo Ads | Yahoo Organic Results
Placement | top of the page | below the ads |
Background color | none / totally blended | none |
Ad label | small gray text to right of advertiser URL | n/a |
Sitelinks | often 5 or 6 | usually none, unless branded query |
Extensions | star ratings, etc. | typically none |
Keyword bolding | on for title, description, URL & sitelinks | off |
Underlines | ad title & sitelinks, URL on scroll over | off |
Click target | entire background of ad area is clickable | only the listing title is clickable |
What is even more telling about how Yahoo disadvantages the organic result set: when one of their own verticals is included in the results, it gets the keyword bolding which is missing from the other listings. Some of their organic result sets are crazy with the amount of vertical inclusions. On a single result set I’ve seen separate “organic” inclusions for
- Yahoo News
- stories on Yahoo
- Yahoo Answers
They also have other inclusions like shopping search, local search, image search, Yahoo screen, video search, Tumblr and more.
Here are a couple examples.
This one includes an extended paid affiliate listing with SeatGeek & Tumblr.
This one includes rich formatting on Instructables and Yahoo Answers.
This one includes product search blended into the middle of the organic result set.
Google SEO Services (BETA)
When Google acquired DoubleClick Larry Page wanted to keep the Performics division offering SEM & SEO services just to see what would happen. Other Google executives realized the absurd conflict of interest and potential antitrust issues, so they overrode ambitious Larry: “He wanted to see how those things work. He wanted to experiment.”
Webmasters have grown tired of Google’s duplicity as the search ecosystem shifts to pay to play, or go away.
@davidiwanow I understand the problem, just not the complaints. Google won. Find another oppty, or pay Google. Simple.— john andrews (@searchsleuth999) November 5, 2014
Google’s webmaster guidelines can be viewed as reasonable and consistent or as an anti-competitive tool. As Google eats the ecosystem, those thrown under the bus shift their perspective.
Scraping? AOG (unless we do it) Affiliate? Fucking scumbags mainly AOG (unless we get into the space) Thin content? AOG (unless we do it)— Rae Hoffman (@sugarrae) November 5, 2014
Within some sectors larger players can repeatedly get scrutiny for the same offense with essentially no response, whereas smaller players operating in that same market are slaughtered because they are small.
At this point, Google should just come out and be blunt, “any form of promotion that does not involve paying us is against our guidelines.”— Rae Hoffman (@sugarrae) November 5, 2014
Access to lawyers, politicians & media outlets = access to benefit of the doubt.
Lack those & BEST OF LUCK TO YOU ;)
And most of all, I’m tired of having to tell SMBs that Google gives zero fucks when it comes to them— Rae Hoffman (@sugarrae) November 5, 2014
Google’s page asking “Do you need an SEO?” uses terms like: scam, illicit and deceptive to help frame the broader market perception of SEO.
If ranking movements appear random & non-linear then it is hard to make sense of continued ongoing investment. The less stable Google makes the search ecosystem, the worse they make SEOs look, as…
- anytime a site ranks better, that anchors the baseline expectation of where rankings should be
- large rank swings create friction in managing client communications
- whenever search traffic falls drastically it creates real world impacts on margins, employment & inventory levels
Matt Cutts stated it is a waste of resources for him to be a personal lightning rod for criticism from black hat SEOs. When Matt mentioned he might not go back to his old role at Google some members of the SEO industry were glad. In response some other SEOs mentioned black hats have nobody to blame but themselves & it is their fault for automating things.
After all, it is not like Google arbitrarily shifts their guidelines overnight and drastically penalizes websites to a disproportionate degree ex-post-facto for the work of former employees, former contractors, mistaken/incorrect presumed intent, third party negative SEO efforts, etc.
Oh … wait … let me take that back.
Indeed Google DOES do that, which is where much of the negative sentiment Matt complained about comes from.
Recall that when Google went after guest posts, a site which had a single useful guest post on it got a sitewide penalty.
Around that time it was noted Auction.com had thousands of search results for text which was in some of their guest posts.
Enjoying Aaron murdering http://t.co/UadnmwekM7 RT @aaronwall: “about 9,730 results” http://t.co/Sms5L2BFGY— Brian Provost (@brianprovost) April 9, 2014
About a month before the guest post crack down, Auction.com received a $50 million investment from Google Capital.
- Publish a single guest post on your site = Google engineers shoot & ask questions later.
- Publish a duplicated guest post on many websites, with Google investment = Google engineers see it as a safe, sound, measured, reasonable, effective, clean, whitehat strategy.
The point of highlighting that sort of disconnect was not to “out” someone, but rather to highlight the (il)legitimacy of the selective enforcement. After all, …
@mvandemar @brianprovost if anyone should have the capital needed to “do things the right way, as per G” it should be G & those G invests in— aaron wall (@aaronwall) April 9, 2014
But perhaps Google has decided to change their practices and have a more reasonable approach to the SEO industry.
An encouraging development on this front was when Auction.com was once again covered in Bloomberg. They not only benefited from leveraging Google’s data and money, but Google also offered them another assist:
Closely held Auction.com, which is valued at $1.2 billion, based on Google’s stake, also is working with the Internet company to develop mobile and Web applications and improve its search-engine optimization for marketing, Sharga said.
“In a capitalist system, [Larry Page] suggests, the elimination of inefficiency through technology has to be pursued to its logical conclusion.” ― Richard Waters
With that in mind, one can be certain Google didn’t “miss” the guest posts by Auction.com. Enforcement is selective, as always.
“The best way to control the opposition is to lead it ourselves.” ― Vladimir Lenin
Whether you turn left or right, the road leads to the same goal.
Use Verticals To Increase Reach
In the last post, we looked at how SEO has always been changing, but one thing remains constant – the quest for information.
Given people will always be on a quest for information, and given there is no shortage of information but there is limited time, there will always be a marketing imperative to get your information seen either ahead of the competition or in places the competition hasn’t yet targeted.
Channels
My take on SEO is broad because I’m concerned with the marketing potential of the search process, rather than just the behaviour of the Google search engine. We know the term SEO stands for Search Engine Optimization. It’s never been particularly accurate, and less so now, because what most people are really talking about is not SEO, but GO.
Google Optimization.
Still, the term SEO has stuck. The search channel used to have many faces, including Alta Vista, Inktomi, Ask, Looksmart, MSN, Yahoo, Google and the rest, hence the label SEO. Now, it’s pretty much reduced down to one. Google. Okay, there’s BingHoo, but really, it’s Google, 24/7.
We used to optimize for multiple search engines because we had to be everywhere the visitor was, and the search engines had different demographics. There was a time when Google was the choice of the tech savvy web user. These days, “search” means “Google”. You and your grandmother use it.
But people don’t spend most of their time on Google.
Search Beyond Google
The techniques for SEO are widely discussed, dissected, debated, ridiculed, encouraged and we’ve heard all of them, many times over. And that’s just GO.
The audience we are trying to connect with, meanwhile, is on a quest for information. On their quest for information, they will use many channels.
So, who is Google’s biggest search competitor? Bing? Yahoo?
Eric Schmidt thinks it’s Amazon:
“Many people think our main competition is Bing or Yahoo,” he said during a visit to Native Instruments, a software and hardware company in Berlin. “But, really, our biggest search competitor is Amazon. People don’t think of Amazon as search, but if you are looking for something to buy, you are more often than not looking for it on Amazon.” … Schmidt noted that people are looking for a different kind of answers on Amazon’s site through the slew of reviews and product pages, but it’s still about getting information
An important point. For the user, it’s all about “getting information”. In SEO, verticals are often overlooked.
Client Selection & Getting Seen In The Right Places
I’m going to digress a little….how do you select clients, or areas to target?
I like to start from the audience side of the equation. Who are the intended audience, what does that audience really need, and where, on the web, are they? I then determine if it’s possible/plausible to position well for this intended audience within a given budget.
There is much debate amongst SEOs about what happens inside the Google black box, but we all have access to Google’s actual output in the form of search results. To determine the level of competition, examine the search results. Go through the top ten or twenty results for a few relevant keywords and see which sites Google favors, and try to work out why.
Once you look through the results and analyze the competition, you’ll get a good feel for what Google likes to see in that specific sector. Are the search results heavy on long-form information? Mostly commercial entities? Are sites large and established? New and up and coming? Do the top sites promote visitor engagement? Who links to them and why? Is there a lot of news mixed in? Does it favor recency? Are Google pulling results from industry verticals?
It’s important to do this analysis for each project, rather than rely on prescriptive methods. Why? Because Google treats sectors differently. What works for “travel” SEO may not work for “casino” SEO because Google may be running different algorithms.
Once you weed out the wild speculation about algorithms, SEO discussion can contain much truth. People convey their direct experience and will sometimes outline the steps they took to achieve a result. However, often specific techniques aren’t universally applicable due to Google treating topic areas differently. So spend a fair bit of time on competitive analysis. Look closely at the specific results set you’re targeting to discover what is really working for that sector, out in the wild.
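To make that competitive analysis a little more systematic, you can tally which domains keep showing up across the top results for your target keywords. Here's a minimal Python sketch; it assumes you've exported the top ten or twenty results per keyword into a CSV with keyword, position and url columns (the file name and column names are mine, not from any particular tool), so adapt it to whatever rank checker you use.

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Assumed input: serp_export.csv with columns keyword, position, url,
# one row per result in the top 10-20 for each keyword you care about.
def domains_by_visibility(path="serp_export.csv", max_position=20):
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["position"]) <= max_position:
                domain = urlparse(row["url"]).netloc.lower().replace("www.", "")
                counts[domain] += 1
    return counts

if __name__ == "__main__":
    for domain, appearances in domains_by_visibility().most_common(20):
        print(f"{domain}: appears {appearances} times in the top results")
```

Domains that appear again and again across your keyword set are the ones worth studying closely.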
It’s at this point where you’ll start to see cross-overs between search and content placement.
The Role Of Verticals
You could try and rank for term X, and you could feature on a site that is already ranked for X. Perhaps Google is showing a directory page or some industry publication. Can you appear on that directory page or write an article for this industry publication? What does it take to get linked to by any of these top ten or twenty sites?
Once search visitors find that industry vertical, what is their likely next step? Do they sign up for a regular email? Can you get placement on those emails? Can you get an article well placed in some evergreen section on their site? Can you advertise on their site? Figure out how visitors would engage with that site and try to insert yourself, with grace and dignity, into that conversation.
Users may by-pass Google altogether and go straight to verticals. If they like video then YouTube is the obvious answer. A few years ago when Google was pushing advertisers to run video ads they pitched YouTube as the #2 global search engine. What does it take to rank in YouTube in your chosen vertical? Create videos that will be found in YouTube search results, which may also appear on Google’s main search results.
With 200,000 videos uploaded per day, more than 600 years required to view all those videos, more than 100 million videos watched daily, and more than 300 million existing accounts, if you think YouTube might not be an effective distribution channel to reach prospective customers, think again.
There’s a branding parallel here too. If the field of SEO is too crowded, you can brand yourself as the expert in video SEO.
There’s also the ubiquitous Facebook.
Facebook, unlike the super-secret Google, has shared their algorithm for ranking content on Facebook and filtering what appears in the news feed. The algorithm consists of three components…..
If you’re selling stuff, then are you on Amazon? Many people go directly to Amazon to begin product searches, information gathering and comparisons. Are you well placed on Amazon? What does it take to be placed well on Amazon? What are people saying? What are their complaints? What do they like? What language do they use?
In 2009, nearly a quarter of shoppers started research for an online purchase on a search engine like Google and 18 percent started on Amazon, according to a Forrester Research study. By last year, almost a third started on Amazon and just 13 percent on a search engine. Product searches on Amazon have grown 73 percent over the last year while searches on Google Shopping have been flat, according to comScore
All fairly obvious, but may help you think about channels and verticals more, rather than just Google. The appropriate verticals and channels will be different for each market sector, of course. And they change over time as consumer tastes & behaviors change. At some point each of these were new: blogging, Friendster, MySpace, Digg, Facebook, YouTube, Twitter, LinkedIn, Instagram, Pinterest, Snapchat, etc.
This approach will also help us gain a deeper understanding of the audience and their needs – particularly the language people use, the questions they ask, and the types of things that interest them most – which can then be fed back into your search strategy. Emulate whatever works in these verticals. Look to create a unique, deep collection of insights about your chosen keyword area. This will in turn lead to strategic advantage, as your competition is unlikely to find such specific information pre-packaged.
This could also be characterised as “content marketing”, which it is, although I like to think of it all as “getting in front of the visitor’s quest for information”. Wherever the visitors are, that’s where you go, and then figure out how to position well in that space.
The Only Thing Certain In SEO Is Change
SEO is subject to frequent change, but in the last year or two, the changes feel both more frequent and significant than changes in the past. Florida hit in 2003. Since then, it’s like we get a Florida every six months.
Whenever Google updates the underlying landscape, the strategies need to change in order to deal with it. No fair warning. That’s not the game.
From Tweaks To Strategy
There used to be a time when SEOs followed a standard prescription. Many of us remember a piece of software called Web Position Gold.
Web Position Gold emerged when SEO could be reduced to a series of repeatable – largely technical – steps. Those steps involved adding keywords to a page, repeating those keywords in sufficient density, checking a few pieces of markup, then scoring against an “ideal” page. Upload to web. Add a few links. Wait a bit. Run a web ranking report. Voila! You’re an SEO. In all but the most competitive areas, this actually worked.
Seems rather quaint these days.
These days, you could do all of the above and get nowhere. Or you might get somewhere, but with so many more factors in play, they can’t be isolated to an individual page score. If the page is published on a site with sufficient authority, it will do well almost immediately. If it appears on a little known site, it may remain invisible for a long time.
Before Google floated in 2004, they released an investor statement signalling SEO – well, “index spammers” – as a business risk. If you ever want to know what Google really feels about people who “manipulate” their results, it’s right here:
We are susceptible to index spammers who could harm the integrity of our web search results.
There is an ongoing and increasing effort by “index spammers” to develop ways to manipulate our web search results. For example, because our web search technology ranks a web page’s relevance based in part on the importance of the web sites that link to it, people have attempted to link a group of web sites together to manipulate web search results. We take this problem very seriously because providing relevant information to users is critical to our success. If our efforts to combat these and other types of index spamming are unsuccessful, our reputation for delivering relevant information could be diminished. This could result in a decline in user traffic, which would damage our business.
SEO competes with the Adwords business model. So, Google “take very seriously” the activities of those who seek to figure out the algorithms, reverse engineer them, and create push-button tools like Web Position Gold. We’ve had Florida, and Panda, and Penguin, and Hummingbird, all aimed at making the search experience better for users, whilst having the pleasant side effect, as far as Google is concerned, of making life more difficult for SEOs.
I think the key part of Google’s statement was “delivering relevant information”.
From Technical Exercise To PR
SEO will always involve technical aspects. You get down into code level and mark it up. The SEO needs to be aware of development and design and how those activities can affect SEO. The SEO needs to know how web servers work, and how spiders can sometimes fail to deal with their quirks.
But in the years since Florida, marketing aspects have become more important. An SEO can perform the technical aspects of SEO and get nowhere. More recent algorithms, such as Panda and Penguin, gauge the behaviour of users, as Google tries to determine the information quality of pages. Hummingbird attempts to discover the intent that lies behind keywords.
As a result, keyword-based SEO is in the process of being killed off. Google withholds keyword referrer data and their various algorithms attempt to deliver pages based on a user’s intent and activity – both prior and present – in order to deliver relevant information. Understanding the user, having a unique and desirable offering, and holding a defensible market position are more important than any keyword markup. The keyword match, on which much SEO is based, is not an approach that is likely to endure.
The emphasis has also shifted away from the smaller operators and now appears to favour brands. This occurs not because brands are categorized as “brands”, but due to the side effects of significant PR activities. Bigger companies tend to run multiple advertising and PR campaigns, so produce signals Google finds favorable i.e. search volume on company name, semantic associations with products and services, frequent links from reputable media, and so on. This flows through into rank. And it also earns them leeway when operating in the gray area where manual penalties are handed out to smaller & weaker entities for the same activities.
Rankings
Apparently, Google killed off toolbar PageRank.
We will probably not going to be updating it [PageRank] going forward, at least in the Toolbar PageRank.
A few people noted it, but the news won’t raise many eyebrows as toolbar PR has long since become meaningless. Are there any SEOs altering what they do based on toolbar PR? It’s hard to imagine why. The reality is that an external PR value might indicate an approximate popularity level, but this isn’t an indicator of the subsequent ranking a link from such a page will deliver. There are too many other factors involved. If Google are still using an internal PR metric, it’s likely to be a significantly more complicated beast than was revealed in 1997.
A PageRank score is a proxy for authority. I’m quite sure Google kept it going as an inside joke.
A much more useful proxy for authority is the top ten pages in any niche. Google has determined all well-ranking pages have sufficient authority, and no matter what the toolbar, or any other third-party proxy, says, it’s Google’s output that counts. A link from any one of the top ten pages will likely confer a useful degree of authority, all else being equal. It’s good marketing practice to be linked from, and engage with, known leaders in your niche. That’s PR, as in public relations thinking, vs PR (PageRank), technical thinking.
The next to go will likely be keyword-driven SEO. Withholding keyword referral data was the beginning of the end. Hummingbird is hammering in the nails. Keywords are still great for research purposes – to determine if there’s an audience and what the size of that audience may be – but SEO is increasingly driven by semantic associations and site categorizations. It’s not enough to feature a keyword on a page. A page, and site, needs to be about that keyword, and keywords like it, and be externally recognized as such. In the majority of cases, a page needs to match user intent, rather than just a search term. There are many exceptions, of course, but given what we know about Hummingbird, this appears to be the trend.
People will still look at rank, and lust after prize keywords, but really, rankings have been a distraction all along. Reach and specificity are more important i.e. where’s the most value coming from? The more specific the keyword, typically the lower the bounce rate and the higher the conversion rate. The lower the bounce rate, and the higher the conversion rate, the more positive signals the site will generate, which will flow back into a ranking algorithm increasingly being tuned for engagement. Ranking for any keyword that isn’t delivering business value makes no sense.
There are always exceptions. But that’s the trend. Google are looking for pages that match user intent, not just pages that match a keyword term. In terms of reach, you want to be everywhere your customers are.
Search Is The Same, But Different
To adapt to change, SEOs should think about search in the widest possible terms. A search is a quest for information. It may be an active, self-directed search, in the form of a search engine query. Or a more passive search, delivered via social media subscriptions and the act of following. How will all these activities feed into your search strategy?
Sure, it’s not a traditional definition of SEO, as I’m not limiting it to search engines. Rather, my point is about the wider quest for information. People want to find things. Eric Schmidt recently claimed Amazon is Google’s biggest competitor in search. The mechanisms and channels may change, but the quest remains the same. Take, for example, the changing strategy of BuzzFeed:
Soon after Peretti had turned his attention to BuzzFeed full-time in 2011, after leaving the Huffington Post, BuzzFeed took a hit from Google. The site had been trying to focus on building traffic from both social media marketing and through SEO. But the SEO traffic — the free traffic driven from Google’s search results — dried up.
Reach is important. Topicality is important. Freshness, in most cases, is important. Engagement is important. Finding information is not just about a technical match of a keyword, it’s about an intellectual match of an idea. BuzzFeed didn’t take their eye off the ball. They know helping users find information is the point of the game they are in.
And the internet has only just begun.
In terms of the internet, nothing has happened yet. The internet is still at the beginning of its beginning. If we could climb into a time machine and journey 30 years into the future, and from that vantage look back to today, we’d realize that most of the greatest products running the lives of citizens in 2044 were not invented until after 2014. People in the future will look at their holodecks, and wearable virtual reality contact lenses, and downloadable avatars, and AI interfaces, and say, oh, you didn’t really have the internet (or whatever they’ll call it) back then.
In 30 years time, people will still be on the exact same quest for information. The point of SEO has always been to get your information in front of visitors, and that’s why SEO will endure. SEO was always a bit of a silly name, and it often distracts people from the point, which is to get your stuff seen ahead of the rest.
Some SEOs have given up in despair because it’s not like the old days. It’s becoming more expensive to do effective SEO, and the reward may not be there, especially for smaller sites. However, this might be to miss the point, somewhat.
The audience is still there. Their needs haven’t changed. They still want to find stuff. If SEO is all about helping users find stuff, then that’s the important thing. Remember the “why”. Adapt the “how”.
In the next few articles, we’ll look at the specifics of how.
Measuring SEO Performance After “Not Provided”
In recent years, the biggest change to the search landscape happened when Google chose to withhold keyword data from webmasters. At SEOBook, Aaron noticed and wrote about the change, as ever more keyword data disappeared.
The motivation to withhold this data, according to Google, was privacy concerns:
SSL encryption on the web has been growing by leaps and bounds. As part of our commitment to provide a more secure online experience, today we announced that SSL Search on https://www.google.com will become the default experience for signed in users on google.com.
At first, Google suggested it would only affect a single-digit percentage of search referral data:
Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com
…which didn’t turn out to be the case. It now affects almost all keyword referral data from Google.
Was it all about privacy? Another rocket over the SEO bows? Bit of both? Probably. In any case, the search landscape was irrevocably changed. Instead of being shown the keyword term the searcher had used to find a page, webmasters were given the less than helpful “not provided”. This change rocked SEO. The SEO world, up until that point, had been built on keywords. SEOs chose a keyword. They ranked for the keyword. They tracked click-thrus against the keyword. This was how many SEOs proved their worth to clients.
These days, very little keyword data is available from Google. There certainly isn’t enough keyword data to use as a primary form of measurement.
Rethinking Measurement
This change forced a rethink about measurement, and SEO in general. Whilst there is still some keyword data available from the likes of Webmaster Tools & the AdWords paid versus organic report, keyword-based SEO tracking approaches are unlikely to align with Google’s future plans. As we saw with the Hummingbird algorithm, Google is moving towards searcher-intent based search, as opposed to keyword-matched results.
Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words
The search bar is still keyword based, but Google is also trying to figure out what user intent lies behind the keyword. To do this, they’re relying on context data. For example, they look at what previous searches the user has made, their location, how the query itself breaks down, and so on, all of which can change the search results the user sees.
When SEO started, it was in an environment where the keyword the user typed into a search bar was matched exactly against a keyword that appeared on a page. That was what relevance meant. SEO continued with this model, but it’s fast becoming redundant, because Google is increasingly relying on context in order to determine searcher intent, while filtering out many results which were too aligned with the old strategy. As a result, much SEO has shifted from keywords to wider digital marketing considerations, such as what the visitor does next.
We’ve Still Got Great Data
Okay, if SEOs don’t have keywords, what can they use?
If we step back a bit, what we’re really trying to do with measurement is demonstrate value. Value of search vs other channels, and value of specific search campaigns. Did our search campaigns meet our marketing goals and thus provide value?
Do we have enough data to demonstrate value? Yes, we do. Here are a few ways SEOs can look at the organic search data they still get, and use it to demonstrate value.
1. Organic Search VS Other Activity
Is our organic search tracking well when compared with other digital marketing channels, such as social or email? About the same? Falling?
In many ways, the withholding of keyword data can be a blessing, especially to those SEOs who have a few ranking-obsessed clients. A ranking, in itself, is worthless, especially if it’s generating no traffic.
Instead, if we look at the total amount of organic traffic, and see that it is rising, then we shouldn’t really care too much about what keywords it is coming from. We can also track organic searches across device, such as desktop vs mobile, and get some insight into how best to optimize those channels for search as a whole, rather than by keyword. It’s important that the traffic came from organic search, rather than from other campaigns. It’s important that the visitors saw your site. And it’s important what that traffic does next.
2. Bounce Rate
If a visitor comes in, doesn’t like what is on offer, and clicks back, then that won’t help rankings. Google have been a little oblique on this point, saying they aren’t measuring bounce rate, but I suspect it’s a little more nuanced, in practice. If people are failing to engage, then anecdotal evidence suggests this does affect rankings.
Look at the behavioral metrics in GA; if your content has 50% of people spending less than 10 seconds, that may be a problem or that may be normal. The key is to look below that top graph and see if you have a bell curve or if the next largest segment is the 11-30 second crowd.
Either way, we must encourage visitor engagement. Even small improvements in terms of engagement can mean big changes in the bottom line. Getting visitors to a site was only ever the first step in a long chain. It’s what they do next that really makes or breaks a web business, unless the entire goal was that the visitor should only view the landing page. Few sites, these days, would get much return on non-engagement.
PPCers are naturally obsessed with this metric, because each click is costing them money, but when you think about it, it’s costing SEOs money, too. Clicks are getting harder and harder to get, and each click does have a cost associated with it i.e. the total cost of the SEO campaign divided by the number of clicks, so each click needs to be treated as a cost.
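As a back-of-the-envelope illustration of that idea, with made-up numbers:

```python
def cost_per_organic_click(campaign_cost, organic_clicks):
    # Treats SEO spend like media spend: every click carries a share of the cost.
    return campaign_cost / organic_clicks

# Illustrative figures only.
seo_cost = 5000.0        # monthly SEO spend
organic_clicks = 2500    # organic visits attributed to that spend
ppc_cpc = 3.20           # average AdWords CPC for comparable keywords

effective_cpc = cost_per_organic_click(seo_cost, organic_clicks)
print(f"Effective organic CPC: ${effective_cpc:.2f} vs paid CPC ${ppc_cpc:.2f}")
```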
3. Landing Pages
We can still do landing page analysis. We can see the pages where visitors are entering the website. We can also see which pages are most popular, and we can tell from the topic of the page what type of keywords people are using to find it.
We could add more related keywords to these pages and see how they do, or create more pages on similar themes, using different keyword terms, and then monitor the response. Similarly, we can look at poorly performing pages and make the assumption these are not ranking against intended keywords, and mark these for improvement or deletion.
We can see how old pages vs new pages are performing in organic search. How quickly do new pages get traffic?
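If you want to put rough numbers on the old-vs-new question, here's a small sketch, assuming you've exported landing page data to a CSV with page, published date and organic session columns (the file and column names are illustrative; adapt them to your analytics export):

```python
import csv
from datetime import date

# Assumed input: landing_pages.csv with columns page, published (YYYY-MM-DD),
# organic_sessions.
def split_old_vs_new(path="landing_pages.csv", cutoff=date(2014, 1, 1)):
    old, new = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sessions = int(row["organic_sessions"])
            published = date.fromisoformat(row["published"])
            (new if published >= cutoff else old).append(sessions)
    return sum(old), sum(new)

if __name__ == "__main__":
    old_total, new_total = split_old_vs_new()
    print(f"Organic entrances - older pages: {old_total}, newer pages: {new_total}")
```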
We’re still getting a lot of actionable data, and still not one keyword in sight.
4. Visitor And Customer Acquisition Value
We can still calculate the value to the business of an organic visitor.
We can also look at what step in the process organic visitors are converting. Early? Late? Why? Is there some content on the site that is leading them to convert better than other content? We can still determine if organic search provided a last-click conversion, or a conversion as the result of a mix of channels, where organic played a part. We can do all of this from aggregated organic search data, with no need to look at keywords.
5. Contrast With PPC
We can contrast Adwords data back against organic search. Trends we see in PPC might also be working in organic search.
For AdWords our life is made infinitesimally easier because by linking your AdWords account to your Analytics account rich AdWords data shows up automagically allowing you to have an end-to-end view of campaign performance.
Even PPC-ers are having to change their game around keywords:
The silver lining in all this? With voice and mobile search, you’ll likely catch those conversions that you hadn’t before. While you may think that you have everything figured out and that your campaigns are optimal, this matching will force you into deeper dives that hopefully uncover profitable PPC pockets.
6. Benchmark Against Everything
In the above section I highlighted comparing organic search to AdWords performance, but you can benchmark against almost any form of data.
Is 90% of your keyword data (not provided)? Then you can look at the 10% which is provided to estimate performance on the other 90% of the traffic. If you get 1,000 monthly keyword visits for [widgets], then as a rough rule of thumb you might get roughly 9,000 monthly visits for that same keyword shown as (not provided).
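Here's that rule of thumb as a worked example in Python; the 90% (not provided) share and the visit counts are purely illustrative:

```python
def estimate_hidden_visits(provided_visits, not_provided_share=0.9):
    # If 90% of referrals are (not provided), the visible 10% implies
    # roughly nine hidden visits for every visible one.
    hidden = provided_visits * not_provided_share / (1 - not_provided_share)
    return hidden, provided_visits + hidden

visible = 1000  # monthly visits where [widgets] is still reported
hidden, total = estimate_hidden_visits(visible)
print(f"[widgets]: ~{hidden:.0f} visits hidden in (not provided), ~{total:.0f} total")
```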
Has your search traffic gone up or down over the past few years? Are there seasonal patterns that drive user behavior? How important is the mobile shift in your market? What landing pages have performed the best over time and which have fallen hardest?
How is your site’s aggregate keyword ranking profile compared to top competitors? Even if you don’t have all the individual keyword referral data from search engines, seeing the aggregate footprints, and how they change over time, indicates who is doing better and who is gaining exposure vs losing it.
Numerous competitive research tools like SEM Rush, SpyFu & SearchMetrics provide access to that type of data.
You can also go further with other competitive research tools which look beyond the search channel. Is most of your traffic driven from organic search? Do your competitors do more with other channels? A number of sites like Compete.com and Alexa have provided estimates for this sort of data. Another newer entrant into this market is SimilarWeb.
And, finally, rank checking still has some value. While rank tracking may seem futile in the age of search personalization and Hummingbird, it can still help you isolate performance issues during algorithm updates. There are a wide variety of options from browser plugins to desktop software to hosted solutions.
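If you keep your rank checks in a simple log, you can flag which keywords moved sharply around a known update date. A rough sketch, assuming a CSV of daily checks with keyword, date and position columns (names are illustrative):

```python
import csv
from datetime import date, timedelta

# Assumed input: ranks.csv with columns keyword, date (YYYY-MM-DD), position,
# one row per keyword per daily rank check.
def rank_shifts_around(path, update_day, window=3, min_drop=5):
    before, after = {}, {}
    update = date.fromisoformat(update_day)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            d = date.fromisoformat(row["date"])
            pos = int(row["position"])
            if update - timedelta(days=window) <= d < update:
                before.setdefault(row["keyword"], []).append(pos)
            elif update <= d <= update + timedelta(days=window):
                after.setdefault(row["keyword"], []).append(pos)
    flagged = []
    for kw in before.keys() & after.keys():
        avg_before = sum(before[kw]) / len(before[kw])
        avg_after = sum(after[kw]) / len(after[kw])
        if avg_after - avg_before >= min_drop:  # larger position number = worse
            flagged.append((kw, avg_before, avg_after))
    return flagged

if __name__ == "__main__":
    for kw, b, a in rank_shifts_around("ranks.csv", "2014-05-20"):
        print(f"{kw}: average rank {b:.1f} -> {a:.1f}")
```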
By now, I hope I’ve convinced you that specific keyword data isn’t necessary and, in some cases, may have only served to distract some SEOs from seeing other valuable marketing metrics, such as what happens after the click and where visitors go next.
So long as the organic search traffic is doing what we want it to, we know which pages it is coming in on, and can track what it does next, there is plenty of data there to keep us busy. Lack of keyword data is a pain, but in response, many SEOs are optimizing for a lot more than keywords, and focusing more on broader marketing concerns.
Further Reading & Sources:
Loah Qwality Add Werds Clix Four U
Google recently announced they were doing away with exact match AdWords ad targeting this September. They will force all match types to have close variant keyword matching enabled. This means you get misspelled searches, plural versus singular overlap, and an undoing of your tight organization.
In some cases the user intent is different between singular and plural versions of a keyword. A singular version search might be looking to buy a single widget, whereas a plural search might be a user wanting to compare different options in the marketplace. In some cases people are looking for different product classes depending on word form:
For example, if you sell spectacles, the difference between users searching on ‘glass’ vs. ‘glasses’ might mean you are getting users seeing your ad interested in a building material, rather than an aid to reading.
Where segmenting improved the user experience, boosted conversion rates, made management easier, and improved margins – those benefits are now off the table.
CPC isn’t the primary issue. Profit margins are what matter. Once you lose the ability to segment you lose the ability to manage your margins. And this auctioneer is known to bid in their own auctions, have random large price spikes, and not give refunds when they are wrong.
An offline analogy for this loss of segmentation … you go to a gas station to get a bottle of water. After grabbing your water and handing the cashier a $20, they give you $3.27 back along with a six pack you didn’t want and didn’t ask for.
Why does a person misspell a keyword? Some common reasons include:
- they are new to the market & don’t know it well
- they are distracted
- they are using a mobile device or something which makes it hard to input their search query (and those same input issues make it harder to perform other conversion-oriented actions)
- their primary language differs from the language they are searching in
- they are looking for something else
In any of those cases, the typical average value of the expressed intent is usually going to be less than a person who correctly spelled the keyword.
Even if spelling errors were intentional and cultural, the ability to segment that and cater the landing page to match disappears. Or if the spelling error was a cue to send people to an introductory page earlier in the conversion funnel, that option is no more.
In many accounts the loss of the granular control won’t cause too big of a difference. But some advertiser accounts in competitive markets will become less profitable and more expensive to manage:
No one who’s in the know has more than about 5-10 total keywords in any one adgroup because they’re using broad match modified, which eliminated the need for “excessive keyword lists” a long time ago. Now you’re going to have to spend your time creating excessive negative keyword lists with possibly millions upon millions of variations so you can still show up for exactly what you want and nothing else.
You might not know which end of the spectrum your account is on until disaster strikes:
I added negatives to my list for 3 months before finally giving up opting out of close variants. What they viewed as a close variant was not even in the ballpark of what I sell. There have been petitions before that have gotten Google to reverse bad decisions in the past. We need to make that happen again.
Brad Geddes has held many AdWords seminars for Google. What does he think of this news?
In this particular account, close variations have much lower conversion rates and much higher CPAs than their actual match type.
…
Variation match isn’t always bad, there are times it can be good to use variation match. However, there was choice.
…
Loss of control is never good. Mobile control was lost with Enhanced Campaigns, and now you’re losing control over your match types. This will further erode your ability to control costs and conversions within AdWords.
A monopoly restricting choice to enhance their own bottom line. It isn’t the first time they’ve done that, and it won’t be the last.
Have an enhanced weekend!
Understanding The Google Penguin Algorithm
Whenever Google does a major algorithm update we all rush off to our data to see what changed in terms of rankings, search traffic, and then look for the trends to try to figure out what changed.
The two people I chat most with during periods of big a…
Guide To Optimizing Client Sites 2014
For those new to optimizing client sites, or those seeking a refresher, we thought we’d put together a guide to step you through it, along with some selected deeper reading on each topic area.
Every SEO has different ways of doing things, but we’ll cover the aspects that you’ll find common to most client projects.
Few Rules
The best rule I know about SEO is that there are few absolutes. Google is a black box, so complete data sets will never be available to you. Therefore, it can be difficult to pin down cause and effect, and there will always be a lot of experimentation and guesswork involved. If it works, keep doing it. If it doesn’t, try something else until it does.
Many opportunities tend to present themselves in ways not covered by “the rules”. Many opportunities will be unique and specific to the client and market sector you happen to be working with, so it’s a good idea to remain flexible and alert to new relationship and networking opportunities. SEO exists on the back of relationships between sites (links) and the ability to get your content remarked upon (networking).
When you work on a client site, you will most likely be dealing with a site that is already established, so it’s likely to have legacy issues. The other main challenge you’ll face is that you’re unlikely to have full control over the site, like you would if it were your own. You’ll need to convince other people of the merit of your ideas before you can implement them. Some of these people will be open to them, some will not, and some can be rather obstructive. So, the more solid data and sound business reasoning you provide, the better chance you have of convincing people.
The most important aspect of doing SEO for clients is not blinding them with technical alchemy, but helping them see how SEO provides genuine business value.
1. Strategy
The first step in optimizing a client site is to create a high-level strategy.
“Study the past if you would define the future.” – Confucius
You’re in discovery mode. Seek to understand everything you can about the client’s business and their current position in the market. What is their history? Where are they now and where do they want to be? Interview your client. They know their business better than you do and they will likely be delighted when you take a deep interest in them.
- What are they good at?
- What are their top products or services?
- What is the full range of their products or services?
- Are they weak in any areas, especially against competitors?
- Who are their competitors?
- Who are their partners?
- Is their market sector changing? If so, how? Can they think of ways in which this presents opportunities for them?
- What keyword areas have worked well for them in the past? Performed poorly?
- What are their aims? More traffic? More conversions? More reach? What would success look like to them?
- Do they have other online advertising campaigns running? If so, what areas are these targeting? Can they be aligned with SEO?
- Do they have offline presence and advertising campaigns? Again, what areas are these targeting and can they be aligned with SEO?
Some SEO consultants see their task being to gain more rankings under an ever-growing list of keywords. Ranking for more keywords, or getting more traffic, may not result in measurable business returns as it depends on the business and the marketing goals. Some businesses will benefit from honing in on specific opportunities that are already being targeted, others will seek wider reach. This is why it’s important to understand the business goals and market sector, then design the SEO campaign to support the goals and the environment.
This type of analysis also provides you with leverage when it comes to discussing specific rankings and competitor rankings. The SEO can’t be expected to wave a magic wand and place a client top of a category in which they enjoy no competitive advantage. Even if the SEO did manage to achieve this feat, the client may not see much in the way of return as it’s easy for visitors to click other listings and compare offers.
Understand all you can about their market niche. Look for areas of opportunity, such as changing demand not being met by your client or competitors. Put yourself in their customers’ shoes. Try and find customers and interview them. Listen to the language of customers. Go to places where their customers hang out online. From the customers’ language and needs, combined with the knowledge gleaned from interviewing the client, you can determine effective keywords and themes.
Document. Get it down in writing. The strategy will change over time, but you’ll have a baseline point of agreement outlining where the site is at now, and where you intend to take it. Getting buy-in early smooths the way for later on. Ensure that whatever strategy you adopt, it adds real, measurable value by being aligned with, and serving, the business goals. It’s on this basis the client will judge you, and maintain or expand your services in future.
Further reading:
– 4 Principles Of Marketing Strategy In The Digital Age
– Product Positioning In Five Easy Steps [pdf]
– Technology Marketers Need To Document Their Marketing Strategy
2. Site Audit
Sites can be poorly organized, have various technical issues, and miss keyword opportunities.
We need to quantify what is already there, and what’s not there.
- Use a site crawler, such as Xenu Link Sleuth, Screaming Frog or other tools that will give you a list of URLs, title information, link information and other data.
- Make a list of all broken links.
- Make a list of all orphaned pages
- Make a list of all pages without titles
- Make a list of all pages with duplicate titles
- Make a list of pages with weak keyword alignment
- Crawl robots.txt and hand-check it. It’s amazing how easy it is to disrupt crawling with a robots.txt file
Broken links are a low-quality signal. It’s debatable if they are a low quality signal to Google, but certainly to users. If the client doesn’t have one already, implement a system whereby broken links are checked on a regular basis. Orphaned pages are pages that have no links pointing to them. Those pages may be redundant, in which case they should be removed, or you need to point inbound links at them, so they can be crawled and have more chance of gaining rank. Page titles should be unique, aligned with keyword terms, and made attractive in order to gain a click. A link is more attractive if it speaks to a customer need. Carefully check robots.txt to ensure it’s not blocking areas of the site that need to be crawled.
As part of the initial site audit, it might make sense to include the site in Google Webmaster Tools to see if it has any existing issues there and to look up its historical performance on competitive research tools to see if the site has seen sharp traffic declines. If they’ve had sharp ranking and traffic declines, pull up that time period in their web analytics to isolate the date at which it happened, then look up what penalties might be associated with that date.
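Much of this list-making can be scripted. Here's a minimal sketch that works off a crawl exported to CSV with url, status and title columns (column names vary by crawler, so treat these as placeholders) and pulls out broken URLs, missing titles and duplicate titles:

```python
import csv
from collections import defaultdict

# Assumed input: crawl.csv exported from your crawler, with columns
# url, status, title. Adjust the column names to match your export.
def audit_crawl(path="crawl.csv"):
    broken, missing_titles = [], []
    titles = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["status"].startswith(("4", "5")):
                broken.append(row["url"])
            title = row["title"].strip()
            if not title:
                missing_titles.append(row["url"])
            else:
                titles[title].append(row["url"])
    duplicates = {t: urls for t, urls in titles.items() if len(urls) > 1}
    return broken, missing_titles, duplicates

if __name__ == "__main__":
    broken, missing, dupes = audit_crawl()
    print(f"{len(broken)} broken URLs, {len(missing)} pages without titles, "
          f"{len(dupes)} duplicated titles")
```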
Further Reading:
– Broken Links, Pages, Images Hurt SEO
– Three Easy Ways To Fix Broken Links And Stop Unnecessary Visitor Loss
– 55 Ways To Use Screaming Frog
– Robots.txt Tutorial
3. Competitive Analysis
Some people roll this into a site audit, but I’ll split it out as we’re not looking at technical issues on competitor sites, we’re looking at how they are positioned, and how they’re doing it. In common with a site audit, there’s some technical reverse engineering involved.
There are various tools that can help you do this. I use SpyFu. One reporting aspect that is especially useful is estimating the value of the SEO positions vs the Adwords positions. A client can then translate the ranks into dollar terms, and justify this back against your fee.
When you run these competitive reports, you can see what content of theirs is working well, and what content is gaining ground. Make a list of all competitor content that is doing well. Examine where their links are coming from, and make a list. Examine where they’re mentioned in the media, and make a list. You can then use a fast-follow strategy to emulate their success, then expand upon it.
Sometimes, “competitors”, meaning ranking competitors, can actually be potential partners. They may not be in the same industry as your client; they may just happen to rank in a cross-over area. They may be good for a link, become a supplier, welcome advertising on their site, or be willing to place your content on their site. Make a note of the sites that are ranking well within your niche, but aren’t direct competitors.
Using tools that estimate the value of ranks by comparing Adwords keywords prices, you can estimate the value of your competitors positions. If your client appears lower than the competition, you can demonstrate the estimated dollar value of putting time and effort into increasing rank. You can also evaluate their rate of improvement over time vs your client, and use this as a competitive benchmark. If your client is not putting in the same effort as your competitor, they’ll be left behind. If their competitors are spending on ongoing-SEO and seeing tangible results, there is some validation for your client to do likewise.
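A rough way to produce that dollar estimate yourself: multiply estimated search volume by an assumed click-through rate for the current position, then by the AdWords CPC for the keyword. The sketch below assumes a CSV with keyword, monthly_searches, rank and cpc columns, and the CTR curve is a placeholder rather than a published figure:

```python
import csv

# Very rough click-through rates by organic position; real curves vary widely
# by query type, so treat these as placeholders.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

# Assumed input: keywords.csv with columns keyword, monthly_searches, rank, cpc.
def estimated_monthly_value(path="keywords.csv"):
    total = 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ctr = CTR_BY_POSITION.get(int(row["rank"]), 0.0)
            clicks = int(row["monthly_searches"]) * ctr
            total += clicks * float(row["cpc"])  # what those clicks would cost in AdWords
    return total

if __name__ == "__main__":
    print(f"Estimated value of current organic positions: ${estimated_monthly_value():,.0f}/month")
```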
Further reading:
– Competitor Analysis [pdf]
– Illustrated SEO Competitive Workflow
– Competitive Analysis: How To Become A SEO Hero In 4 Steps
4. Site Architecture
A well organised site is useful from both a usability standpoint and an SEO standpoint. If it’s clear to a user where they need to go next, then this will flow through into better engagement scores. If your client has a usability consultant on staff, this person is a likely ally.
It’s a good idea to organise a site around themes. Anecdotal evidence suggests that Google likes pages grouped around similar topics, rather than disparate topics (see from 1.25 onwards).
- Create a spreadsheet based on a crawl after any errors have been tidied up
- Identify best selling products and services. These deserve the most exposure and should be placed high up the site hierarchy. Items and categories that do not sell well, and are less strategically important, should be lower in the hierarchy
- Pages that are already getting a lot of traffic, as indicated by your analytics, might deserve more exposure by moving them up the hierarchy.
- Seasonal products might deserve more exposure just before that shopping season, and less exposure when the offer is less relevant.
- Group pages into similar topics, where possible. For example, acme.com/blue-widgets/ , acme.com/green-widgets/.
- Determine if internal anchor text is aligned with keyword titles and page content by looking at a backlink analysis
A spreadsheet of all pages helps you group pages thematically, preferably into directories with similar content. Your strategy document will guide you as to which pages you need to work on, and which pages you need to relegate. Some people spend a lot of time sculpting internal PageRank i.e. flowing PageRank to some pages, but using nofollow on other links to not pass link equity to others. Google may have deprecated that approach, but you can still link to important products or categories sitewide to flow them more link equity, while putting less important pages lower in the site’s architecture. Favour your money pages, and relegate your less important pages.
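One way to see which themes deserve prominence is to group crawled URLs by their first directory and sum the organic traffic each group earns. A small sketch, assuming a CSV with url and organic_sessions columns (names illustrative):

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Assumed input: pages.csv with columns url, organic_sessions.
def traffic_by_section(path="pages.csv"):
    sections = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            path_parts = urlparse(row["url"]).path.strip("/").split("/")
            section = path_parts[0] if path_parts and path_parts[0] else "(root)"
            sections[section] += int(row["organic_sessions"])
    return sections

if __name__ == "__main__":
    for section, sessions in traffic_by_section().most_common():
        print(f"/{section}/: {sessions} organic sessions")
```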
Think mobile. If your content doesn’t work on mobile, then getting to the top of search results won’t do you much good.
Further Reading:
– Site Architecture & Search Engine Success Factors
– Optimizing Your Website’s Architecture For SEO (Slide Presentation)
– The SEO Guide To Information Architecture
5. Enable Crawling & Redirects
Ensure your site is deep crawled. To check if all your URLs are included in Google’s index, sign up with Webmaster Tools and/or other index reporting tools.
- Include a site map
- Check the existing robots.txt. Keep robots out of non-essential areas, such as script repositories and other admin related directories.
- If you need to move pages, or you have links to pages that no longer exist, use page redirects to tidy them up
- Make a list of 404 errors. Make sure the 404 page has useful navigation into the site so visitors don’t click back.
The accepted method to redirect a page is to use a 301. The 301 indicates a page has permanently moved location. A redirect is also useful if you change domains, or if you have links pointing to different versions of the site. For example, Google sees http://www.acme.com and http://acme.com as different sites. Pick one and redirect to it.
If you don’t redirect pages, then you won’t be making full use of any link juice allocated to those pages.
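It's worth spot-checking that your redirects really do return a 301 and point where you intend. Here's a small sketch using Python's standard library; the URLs are examples only:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Return None so the redirect is not followed; the 3xx response then
    # surfaces as an HTTPError we can inspect.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

# Illustrative list of old URLs that should 301 to their new locations.
old_urls = [
    "http://acme.com/",               # should 301 to http://www.acme.com/
    "http://www.acme.com/old-page",   # should 301 to the replacement page
]

for url in old_urls:
    try:
        resp = opener.open(url, timeout=10)
        status, location = resp.getcode(), resp.headers.get("Location", "")
    except urllib.error.HTTPError as e:
        status, location = e.code, e.headers.get("Location", "")
    ok = "OK" if status == 301 else "CHECK"
    print(f"{ok}  {url} -> {status} {location}")
```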
Further Reading:
– What Are Google Site Maps?
– The Ultimate Guide To 301 Redirects
– Crawling And Indexing Metrics
6. Backlink Analysis
Backlinks remain a major ranking factor. Generally, the more high quality links you have pointing to your site, the better you’ll do in the results. Of late, links can also harm you. However, if your overall link profile is strong, then a subset of bad links is unlikely to cause you problems. A good rule of thumb is the Matt Cutts test. Would you be happy to show the majority of your links to Matt Cutts? :) If not, you’re likely taking a high risk strategy when it comes to penalties. These can be manageable when you own the site, but they can be difficult to deal with on client sites, especially if the client was not aware of the risks involved in aggressive SEO.
- Establish a list of existing backlinks. Consider trying to remove any that look low quality.
- Ensure all links resolve to appropriate pages
- Draw up a list of sites from which your main competitors have gained links
- Draw up a list of sites where you’d like to get links from
Getting links involves either direct placement or being linkworthy. On some sites, like industry directories, you can pay to appear. In other cases, it’s making your site into an attractive linking target.
Getting links to purely commercial sites can be a challenge. Consider sponsoring charities aligned with your line of business. Get links from local chambers of commerce. Connect with education establishments who are doing relevant research and consider sponsoring them or becoming involved in some way.
Look at the sites that point to your competitors. How were these links obtained? Follow the same path. If they successfully used white papers, then copy that approach. If they successfully used news, do that, too. Do whatever seems to work for others. Evaluate the result. Do more/less of it, depending on the results.
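A quick way to find the gap is a simple set difference between the domains linking to a competitor and the domains linking to your client. The sketch below assumes you've exported each list to a plain text file, one domain per line (file names are illustrative):

```python
def load_domains(path):
    # One linking domain per line, e.g. exported from a backlink tool.
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

client_links = load_domains("client_linking_domains.txt")
competitor_links = load_domains("competitor_linking_domains.txt")

# Domains that link to the competitor but not (yet) to the client.
link_gap = sorted(competitor_links - client_links)
print(f"{len(link_gap)} potential link sources to investigate:")
for domain in link_gap[:25]:
    print(" -", domain)
```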
You also need links from sites that your competitors don’t have. Make a list of desired links. Figure out a strategy to get them. It may involve supplying them with content. It might involve participating in their discussions. It may involve giving them industry news. It might involve interviewing them or profiling them in some way, so they link to you. Ask “what do they need?” Then give it to them.
Of course, linking is an ongoing strategy. As a site grows, many links will come naturally, and that, in itself, is a link acquisition strategy: to grow in importance and consumer interest relative to the competition. This involves your content strategy. Do you have content that your industry likes to link to? If not, create it. If your site is not something that your industry links to, like a brochure site, you may look at spinning off a second site that is information focused, and less commercially focused. You sometimes see blogs on separate domains where employees talk about general industry topics, like Signal vs Noise, Basecamp’s blog. These are much more likely to receive links than sites that are purely commercial in nature.
Before chasing links, you should be aware of what type of site typically receives links, and make sure you’re it.
Further Reading:
– Interview Of Debra Mastaler, the Link Guru
– Scaleable Link Building Techniques
– Creative Link Building Ideas
7. Content Assessment
Once you have a list of keywords, an idea of where competitors rank, and what the most valuable terms are from a business point of view, you can set about examining and building out content.
Do you have content to cover your keyword terms? If not, add it to the list of content that needs to be created. If you have content that matches terms, see if it compares well with competing content on the same topic. Can the pages be expanded or made more detailed? Can more/better links be added internally? Will the content benefit from amalgamating different content types i.e. videos, audio, images et al?
You’ll need to create content for any keyword areas you’re missing. Rather than copy what is already available in the niche, look at the best ranking/most valuable content for that term and ask how it could be made better. Is there new industry analysis or reports that you can incorporate and/or expand on? People love the new. They like learning things they don’t already know. Me-too content can work, but it’s not making the most of the opportunity. Aim to produce considerably more valuable content than already exists as you’ll have more chance of getting links, and more chance of higher levels of engagement when people flip between sites. If visitors can get the same information elsewhere, they probably will.
Consider keyword co-occurrence. What terms are readily associated with the keywords you’re chasing? Various tools provide this analysis, but you can do it yourself using the Adwords research tool. See what keywords it associates with your keywords. The Google co-occurrence algorithm is likely the same for both Adwords and organic search.
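If you want to eyeball co-occurrence yourself, you can count which terms appear near your target keyword across plain-text copies of the top ranking pages. This is only an illustration of the idea, not Google's algorithm; the folder of page texts and the stopword list are assumptions:

```python
import glob
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "on",
             "is", "are", "with", "that", "this", "it", "as", "at", "by"}

def co_occurring_terms(target, text_dir="top_pages/*.txt", window=10):
    # Counts words appearing within `window` words of the target keyword
    # across plain-text copies of top-ranking pages (files are illustrative).
    counts = Counter()
    for path in glob.glob(text_dir):
        with open(path, encoding="utf-8") as f:
            words = re.findall(r"[a-z']+", f.read().lower())
        for i, word in enumerate(words):
            if word == target:
                nearby = words[max(0, i - window): i + window + 1]
                counts.update(w for w in nearby if w != target and w not in STOPWORDS)
    return counts

if __name__ == "__main__":
    for term, n in co_occurring_terms("widgets").most_common(20):
        print(f"{term}: {n}")
```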
Also, think about how people will engage with your page. Is it obvious what the page is about? Is it obvious what the user must do next? Dense text and distracting advertising can reduce engagement, so make sure the usability is up to scratch. Text should be a reasonable size so the average person isn’t squinting. It should be broken up with headings and paragraphs. People tend to scan when reading online, searching for immediate confirmation they’ve found the right information. This was written a long time ago, but it’s interesting how relevant it remains.
Further Reading:
– Content Marketing Vs SEO
– Content Analysis Using Google Analytics
– Content Based SEO Strategy Will Eventually Fail
8. Link Out
Sites that don’t link out appear unnatural. Matt Cutts noted:
Of course, folks never know when we’re going to adjust our scoring. It’s pretty easy to spot domains that are hoarding PageRank; that can be just another factor in scoring. If you work really hard to boost your authority-like score while trying to minimize your hub-like score, that sets your site apart from most domains. Just something to bear in mind.
- Make a list of all outbound links
- Determine if these links are complementary i.e. similar topic/theme, or related to the business in some way
- Make a list of pages with no links out
Links out are both a quality signal and good PR practice. Webmasters look at their inbound links, and will likely follow them back to see what is being said about them. That’s a great way to foster relationships, especially if your client’s site is relatively new. If you put other companies and people in a good light, you can expect many to reciprocate in kind.
Links, the good kind, are about human relationships.
It’s also good for your users. Your users are going to leave your site, one way or another, so you can pick up some kudos if you help them on their way by pointing them to some good authorities. If you’re wary about linking to direct competitors, then look for information resources, such as industry blogs or news sites, or anyone else you want to build a relationship with. Link to suppliers and related companies in close, but non-competing niches. Link to authoritative sites. Be very wary about pointing to low value sites, or sites that are part of link schemes. Low value sites are obvious. Sites that are part of link schemes are harder to spot, but typically feature link swapping schemes or obvious paid links unlikely to be read by visitors. Avoid link trading schemes. It’s too easy to be seen as a part of a link network, and it’s no longer 2002.
Further Resources:
– Five Reasons You Should Link Out
– The Domino Effects Of Links And Relationships
– Link Building 101: Utilizing Past Relationships
9. Ongoing
It’s not set and forget.
Clients can’t do a one-off optimisation campaign and expect it to keep working forever. It may be self-serving for SEOs to say it, but it’s also the truth. SEO is ongoing because search keeps changing and competitors and markets move. Few companies would dream of only having one marketing campaign. The challenge for the SEO, like any marketer, is to prove the ongoing spend produces a return in value.
- Competition monitoring i.e. scan for changes in competitors’ rank, new competitors, and change of tactics. Determine what is working, and emulate it.
- Sector monitoring – monitor Google trends, keywords trends, discussion groups, and news releases. This will give you ideas for new campaign angles.
- Reporting – the client needs to be able to see the work you’ve done is paying off.
- Availability – clients will change things on their site, or bring in other marketers, so will want your advice going forward
Further Reading
Whole books can be written about SEO for clients. And they have. We’ve skimmed across the surface but, thankfully, there is a wealth of great information out there on the specifics of how to tackle each of these topic areas.
Perhaps you can weigh in? :) What would your advice be to those new to optimizing client sites? What do you wish someone had told you when you started?
Google Search Censorship for Fun and Profit
Growing Up vs Breaking Things
Facebook’s early motto was “move fast and break things,” but as they wanted to become more of a platform play they changed it to “move fast with stability.” Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.
As Google has become more dominant, they’ve moved in the opposite direction. Disruption is promoted as a virtue unto itself, so long as it doesn’t adversely impact the home team’s business model.
There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:
- we were REALLY wrong yesterday
- we are REALLY wrong today
Any change or disruption is easy to justify so long as you are not the one facing the consequences:
“Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything.” … “Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can’t hear that voice.” – Googler Avery Pennarun
Monopoly Marketshare in a Flash
Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.
Here’s the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).
Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.
Why doesn’t that same process hit Chrome? They not only pay Adobe to use security updates to steal marketshare from other browsers, but they also pay Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.
Anytime anyone using a browser other than Chrome has a Flash security update, they need to opt out of the bundleware, or they end up installing Google Chrome as their default web browser, which is the primary reason Firefox marketshare is in decline.
Google engineers “research” new forms of Flash security issues to drive critical security updates.
Obviously, users love it:
Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.
In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS’s web search.
In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.
Those “default” settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.
Google’s user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.
Locking Down The Ecosystem
And Chrome is easily the most locked down browser out there.
- Chromium is turning into abandonware, with Google stripping features to try to push people over to Chrome.
- Extensions must be installed from the official store. If those extensions deliver malware, no worries. But if those extensions are not aligned with Google’s business model – they will be banned until a commercial relationship aligned with Google’s business model is established. #censorship
- If someone other than Google changes default search settings, it’s time to reset hijacked settings.
- Chrome is so locked down that Yahoo! is canceling their search toolbar for Chrome to comply with recent Google Chrome policy updates, even as Google distributes toolbars in other browsers. #censorship
Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don’t lie.
I am frustrated @JohnMu saying that it will not cost CTR. Either Google lied about the increase in CTR with photos, or they’re lying now.— Rand Fishkin (@randfish) June 25, 2014
The Right to Be Forgotten
This brings us back to the current snafu with the “right to be forgotten” in Europe.
Google notified publishers like the BBC & The Guardian that their links were being removed under the EU “right to be forgotten” law. Their goal was to stir a public relations uproar over “censorship,” but the move seems to have been a bit too transparent: Google reversed some of the removals after getting caught with their hand in the cookie jar.
The breadth of removals is an ongoing topic of coverage. But if you are Goldman Sachs instead of a government, Google finds filtering information on your behalf far more reasonable.
Some have looked at the EU policy and compared it to state-run censorship in China.
Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: “If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links.”
The World’s Richest Librarian
Google aims to promote themselves as a digital librarian: “It’s a bit like saying the book can stay in the library, it just cannot be included in the library’s card catalogue.”
That analogy is absurd on a number of levels. Which librarian…
- tracks people to target ads at them?
- blends ads into their recommendations so aggressively that most users are unable to distinguish ads from regular recommendations?
- republishes the works of others, offers ultimatums while taking third party content, and obscures or entirely strips the content source?
- invests in, funds & defunds entire lines of publishing?
- claims certain book publishers shall be banned from the library due to nothing other than their underlying business model?
Sorry About That Incidental Deletion From the Web…
David Drummond’s breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:
In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).
Yet Google sends out hundreds of thousands of warning messages in webmaster tools every single month.
Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.
Despite Google’s great power they do make mistakes. And when they do, people lose their jobs.
Consider MetaFilter.
They were penalized November 17, 2012.
At a recent SMX conference Matt Cutts stated MetaFilter was a false positive.
People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years, & they only got a potential reprieve after they laid off multiple employees and were able to generate publicity about what had happened.
As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.
MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been just as blindly and unfairly cast away to an island, and no one can hear their pleas for help.
The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.
If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.
And such stories are understated for fear of coverage creating a witch-hunt:
Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.
Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here’s a story from a week ago about a restaurant which went under after someone changed its Google listing’s hours to show it closed on busy days. That misinformation was embedded directly in the search results. That business is no more.
Then there are areas like locksmiths:
I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that’s not calls I do, it’s calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars….a fake locksmith no doubt. She didn’t understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don’t know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I’m so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I’m failing at it.
There are entire sectors of the offline economy being reshaped by Google policies.
When those sectors get coverage, the blame always goes to the individual business owner, as though they were personally responsible for Google’s behavior, or perhaps there is some coverage of the nefarious “spammers.”
Never does anybody ask if it is reasonable for Google to place their own inaccurate $0 editorial front and center. To even bring up that issue makes one an anti-capitalist nut or someone who wishes to infringe on free speech rights. This even after the sausage-making process comes to light.
And while Google arbitrarily polices others, their leaked internal documents contain juicy quotes about their ad policies like:
- “We are the only player in our industry still accepting these ads”
- “We do not make these decisions based on revenue, but as background, [redacted].”
- “As with all of our policies, we do not verify what these sites actually do, only what they claim to do.”
- “I understand that we should not let other companies, press, etc. influence our decision-making around policy”
Is This “Censorship” Problem New?
This problem of controlling access to information is nothing new; it is only more extreme today. Read the (rarely read) preface to Animal Farm, or consider this:
John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke’s writings about free speech. Locke’s boogeyman wasn’t an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn’t fit into its business model. Sound familiar?
When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.
“Policy is largely set by economic elites and organized groups representing business interests with little concern for public attitudes or public safety, as long as the public remains passive and obedient.” ― Noam Chomsky
Many people have come to the same conclusion.
Turn on, tune in, drop out
“I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what’s the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don’t have mechanisms for that.” – Larry Page
I have no problem with an “opt-in” techno-utopia test in some remote corner of the world, but if that’s the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with “opting out.”
{This | The Indicated} {Just | True} {In | Newfangled}
A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign around the algorithmic boost they gained from Panda & use their increased level of trust to pad their profit margins with algorithmic journalism.
Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:
We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words — about the earnings of companies in roughly the same time that it took our reporters.
And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
…
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
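To make those mechanics concrete, here is a minimal sketch of template-driven story generation. It is not Automated Insights’ Wordsmith or the AP’s actual pipeline; the function name, field names, and figures below are hypothetical, but it shows how a structured earnings feed plus a sentence template yields thousands of “stories” in seconds.

# Toy sketch of template-driven earnings "journalism" -- hypothetical names
# and data, not Automated Insights' actual Wordsmith system.
def earnings_story(company, ticker, quarter, eps, eps_estimate, revenue_m):
    """Render a short earnings blurb from one row of structured data."""
    verdict = "beat" if eps > eps_estimate else "missed"
    return (
        f"{company} ({ticker}) reported {quarter} earnings of ${eps:.2f} per share, "
        f"which {verdict} analyst estimates of ${eps_estimate:.2f}. "
        f"Revenue for the period came in at ${revenue_m:,.0f} million."
    )

# Feed in a few thousand rows per quarter and you get a few thousand
# "articles" in the time it takes to run a loop.
print(earnings_story("Example Corp", "EXMP", "Q2", 1.42, 1.37, 512))

Point one such loop at 4,400 tickers per quarter and the AP’s numbers above stop sounding remarkable.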
In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:
you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.
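The doorway pattern Cutts describes is the same substitution trick, only pointed at city names instead of ticker symbols. A hypothetical sketch (the cities and copy are invented for illustration):

# Hypothetical doorway-page generator: the "content" is identical on every
# page, only the city name changes -- which is why Google calls it a doorway.
cities = ["Denver", "Boulder", "Aspen", "Colorado Springs"]

template = (
    "Looking for DirecTV installation in {city}? Our certified technicians "
    "serve {city} and the surrounding area with same-day appointments."
)

for city in cities:
    slug = city.lower().replace(" ", "-")
    print(f"/directv-install-{slug}:\n  {template.format(city=city)}\n")

Swap the city for a company name and the structural difference between this and an automated earnings story is hard to articulate.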
One suspects these views do not apply to large politically connected media bodies like the AP, which are important enough to have a direct long-term deal with Google.
In the above announcement the AP also noted they include automated NFL player rankings. One interesting thing to note about the AP is that they have syndication deals with 1,400 daily newspapers nationwide, as well as thousands of TV and radio stations.
A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions … you get the idea.
To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:
“We are the largest producer of content in the world. That’s more than all media companies combined,” [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.
The Automated Insights homepage lists both Yahoo! & Microsoft as clients.
The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.
Last year Google dictated that press releases shall use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google’s good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire’s solution to their penalty was a greater emphasis on manual editorial review:
Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:
- Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
- Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
- Assessing release length, guarding against the issue of very short, unsubstantial messages that are mere vehicles for links;
- Overuse of keywords and/or links within the message.
So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same “spammy” press releases using software to auto-generate articles based on them.
That makes sense & sounds totally reasonable, so long as you don’t actually think about it (or work at Google)…