Deeper Integration of Search Console in Google Analytics

(Cross-posted from the Google Analytics Blog.)
Google Analytics helps brands optimize their websites and marketing efforts for all sources of traffic, and Search Console is where website owners manage how they appear in Google organic search results. Today, we are introducing the ability to display Search Console metrics alongside Google Analytics metrics, in the same reports, side by side – giving you a full view of how your site shows up and performs in organic search results.

For years, users of both Search Console and Google Analytics have been able to link the two properties (instructions) and see Search Console statistics in Google Analytics, in isolation. But to gain a fuller picture of your website’s performance in organic search, it’s beneficial to see how visitors reached your site and what they did once they got there.

With this update, you’ll be able to see your Search Console metrics and your Google Analytics metrics in the same reports, in parallel. By combining data from both sources at the landing page level, we’re able to show you a full range of Acquisition, Behavior and Conversion metrics for your organic search traffic. This feature is rolling out over the coming weeks, so not everyone will see it immediately.
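
Conceptually, the combined reports behave like a join keyed on the landing-page URL. Below is a minimal, illustrative TypeScript sketch of that idea. It is not Google’s implementation, and the row shapes and field names are hypothetical:

  interface SearchConsoleRow {
    landingPage: string;
    impressions: number;
    clicks: number;
    avgPosition: number;
  }

  interface AnalyticsRow {
    landingPage: string;
    sessions: number;
    bounceRate: number;
  }

  // Index the Analytics rows by landing page, then attach each one to the
  // matching Search Console row; pages without a match keep only the
  // metrics that exist for them.
  function joinByLandingPage(sc: SearchConsoleRow[], ga: AnalyticsRow[]) {
    const gaByPage = new Map(ga.map((row) => [row.landingPage, row] as const));
    return sc.map((row) => ({ ...row, ...(gaByPage.get(row.landingPage) ?? {}) }));
  }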


New Search Console reports combine Search Console and Google Analytics metrics

New Insights

The new reports allow you to examine your organic search data end-to-end and discover unique and actionable insights. Your Acquisition metrics from Search Console, such as impressions and average position, are now available in relation to your Behavior and Conversion metrics from Google Analytics, like bounce rate and pages per session.

Below are some new capabilities resulting from this improved integration:
 • Find landing pages that are attracting many users through Google organic search (e.g., high impressions and high click-through rate) but where users are not engaging with the website. In this case, you should consider improving your landing pages.

 • Find landing pages that have high site engagement but are not successfully attracting users from Google organic search (e.g., have a low click-through rate). In this case, you might benefit from improving the titles and descriptions shown in search.

 • Learn which queries are ranking well for each organic landing page.

 • Segment organic performance by device category (desktop, tablet, mobile) in the new Devices report.
 


New Landing Page report showing Search Console and Google Analytics metrics

 

Additional Information 

Each of these new reports will display how your organic search traffic performs. Because data is joined at the landing page level, the Landing Pages, Countries and Devices reports will show both Search Console and Google Analytics data, while the Queries report will only show Search Console data for individual queries. You’ll see the same search queries in Google Analytics that you see in Search Console today.

As mentioned in our Search Console Help Center, some data may not be displayed, to protect user privacy. For example, Search Console may not track some infrequent queries, and will not display those that include personal or sensitive information.

Also, while the data is displayed in parallel, not all Google Analytics features are available for Search Console data – including segmentation. Any segment applied to the new combined reports will only apply to the Google Analytics data. You may also notice that clicks in Search Console differ from total sessions in Google Analytics.

To experience the new combined reports from Search Console and Google Analytics, make sure your properties are linked, and then navigate to the new “Search Console” section, which should appear under “Acquisition” in the left-hand navigation in Google Analytics.

Posted by Joan Arensman, Product Manager, and Daniel Waisberg, Analytics Advocate

How we fought webspam in 2015

Search is a powerful tool. It helps people to find, share, and access an amazing wealth of content regardless of how they connect or where they are located. As part of Google’s search quality team, we work hard to ensure that searchers see high quality search results—and not webspam. We fight spam through a combination of algorithms and manual reviews to ensure that sites don’t rise in search results through deceptive or manipulative behavior, especially because those sites could harm or mislead users.

Below are some of the webspam insights we gathered in 2015, including trends we’ve seen, what we’re doing to fight spam and protect against those trends, and how we’re working with you to make the web better.

2015 webspam trends

  • We saw a huge number of websites being hacked – a 180% increase compared to the previous year. Stay safe on the web and take preventative measures to protect your content.
  • We saw an increase in the number of sites with thin, low quality content. Such content contains little or no added value and is often scraped from other sites.

2015 spam-fighting efforts

  • As always, our algorithms addressed the vast majority of webspam, improving search quality for users. One of our algorithmic updates helped reduce the amount of hacked spam in search results.
  • The remaining spam was tackled manually. We sent more than 4.3 million messages to webmasters to notify them of manual actions we took on their sites and to help them identify the issues.
  • We saw a 33% increase in the number of sites that cleaned up their spam issues and completed a successful reconsideration process.

Working with users and webmasters for a better web

  • More than 400,000 spam reports were submitted by users around the world. After prioritizing the reports, we acted on 65% of them, and considered 80% of those acted upon to be spam. Thanks to all who submitted reports and contributed towards a cleaner web ecosystem!
  • We conducted more than 200 online office hours and live events around the world in 17 languages. These are great opportunities for us to help webmasters with their sites and for them to share helpful feedback with us as well.
  • The webmaster help forum continued to be an excellent source of webmaster support. Webmasters had tens of thousands of questions answered, including over 35,000 by users designated as Webmaster Top Contributors. Also, 56 Webmaster Top Contributors joined us at our Top Contributor Summit to discuss how to provide users and webmasters with better support and tools. We’re grateful for our awesome Top Contributors and their tremendous contributions!

We’re continuously improving our spam-fighting technology and working closely with webmasters and users to foster and support a high-quality web ecosystem. (In fact, fighting webspam is one of the many ways we maintain search quality at Google.) Thanks for helping to keep spammers away so users can continue accessing great content in Google Search.

Posted by Kiyotaka Tanaka and Mary Chen, User Education and Search Outreach

Helping webmasters re-secure their sites

(Cross-posted from the Google Security Blog.)
Every week, over 10 million users encounter harmful websites that deliver malware and scams. Many of these sites are compromised personal blogs or small business pages that have fallen victim due to a weak password or outdated software. Safe Browsing and Google Search protect visitors from dangerous content by displaying browser warnings and labeling search results with ‘this site may harm your computer’. While this helps keep users safe in the moment, the compromised site remains a problem that needs to be fixed.

Unfortunately, many webmasters of compromised sites are unaware anything is amiss. Worse yet, even when they learn of an incident, they may lack the security expertise to take action and address the root cause of the compromise. One webmaster from a survey we conducted reported that “our daily and weekly backups were both infected”; even after seeking the help of a specialist and “lots of wasted hours/days”, he abandoned all attempts to restore the site and instead refocused his efforts on “rebuilding the site from scratch”.

To find the best way to help webmasters clean up after a compromise, we recently teamed up with the University of California, Berkeley to explore how to quickly contact webmasters and expedite recovery while minimizing the distress involved. We’ve summarized our key lessons below. The full study, which you can read here, was recently presented at the International World Wide Web Conference.

When Google works directly with webmasters during critical moments like security breaches, we can help 75% of webmasters re-secure their content. The whole process takes a median of 3 days. This is a better experience for webmasters and their audience.

How many sites get compromised?

Number of freshly compromised sites Google detects every week.

Over the last year Google detected nearly 800,000 compromised websites—roughly 16,500 new sites every week from around the globe. Visitors to these sites are exposed to low-quality scam content and malware via drive-by downloads. While browser and search warnings help protect visitors from harm, these warnings can at times feel punitive to webmasters who learn only after-the-fact that their site was compromised. To balance the safety of our users with the experience of webmasters, we set out to find the best approach to help webmasters recover from security breaches and ultimately reconnect websites with their audience.

Finding the most effective ways to aid webmasters

  1. Getting in touch with webmasters: One of the hardest steps on the road to recovery is first getting in contact with webmasters. We tried three notification channels: email, browser warnings, and search warnings. For webmasters who proactively registered their site with Search Console, we found that email communication led to 75% of webmasters re-securing their pages. When we didn’t know a webmaster’s email address, browser warnings and search warnings helped 54% and 43% of sites clean up respectively.
  2. Providing tips on cleaning up harmful content: Attackers rely on hidden files, easy-to-miss redirects, and remote inclusions to serve scams and malware. This makes clean-up increasingly tricky. When we emailed webmasters, we included tips and samples of exactly which pages contained harmful content. This, combined with expedited notification, helped webmasters clean up 62% faster compared to no tips—usually within 3 days.
  3. Making sure sites stay clean: Once a site is no longer serving harmful content, it’s important to make sure attackers don’t reassert control. We monitored recently cleaned websites and found 12% were compromised again in 30 days. This illustrates the challenge involved in identifying the root cause of a breach versus dealing with the side-effects.
Making security issues less painful for webmasters—and everyone

We hope that webmasters never have to deal with a security incident. If you are a webmaster, there are some quick steps you can take to reduce your risk. We’ve made it easier to receive security notifications through Google Analytics as well as through Search Console. Make sure to register for both services. Also, we have laid out helpful tips for updating your site’s software and adding additional authentication that will make your site safer.

If you’re a hosting provider or building a service that needs to notify victims of compromise, understand that the entire process is distressing for users. Establish a reliable communication channel before a security incident occurs, make sure to provide victims with clear recovery steps, and promptly reply to inquiries so the process feels helpful, not punitive.

As we work to make the web a safer place, we think it’s critical to empower webmasters and users to make good security decisions. It’s easy for the security community to be pessimistic about incident response being ‘too complex’ for victims, but as our findings demonstrate, even just starting a dialogue can significantly expedite recovery.

Posted by Kurt Thomas and Yuan Niu, Spam & Abuse Research

No More Deceptive Download Buttons

(Cross-posted from the Google Security Blog.)
In November, we announced that Safe Browsing would protect you from social engineering attacks – deceptive tactics that try to trick you into doing something dangerous, like installing unwanted software or revealing your personal information (for example, passwords, phone numbers, or credit cards). You may have encountered social engineering in a deceptive download button, or an image ad that falsely claims your system is out of date. Today, we’re expanding Safe Browsing protection to protect you from such deceptive embedded content, like social engineering ads.

Consistent with the social engineering policy we announced in November, embedded content (like ads) on a web page will be considered social engineering when it either:

  • Pretends to act, or look and feel, like a trusted entity — like your own device or browser, or the website itself.
  • Tries to trick you into doing something you’d only do for a trusted entity — like sharing a password or calling tech support.
Below are some examples of deceptive content, shown via ads:

This image claims that your software is out-of-date to trick you into clicking “update”.

This image mimics a dialogue from the FLV software developer — but it does not actually originate from this developer.

These buttons seem like they will produce content that relates to the site (like a TV show or sports video stream) by mimicking the site’s look and feel. They are often indistinguishable from the rest of the page.
Our fight against unwanted software and social engineering is still just beginning. We’ll continue to improve Google’s Safe Browsing protection to help more people stay safe online.

Will my site be affected?

If visitors to your web site consistently see social engineering content, Google Safe Browsing may warn users when they visit the site. If your site is flagged for containing social engineering content, you should troubleshoot with Search Console. Check out our social engineering help for webmasters.

Posted by Lucas Ballard, Safe Browsing Team

Continuing to make the web more mobile friendly

Getting good, relevant answers when you search shouldn’t depend on what device you’re using. You should get the best answer possible, whether you’re on a phone, desktop or tablet. Last year, we started using mobile-friendliness as a ranking signal on mobile searches. Today we’re announcing that beginning in May, we’ll start rolling out an update to mobile search results that increases the effect of the ranking signal to help our users find even more pages that are relevant and mobile-friendly.

If you’ve already made your site mobile-friendly, you will not be impacted by this update. If you need support with your mobile-friendly site, we recommend checking out the Mobile-Friendly Test and the Webmaster Mobile Guide, both of which provide guidance on how to improve your mobile site. And remember, the intent of the search query is still a very strong signal — so even if a page with high quality content is not mobile-friendly, it could still rank well if it has great, relevant content.

If you have any questions, please go to the Webmaster help forum.

Posted by Klemen Kloboves, Software Engineer

Updating the smartphone user-agent of Googlebot

As technology on the web changes, we periodically update the user-agents we use for Googlebot. Next month, we will be updating the smartphone user-agent of Googlebot:

Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
(Googlebot smartphone user-agent starting from April 18, 2016)

Today, we use the following smartphone user-agent for Googlebot:

Mozilla/5.0 (iPhone; CPU iPhone OS 8_3 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12F70 Safari/600.1.4 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
(Current Googlebot smartphone user-agent)

We’re updating the user-agent string so that our renderer can better understand pages that use newer web technologies. Our renderer evolves over time, and the new user-agent string reflects that it is becoming more similar to Chrome than to Safari. To make sure your site can be viewed properly by a wide range of users and browsers, we recommend using feature detection and progressive enhancement.
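
For example, rather than branching on the user-agent string, a page can test directly for the capability it needs. A minimal TypeScript sketch (the two helper functions are hypothetical, app-specific stand-ins):

  // Feature detection: test for the capability itself instead of parsing
  // the user-agent string.
  if ("geolocation" in navigator) {
    navigator.geolocation.getCurrentPosition((position) => {
      showNearbyResults(position.coords.latitude, position.coords.longitude);
    });
  } else {
    // Progressive enhancement: the page still works without the feature.
    showManualLocationPicker();
  }

  // Hypothetical helpers, assumed to be defined elsewhere in the app.
  declare function showNearbyResults(lat: number, lng: number): void;
  declare function showManualLocationPicker(): void;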

Our evaluation suggests that this user-agent change should have no effect on 99% of sites. The most common reason a site might be affected is if it specifically looks for a particular Googlebot user-agent string. User-agent sniffing for Googlebot is not recommended and is considered to be a form of cloaking. Googlebot should be treated like any other browser.

If you believe your site may be affected by this update, we recommend checking your site with the Fetch and Render Tool in Search Console (which has been updated with the new user-agent string) or by changing the user-agent string in Developer Tools in your browser (for example, via Chrome Device Mode). If you have any questions, we’re always happy to answer them in our Webmaster help forums.

Posted by Katsuaki Ikegami, Software Engineer

Best practices for bloggers reviewing free products they receive from companies

As a form of online marketing, some companies today will send bloggers free products to review or give away in return for a mention in a blogpost. Whether you’re the company supplying the product or the blogger writing the post, below are a few best practices to ensure that this content is both useful to users and compliant with Google Webmaster Guidelines.

  1. Use the nofollow tag where appropriate

    Links that pass PageRank in exchange for goods or services are against Google guidelines on link schemes. Companies sometimes urge bloggers to link back to:

    1. the company’s site
    2. the company’s social media accounts
    3. an online merchant’s page that sells the product
    4. a review service’s page featuring reviews of the product
    5. the company’s mobile app on an app store

    Bloggers should use the nofollow tag on all such links because these links didn’t come about organically (i.e., the links wouldn’t exist if the company hadn’t offered to provide a free good or service in exchange for a link). Companies, or the marketing firms they’re working with, can do their part by reminding bloggers to use nofollow on these links. A minimal sketch of marking such links appears after this list.

  2. Disclose the relationship

    Users want to know when they’re viewing sponsored content. Also, there are laws in some countries that make disclosure of sponsorship mandatory. A disclosure can appear anywhere in the post; however, the most useful placement is at the top in case users don’t read the entire post.

  3. Create compelling, unique content

    The most successful blogs offer their visitors a compelling reason to come back. If you’re a blogger you might try to become the go-to source of information in your topic area, cover a useful niche that few others are looking at, or provide exclusive content that only you can create due to your unique expertise or resources.
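
As a concrete illustration of point 1, here is a minimal sketch of one way to mark sponsored links. The data-sponsored attribute and the client-side pass are hypothetical conventions; adding rel="nofollow" by hand in the post's HTML works just as well:

  // Hypothetical sketch: the blogger tags sponsored links in the post markup,
  // e.g. <a href="https://example.com/product" data-sponsored>, and this pass
  // ensures each one is served with rel="nofollow".
  for (const link of document.querySelectorAll<HTMLAnchorElement>("a[data-sponsored]")) {
    link.rel = "nofollow";
  }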

For more information, please drop by our Google Webmaster Central Help Forum.

Posted by the Google Webspam Team

An update on the Webmaster Central Blog

We’ve got a new URL!

You may have noticed the Google Webmaster Central blog has a new address: webmasters.googleblog.com.

That’s because starting today, Google is moving its blogs to a new domain to help people recognize when they’re reading an official blog from Google. These changes will roll out to all of Google’s blogs over time.

The previous address will redirect to the new domain, so your bookmarks and links will continue to work. Unfortunately, as with a custom domain change in Blogger, the Google+ comments on the blogs have been reset.

Thanks as always for reading—we’ll see you here again soon at webmasters.googleblog.com!

Posted by John Mueller, Webmaster Trends Analyst, Zürich

AMP NewsLab Office Hours in your language

Accelerated Mobile Pages (AMP) is a global, industry-wide initiative, with publishers large and small all focused on the same goal: a better, faster mobile web.

We’ve had a great response to our English language AMP office hours, but we know that English isn’t everyone’s native language.

For the next two weeks, we’re rolling out a new series of office hours in French, Italian, German, Spanish, Brazilian Portuguese, Russian, Japanese, and Indonesian, and we invite everyone to learn about AMP in their native language. Product Managers, Technical Managers, and Engineers at Google will speak in their native tongues and answer any questions you may have about AMP.

First, we will reintroduce you to AMP and how it works, before diving into the technical specs and various components of AMP. You can add your questions via the Q&A app on the event pages below, and we will answer them during the office hours. You can also watch the sessions on the News Lab YouTube page after each event.

Check out the lineup below and join the discussion.

  • French
    • Introduction to AMP – Mar. 7 @ 1700 CET with Cecile Pruvost, Industry Manager
    • AMP Anatomy – Mar. 14 @ 1700 CET with Emeric Studer, Technology Manager
  • Italian
    • Introduction to AMP – Mar. 8 @ 1500 CET with Luca Forlin, Head of International Play Newsstand Partnerships
    • AMP Anatomy – Mar. 15 @ 1500 CET with Flavio Palandri Antonelli, AMP Software Engineer
  • German
    • Introduction to AMP – Mar. 9 @ 1700 CET with Nadine Gerspacher, Partner Development Manager
    • AMP Anatomy – Mar. 18 @ 1600 CET with Paul Bakaus, Developer Advocate
  • Spanish
    • Introduction to AMP – Mar. 9 @ 1430 CET with Demian Renzulli, Technical Solutions Consultant
    • AMP Anatomy – Mar. 16 @ 1430 CET with Julian Toledo, Developer Advocate
  • Brazilian Portuguese
    • Introduction to AMP – Mar. 10 @ 1430 BRT with Carol Soler, Strategic Partner Manager
    • AMP Anatomy – Mar. 17 @ 1430 BRT with Breno Araújo, Technology Manager
  • Russian
  • Japanese
  • Indonesian

Posted by Tomo Taylor, AMP Community Manager

New year, new look: Introducing our new Webmasters website

It’s a new year and a perfect time to share with you our brand new Webmasters website.

We spent a lot of time making this site right for you. We took our own advice by analyzing visitor behavior and conducting user studies to organize the site into categories you’ll find most useful. Thanks to our awesome community and Top Contributors for the valuable feedback during the process!

Our new Google Webmasters website

The site contains support resources to help you fix issues with your website, SEO learning materials to create a high-quality site and improve search rankings, and connection opportunities to stay up-to-date with our team and webmaster community. It also contains new features such as:

  • Webmaster troubleshooter: Need a step-by-step guide to move your site or understand a message in Search Console? The troubleshooter can help with these and other common problems with your site in Google Search and Google Search Console.
  • Popular resources: Looking for popular Google Webmasters YouTube videos, blog posts and forum threads? Here’s a curated list of our top resources – these may differ across languages.
  • Events calendar: Want to meet someone from our team online for office hours or at a live event near you? We have office hours and events in multiple languages around the world. 

Browse around and let us know in the comments below if you stumble onto something new!

Posted by Mary Chen, Senior Webmaster Relations Specialist

Indexing HTTPS pages by default

At Google, user security has always been a top priority. Over the years, we’ve worked hard to promote a more secure web and to provide a better browsing experience for users. Gmail, Google search, and YouTube have had secure connections for some time, and we also started giving a slight ranking boost to HTTPS URLs in search results last year. Browsing the web should be a private experience between the user and the website, and must not be subject to eavesdropping, man-in-the-middle attacks, or data modification. This is why we’ve been strongly promoting HTTPS everywhere.

As a natural continuation of this, today we’d like to announce that we’re adjusting our indexing system to look for more HTTPS pages. Specifically, we’ll start crawling HTTPS equivalents of HTTP pages, even when the former are not linked to from any page. When two URLs from the same domain appear to have the same content but are served over different protocol schemes, we’ll typically choose to index the HTTPS URL if:

  • It doesn’t contain insecure dependencies.
  • It isn’t blocked from crawling by robots.txt.
  • It doesn’t redirect users to or through an insecure HTTP page.
  • It doesn’t have a rel="canonical" link to the HTTP page.
  • It doesn’t contain a noindex robots meta tag.
  • It doesn’t have on-host outlinks to HTTP URLs.
  • The sitemap lists the HTTPS URL, or doesn’t list the HTTP version of the URL.
  • The server has a valid TLS certificate.

Although our systems prefer the HTTPS version by default, you can also make this clearer for other search engines by redirecting your HTTP site to your HTTPS version and by implementing the HSTS header on your server.
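
As an illustration, here is a minimal Node.js sketch of that setup. It assumes the HTTPS site itself is served separately (for example, behind a TLS-terminating server), and the details will vary by stack:

  import http from "node:http";

  // Redirect every plain-HTTP request to the HTTPS version of the same URL.
  http
    .createServer((req, res) => {
      const host = req.headers.host ?? "example.com"; // fallback host is illustrative
      res.writeHead(301, { Location: `https://${host}${req.url ?? "/"}` });
      res.end();
    })
    .listen(80);

  // On the HTTPS server itself, send an HSTS header with every response, e.g.:
  //   res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");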

We’re excited about taking another step forward in making the web more secure. By showing users HTTPS pages in our search results, we hope to decrease the risk of users browsing a website over an insecure connection and making themselves vulnerable to content injection attacks. As usual, if you have any questions or comments, please let us know in the comments section below or in our webmaster help forums.

Posted by Zineb Ait Bahajji, WTA, and the Google Security and Indexing teams

Updating Our Search Quality Rating Guidelines

Developing algorithmic changes to search involves a process of experimentation. Part of that experimentation is having evaluators—people who assess the quality of Google’s search results—give us feedback on our experiments. Ratings from evaluators do not determine individual site rankings, but they are used to help us understand our experiments. The evaluators base their ratings on guidelines we give them; the guidelines reflect what Google thinks search users want.

In 2013, we published our human rating guidelines to provide transparency on how Google works and to help webmasters understand what Google looks for in web pages. Since that time, a lot has changed: notably, more people have smartphones than ever before and more searches are done on mobile devices today than on computers.

We often make changes to the guidelines as our understanding of what users want evolves, but we haven’t shared an update publicly since then. However, we recently completed a major revision of our rater guidelines to adapt to this mobile world, recognizing that people use search differently when they carry internet-connected devices with them all the time. You can find that update here (PDF).

This is not the final version of our rater guidelines. The guidelines will continue to evolve as search, and how people use it, changes. We won’t be updating the public document with every change, but we will try to publish big changes to the guidelines periodically.

We expect our phones and other devices to do a lot, and we want Google to continue giving users the answers they’re looking for—fast!

Posted by Mimi Underwood, Sr. Program Manager, Search Growth & Analysis

TC Summit 2015: Celebrating our Webmaster Top Contributors!

Two weeks ago, we were extremely lucky to host the 2015 edition of the Top Contributor Summit (#TCsummit), in San Francisco and on Google’s campus in Mountain View, California.

Google Top Contributors are an exceptional group of passionate Google product enthusiasts who share their expertise across our international help forums to support millions of Google users every year. Google’s Top Contributor Summit is an event organised every two years to celebrate these amazing users. This year we had the pleasure of welcoming 526 Top Contributors from all around the world.

Under the motto “Learn, Connect, Celebrate”, Top Contributors had the chance to learn more about our products, get insights on the future of Google, connect with Googlers and Top Contributors from various products and, finally, to celebrate their positive impact on our products and users.

Footage of the 2015 Top Contributor Summit

We also had the chance to hold Webmaster-specific sessions, which gave Googlers the unique opportunity to meet 56 of our Webmaster Top Contributors, representing 20 countries and speaking 14 different languages.


Group photo of the Webmaster Top Contributor community and the Google Webmaster Relations team

Throughout the day, we had in-depth sessions about Google Webmaster guidelines, Search Console and Google Search. We discussed the most common issues that users are bringing up in our international webmaster forums, and listened to the Top Contributors’ feedback regarding our Search tools. We also talked about the Top Contributor program itself and additional opportunities for our users to benefit from both Google and the TCs’ support. Product managers, engineers and search quality Googlers attended the sessions to listen and bring the feedback given by Top Contributors and users on the forum back to their teams.


Webmaster Top Contributors during the in-depth sessions about Google Webmaster guidelines, Search Console and Google Search

At Google, we are grateful to have the incredible opportunity to meet and connect with some of the most insightful members of the webmaster community and get their feedback on such important topics. It helps us be sure that Google keeps focusing on what really matters to webmasters, content creators, and users.

To learn more about our Top Contributor Program, or to give us your own feedback, visit our Top Contributor homepage or join our Webmaster help forum.

Diogo Botelho and Roberta Remigi, Webmaster Relations team

Detect and get rid of unwanted sneaky mobile redirects

In many cases, it is OK to show slightly different content on different devices. For example, optimizing the smaller space of a smartphone screen can mean that some content, like images, will have to be modified. Or you might want to store your website’s menu in a navigation drawer (find documentation here) to make mobile browsing easier and more effective. When implemented properly, these user-centric modifications can be understood very well by Google.

The situation is similar when it comes to mobile-only redirects. Redirecting mobile users to improve their mobile experience (like redirecting mobile users from example.com/url1 to m.example.com/url1) is often beneficial to them. But sneakily redirecting mobile users to different content is bad for user experience and is against Google’s webmaster guidelines.


A frustrating experience: The same URL shows up in search results pages on desktop and on mobile. When a user clicks on this result on their desktop computer, the URL opens normally. However, when clicking on the same result on a smartphone, a redirect happens and an unrelated URL loads.

Who implements these mobile-only sneaky redirects?

There are cases where webmasters knowingly decide to put into place redirection rules for their mobile users. This is typically a webmaster guidelines violation, and we do take manual action against it when it harms Google users’ experience (see last section of this article).   

But we’ve also observed situations where mobile-only sneaky redirects happen without site owners being aware of it:

  • Advertising schemes that redirect mobile users specifically
    A script/element installed to display ads and monetize content might be redirecting mobile users to a completely different site without the webmaster being aware of it.
  • Mobile redirect as a result of the site being a target of hacking
    In other cases, if your website has been hacked, a potential result can be redirects to spammy domains for mobile users only.

How do I detect if my site is doing sneaky mobile redirects?

  1. Check if you are redirected when you navigate to your site on your smartphone
    We recommend checking the mobile user experience of your site by visiting your pages from Google search results with a smartphone (a scripted check is sketched after this list). When debugging, mobile emulation in desktop browsers is handy, mostly because you can test for many different devices. You can, for example, do it straight from your browser in Chrome, Firefox or Safari (for the latter, make sure you have enabled the “Show Develop menu in menu bar” feature).
  2. Listen to your users
    Your users could see your site in a different way than you do. It’s always important to pay attention to user complaints, so you can hear about any issue related to mobile UX.
  3. Monitor your users in your site’s analytics data
    Unusual mobile user activity can be detected by looking at your website’s analytics data. For example, the average time spent on your site by mobile users is a good signal to watch: if, all of a sudden, your mobile users (and only they) start spending much less time on your site than they used to, there might be an issue related to mobile redirections.

    To be aware of wide changes in mobile user activity as soon as they happen, you can set up Google Analytics alerts. For example, set an alert to be warned of a sharp drop in the average time spent on your site by mobile users, or a drop in the number of mobile users (always take into account that big changes in these metrics are not a clear, direct signal that your site is doing sneaky mobile redirects).
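
To script the check from step 1, one approach is to fetch the same URL with a desktop and a mobile user-agent and compare where each request ends up. A minimal TypeScript sketch follows; note that it only catches server-side redirects, not JavaScript-based ones, and the user-agent strings are just examples:

  // Example user-agent strings; substitute current ones when testing.
  const DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)";
  const MOBILE_UA =
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) Mobile";

  // Follow redirects and report the final URL the request lands on.
  async function finalUrl(url: string, userAgent: string): Promise<string> {
    const res = await fetch(url, {
      headers: { "User-Agent": userAgent },
      redirect: "follow",
    });
    return res.url;
  }

  async function checkForMobileOnlyRedirect(url: string): Promise<void> {
    const [desktop, mobile] = await Promise.all([
      finalUrl(url, DESKTOP_UA),
      finalUrl(url, MOBILE_UA),
    ]);
    if (desktop !== mobile) {
      console.warn(`Possible mobile-only redirect: ${desktop} vs ${mobile}`);
    }
  }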

I’ve detected sneaky redirects for my mobile users, and I did not set it up: what do I do?

  1. Make sure that your site is not hacked.
    Check the Security Issues section in Search Console; if we have noticed any hack, you should find information there.
    Review our additional resources on typical symptoms of hacked sites, and our case studies on hacked sites.
  2. Audit third-party scripts/elements on your site
    If your site is not hacked, we recommend taking the time to investigate whether third-party scripts/elements are causing the redirects. You can follow these steps:
    A. Remove, one by one, the third-party scripts/elements you do not control from the redirecting page(s).
    B. Check your site on a mobile device or through emulation between each script/element removal, and see when the redirect stops.
    C. If you think a particular script/element is responsible for the sneaky redirect, consider removing it from your site, and debugging the issue with the script/element provider.

Last Thoughts on Sneaky Mobile Redirects

It’s a violation of the Google Webmaster Guidelines to redirect a user to a page with the intent of displaying content other than what was made available to the search engine crawler (more information on sneaky redirects). To ensure quality search results for our users, the Google Search Quality team can take action on such sites, including removal of URLs from our index.  When we take manual action, we send a message to the site owner via Search Console. Therefore, make sure you’ve set up a Search Console account.

Be sure to choose advertisers who are transparent about how they handle user traffic, to avoid unknowingly redirecting your own users. If you are interested in trust-building in the online advertising space, you may want to check out industry-wide best practices when participating in ad networks. For example, the Trustworthy Accountability Group’s (Interactive Advertising Bureau) Inventory Quality Guidelines are a good place to start. There are many ways to monetize your content with mobile solutions that provide a high-quality user experience; be sure to use them.

If you have questions or comments about mobile-only redirects, join us in our Google Webmaster Support forum.

Written by Vincent Courson & Badr Salmi El Idrissi, Search Quality team

Deprecating our AJAX crawling scheme

tl;dr: We are no longer recommending the AJAX crawling proposal we made back in 2009.

In 2009, we made a proposal to make AJAX pages crawlable. Back then, our systems were not able to render and understand pages that use JavaScript to present content to users. Because “crawlers … [were] not able to see any content … created dynamically,” we proposed a set of practices that webmasters can follow in order to ensure that their AJAX-based applications are indexed by search engines.

Times have changed. Today, as long as you’re not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers. To reflect this improvement, we recently updated our technical Webmaster Guidelines to recommend against disallowing Googlebot from crawling your site’s CSS or JS files.

Since the assumptions for our 2009 proposal are no longer valid, we recommend following the principles of progressive enhancement. For example, you can use the History API pushState() to ensure accessibility for a wider range of browsers (and our systems).
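
For instance, a progressively enhanced app can keep ordinary, crawlable URLs and use pushState() to avoid full page reloads when JavaScript is available. A minimal sketch, where renderPage is a hypothetical app-specific function:

  document.addEventListener("click", (event) => {
    const link = (event.target as Element).closest("a");
    // Only intercept same-origin links; external links navigate normally.
    if (!link || link.origin !== location.origin) return;
    event.preventDefault();
    history.pushState({}, "", link.href); // update the URL bar without a reload
    renderPage(link.href);
  });

  // Re-render when the user navigates with the back/forward buttons.
  window.addEventListener("popstate", () => renderPage(location.href));

  // Hypothetical client-side renderer, assumed to be defined elsewhere.
  declare function renderPage(url: string): void;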

Questions and answers

Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you’ve deprecated your recommendation?
A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you’re making the next update for your site. Instead of the _escaped_fragment_ URLs, we’ll generally crawl, render, and index the #! URLs.

Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?
A: If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ URLs.

Q: I use a JavaScript framework and my webserver serves a pre-rendered page. Is that still ok?
A: In general, websites shouldn’t pre-render pages only for Google — we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user’s experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.

If you have any questions, feel free to post them here, or in the webmaster help forum.

Posted by , Search Quality Analyst

An update on how we tackle hacked spam

Recently we have started rolling out a series of algorithmic changes that aim to tackle hacked spam in our search results. A huge number of legitimate sites are hacked by spammers and used to engage in abusive behavior, such as malware downloads, promotion of traffic to low-quality sites, porn, and marketing of counterfeit goods or illegal pharmaceutical drugs.

Website owners that don’t implement standard security best practices can leave their websites vulnerable to being easily hacked. Affected sites can include government sites, universities, small businesses, company websites, restaurants, hobby organizations, conferences, and more. Spammers and cyber-criminals purposely seek out those sites and inject pages with malicious content in an attempt to gain rank and traffic in search engines.

We are aggressively targeting hacked spam in order to protect users and webmasters.

The algorithmic changes will eventually impact roughly 5% of queries, depending on the language. As we roll out the new algorithms, users might notice that for certain queries, only the most relevant results are shown, reducing the number of results shown.

This is due to the large amount of hacked spam being removed, and should improve in the near future. We are continuing to tune our systems to weed out the bad content while retaining the organic, legitimate results. If you have any questions about these changes, or want to give us feedback on these algorithms, feel free to drop by our Webmaster Help Forums.

Posted by Ning Song, Software Engineer

First Click Free update

Around ten years ago when we introduced a policy called “First Click Free,” it was hard to imagine that the always-on, multi-screen, multiple device world we now live in would change content consumption so much and so fast. The spirit of the First Click Free effort was – and still is – to help users get access to high quality news with a minimum of effort, while also ensuring that publishers with a paid subscription model get discovered in Google Search and via Google News.

In 2009, we updated the FCF policy to allow a limit of five articles per day, in order to protect publishers who felt some users were abusing the spirit of this policy. Recently we have heard from publishers about the need to revisit these policies to reflect the mobile, multiple-device world. Today we are announcing a change to FCF, lowering the limit to three articles a day. This change applies to both Google Search and Google News.

Google wants to play its part in connecting users to quality news and in connecting publishers to users. We believe the FCF is important in helping achieve that goal, and we will periodically review and update these policies as needed so they continue to benefit users and publishers alike. We are listening and always welcome feedback.

Questions and answers about First Click Free

Q: Do the rest of the old guidelines still apply?
A: Yes, please check the guidelines for Google News as well as the guidelines for Web Search and the associated blog post for more information.

Q: Can I apply First Click Free to only a section of my site / only for Google News (or only for Web Search)?
A: Sure! Just make sure that both Googlebot and users from the appropriate search results can view the content as required. Keep in mind that showing Googlebot the full content of a page while showing users a registration page would be considered cloaking.

Q: Do I have to sign up to use First Click Free?
A: Please let us know about your decision to use First Click Free if you are using it for Google News. There’s no need to inform us of the First Click Free status for Google Web Search.

Q: What is the preferred way to count a user’s accesses?
A: Since there are many different site architectures, we believe it’s best to leave this up to the publisher to decide.
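
For illustration only, here is one possible counting approach; FCF does not mandate any particular mechanism, and all names below are hypothetical. A cookie tracks how many articles a visitor has opened from search results today:

  const FCF_DAILY_LIMIT = 3; // the limit announced above

  // Read a hypothetical "fcf_count" cookie, assumed to be set so that it
  // expires at the end of each day, from the request's Cookie header.
  function articlesViewedToday(cookieHeader: string): number {
    const match = /(?:^|;\s*)fcf_count=(\d+)/.exec(cookieHeader);
    return match ? Number(match[1]) : 0;
  }

  // Decide whether this request gets the full article or the subscription page.
  function shouldShowFullArticle(cookieHeader: string, cameFromSearch: boolean): boolean {
    if (!cameFromSearch) return false; // non-search visitors see the normal flow
    return articlesViewedToday(cookieHeader) < FCF_DAILY_LIMIT;
  }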

(Please see our related blog post for more information on First Click Free for Google News.)

Posted by John Mueller, Google Switzerland