How we fought Search spam on Google – Webspam Report 2019

Every search matters. That’s why whenever you come to Google Search to find relevant and useful information, our ongoing commitment is to make sure you receive the highest quality results possible.

Unfortunately, on the web there are some disruptive behaviors and content that we call “webspam” that can degrade the experience for people coming to find helpful information. We have a number of teams who work to prevent webspam from appearing in your search results, and it’s a constant challenge to stay ahead of the spammers. At the same time, we continue to engage with webmasters to ensure they’re following best practices and can find success on Search, making great content available on the open web.

Looking back at last year, here’s a snapshot of how we fought spam on Search in 2019, and how we supported the webmaster community.
Fighting Spam at Scale

With hundreds of billions of webpages in our index serving billions of queries every day, perhaps it’s not too surprising that there continue to be bad actors who try to manipulate search ranking. In fact, we observed that more than 25 billion of the pages we discover each day are spammy. That’s a lot of spam, and it shows the scale, persistence, and lengths to which spammers are willing to go. We’re very serious about making sure that your chance of encountering spammy pages in Search is as small as possible. Our efforts have helped ensure that more than 99% of visits from our results lead to spam-free experiences.
Updates from last year

In 2018, we reported that we had reduced user-generated spam by 80%, and we’re happy to confirm that this type of abuse did not grow in 2019. Link spam continued to be a popular form of spam, but our team was successful in containing its impact in 2019. More than 90% of link spam was caught by our systems, and techniques such as paid links or link exchanges were made less effective.
Hacked spam, while still a commonly observed challenge, has been more stable compared to previous years. We continued to work on solutions to better detect and notify affected webmasters and platforms and help them recover from hacked websites.
Spam Trends

One of our top priorities in 2019 was improving our spam fighting capabilities through machine learning systems. Our machine learning solutions, combined with our proven and time-tested manual enforcement capability, have been instrumental in identifying and preventing spammy results from being served to users.
In the last few years, we’ve observed an increase in spammy sites with auto-generated and scraped content with behaviors that annoy or harm searchers, such as fake buttons, overwhelming ads, suspicious redirects and malware. These websites are often deceptive and offer no real value to people. In 2019, we were able to reduce the impact on Search users from this type of spam by more than 60% compared to 2018.
As we improve our capability and efficiency in catching spam, we’re continuously investing in reducing broader types of harm, like scams and fraud. These sites trick people into thinking they’re visiting an official or authoritative site, and in many cases people end up disclosing sensitive personal information, losing money, or infecting their devices with malware. We have been paying close attention to queries that are prone to scams and fraud, and we’ve worked to stay ahead of spam tactics in those spaces to protect users.
Working with webmasters and developers for a better web
Much of the work we do to fight against spam is using automated systems to detect spammy behavior, but those systems aren’t perfect and can’t catch everything. As someone who uses Search, you can also help us fight spam and other issues by reporting spam on search, phishing or malware. We received nearly 230,000 reports of search spam in 2019, and we were able to take action on 82% of those reports we processed. We appreciate all the reports you sent to us and your help in keeping search results clean!
So what do we do when we get those reports or identify that something isn’t quite right? An important part of what we do is notifying webmasters when we detect something wrong with their website. In 2019, we generated more than 90 million messages to website owners to let them know about issues and problems that may affect their site’s appearance in Search results, as well as potential improvements that could be implemented. Of all messages, about 4.3 million were related to manual actions resulting from violations of our Webmaster Guidelines.
And we’re always looking for ways to better help site owners. There were many initiatives in 2019 aimed at improving communications, such as the new Search Console messages, Site Kit for WordPress sites, and auto-DNS verification in the new Search Console. We hope these have given webmasters more convenient ways to get their sites verified, quicker access to news, and the means to fix webspam or hack issues more effectively and efficiently.
While we deeply focused on cleaning up spam, we also kept up with the evolution of the web and rethought how we wanted to treat “nofollow” links. Originally introduced as a means to help fight comment spam and annotate sponsored links, the “nofollow” attribute has come a long way. But we’re not stopping there. We believe it’s time for it to evolve even more, just as our spam-fighting capabilities have. We introduced two new link attributes, rel=”sponsored” and rel=”ugc”, that give webmasters additional ways to identify to Google Search the nature of particular links. Along with rel=”nofollow”, we began treating these as hints to incorporate for ranking purposes. We’re excited that these new rel attributes were well received and adopted by webmasters around the world!
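To make this concrete, here’s a minimal sketch of how these attributes might look in a page’s markup (the URLs and anchor text are placeholders):

  <!-- A paid or otherwise compensated placement -->
  <a href="https://example.com/product" rel="sponsored">Partner product</a>

  <!-- A link inside user-generated content, such as a comment -->
  <a href="https://example.com/their-site" rel="ugc">Commenter's site</a>

  <!-- A link you don't want to vouch for, with no more specific label -->
  <a href="https://example.com/unvetted" rel="nofollow">Unvetted reference</a>

  <!-- Attributes can be combined when more than one applies -->
  <a href="https://example.com/ad-comment" rel="ugc nofollow">Ad posted in a comment</a>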
Engaging with the community
As always, we’re grateful for all the opportunities we had last year to connect with webmasters around the world, helping them improve their presence in Search and hearing their feedback. We delivered more than 150 online office hours, online events and offline events in many cities across the globe to a wide range of audiences, including SEOs, developers, online marketers and business owners. Among those events, we were delighted by the momentum behind our Webmaster Conferences, held in 35 locations across 15 countries and 12 languages, including the first Product Summit version in Mountain View. While we’re not currently able to host in-person events, we look forward to more of these events and virtual touchpoints in the future.
Webmasters continued to find solutions and tips in our Webmasters Help Community, with more than 30,000 threads in 2019 in more than a dozen languages. On YouTube, we launched #AskGoogleWebmasters as well as series such as SEO Mythbusting to ensure that your questions get answered and your uncertainties clarified.
We know that our journey toward a better web is ongoing, and we would love to continue it with you in the year to come! So keep in touch on Twitter, YouTube, our blog, or the Help Community, or come see us in person at one of our conferences near you!

Posted by Cherry Prommawin, Search Relations, and Duy Nguyen, Search Quality Analyst

Video Series for New Webmasters: Search for Beginners!

We are excited to introduce our newest video series: “Search For Beginners”! The series was created primarily to help new webmasters, but it’s also for anyone with an interest in Search or anyone who is still learning about the Web and how to manage their online presence.

We love to see the webmaster community grow! Every day, countless new webmasters take their first steps in learning how Search works and how to make their websites perform well and be discoverable on Search. We understand that it can sometimes be challenging or even overwhelming to start with our existing content without prior knowledge or a basic understanding of the Web. We find the basic videos on our YouTube channel to be the ones with the most views. At the same time, advanced webmasters also see the need for content that can be sent to clients or stakeholders to help explain important concepts in managing an online presence.

We want to help all webmasters succeed, regardless of whether you have been managing websites for many years or you’ve just started out yesterday. We want to do more to help the new webmasters and this video series will hopefully help us achieve that.

Introduction to the series:

Episode 1:

The “Search For Beginners” video series covers basic online presence topics, ranging from ‘Do you need a website?’ and ‘What are the goals for your website?’ to more organic search-related topics such as ‘How does Google Search work?’, ‘How to change description line’, or ‘How to change wrong address information on Google’. We get asked these questions frequently in forums, on social channels and at events around the world! The videos are fully animated and are in English, with subtitles available in Spanish, Portuguese, Korean, Chinese, Indonesian, Italian, Japanese, and English. We are working on more, so please stay tuned!

And if you consider yourself a more experienced user, please feel free to use these videos to support your pitches or to explain things to your clients. If you want to share any ideas or learnings, please leave them in the comments section of each video so that others can benefit from your knowledge and experience.

Follow us on Twitter and subscribe on YouTube for the upcoming videos! We will be adding new videos in this series to this playlist about every two weeks!

Posted by Cherry Prommawin, Search Quality Analyst

This year in Search Spam – Webspam report 2018

Google aims to provide the highest quality results for any search. As part of this, we take action to prevent what we call “webspam” (content and behaviors that violate our webmaster guidelines) from degrading the search experience. Our efforts help ensure that well under 1 percent of results visited by users are for spammy pages. Here’s more about how we fought webspam in 2018.

Google webspam trends and how we fought webspam in 2018

Of the types of spam we fought in 2018, three continue to stand out:
Spam on hacked websites: We reported in 2017 that we had seen a substantial reduction of spam from hacked websites in search results. This trend continued in 2018, with faster discovery of hacked web pages before they affect search results or put someone in harm’s way. While we reduced how spam on hacked sites affects search, hacked websites remain a major security problem affecting the safety of the web. Even though we can’t prevent a website hack from happening, we’re committed to helping webmasters whose websites have been compromised by offering resources to help them recover from a hacked website.
User-generated spam: This type of spam has been a continued focus for us. It includes spammy posts on forums, as well as spammy accounts on free blogs and platforms, none of which are meant to be consumed by human beings, and all of which disrupt conversations while adding no value to users. In 2018, we were able to reduce the impact on search users from this type of spam by more than 80%. While we can’t prevent websites from being exploited, we do want to make it easier for website owners to learn how to protect themselves, which is why we provide resources on how to prevent abuse of your site’s public areas.
Link spam: We continued to protect the value of authoritative and relevant links as an important ranking signal for Search. We continued to deal swiftly with egregious link spam and made a number of bad linking practices less effective for manipulating ranking. Above all, we continued to engage with webmasters and SEOs to chip away at the many myths that have emerged over the years relating to linking practices. We continued to remind website owners that if they stay away from building links mainly as an attempt to rank better and instead focus on creating great content, they should not have to worry about any of these myths or realities. We think one of the best ways of fighting spam of all types is encouraging website owners to simply create great quality content. Resources such as the SEO starter guide highlight best practices and bust some common myths and misconceptions related to what it takes to appear well in Google Search results. Reporting link spam is also a great way to assist us in fighting this type of abuse and to help preserve fairness in Search ranking.

Working with users, webmasters and developers for a better web

Everyday users continue to help us find spam, malware and other issues in Search that escape our filters and processes by reporting spam on search, reporting phishing or reporting malware. We received over 180,000 search spam user reports and were able to take action on 64% of the reports we processed. These reports truly make a difference, and we’d like to thank all of you who submitted them.
We think it’s important to let website owners know when we detect something wrong with their website. In 2018, we generated over 186 million messages to website owners calling out potential improvements, issues and problems that could affect their site’s appearance in Search results. We can only deliver these notifications to site owners who have verified their sites in Search Console, and we successfully delivered 96 million of those messages. The rest of the messages will be kept linked with the website for as long as they are relevant, so they can be seen when a webmaster successfully registers their site in Search Console. The majority of these messages were welcoming new users to Search Console, and the second largest group was informing registered Search Console users when Mobile-First Indexing became available. Of all messages, slightly over 2%—about 4 million—were related to manual actions resulting from violations of our Webmaster Guidelines.
High quality content keeps spam off of search results, and we continued to improve the tools and reports we offer for webmasters that create that content. The Google Search Console was completely rebuilt from the ground up to provide both new and improved reports (Performance, Index Coverage, Links, Mobile Usability report), as well as brand new features (URL Inspection Tool and Site and User management). This improved Search Console graduated out of beta in 2018 and is now available generally to all registered website owners.
We didn’t forget the front-end developers who make the modern web work, and focused on helping them make their sites great for users and also search-friendly regardless of whether they are on a CMS, roll their own CSS and JS, or build on top of a web framework. With the new SEO audit capability in Lighthouse, the open-source and automated auditing tool for improving the quality of web pages, developers and webmasters can now run actionable SEO health-checks on their pages and quickly identify areas for improvement.
We also engage directly with website owners to provide help with thorny issues. Our dedicated team members meet with webmasters around the world regularly, both online and in-person. We delivered more than 190 online office hours, online events and offline events in more than 76 cities, to audiences totaling over 170,000 including SEOs, developers and online marketers. We hosted four search events in Tokyo, Singapore, Zurich and Osaka as well as an 11-city Search Conference in India. In 2018, we started live office hours in Spanish on top of English, French, German, Hindi and Japanese, where Webmasters can find help, tips and useful discussion on our Google Webmaster YouTube channel. Product experts continued to help webmasters find solutions through our official support forums in over a dozen languages. 
We look forward to continuing our work to deliver a spam-free Search experience to all in 2019!
Posted by Juan Felipe Rincón, Webmaster Outreach, Dublin

Help Google Search know the best date for your web page

Sometimes, Google shows dates next to listings in its search results. In this post, we’ll answer some commonly-asked questions webmasters have about how these dates are determined and provide some best practices to help improve their accuracy.

How dates are determined

Google shows the date of a page when its automated systems determine that it would be relevant to do so, such as for pages that can be time-sensitive, including news content.

Google determines a date using a variety of factors, including but not limited to: any prominent date listed on the page itself or dates provided by the publisher through structured markup.

Google doesn’t depend on one single factor because all of them can be prone to issues. Publishers may not always provide a clear visible date. Sometimes, structured data may be lacking or may not be adjusted to the correct time zone. That’s why our systems look at several factors to come up with what we consider to be our best estimate of when a page was published or significantly updated.

How to specify a date on a page

To help Google to pick the right date, site owners and publishers should:

  • Show a clear date: Show a visible date prominently on the page.
  • Use structured data: Use the datePublished and dateModified schema with the correct time zone designator for AMP or non-AMP pages. When using structured data, make sure to use the ISO 8601 format for dates.
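To make this concrete, here’s a minimal sketch of Article structured data carrying both dates in ISO 8601 format with a time zone offset (the values are placeholders):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "datePublished": "2019-02-05T08:00:00+02:00",
    "dateModified": "2019-02-06T09:20:00+02:00"
  }
  </script>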

Guidelines specific to Google News

Google News requires clearly showing both the date and the time that content was published or updated. Structured data alone is not enough, though it is recommended to use in addition to a visible date and time. Date and time should be positioned between the headline and the article text. For more guidance, also see our help page about article dates.
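For example, a news article template might place the visible timestamp between the headline and the body, with a matching machine-readable value (the dates shown are placeholders):

  <article>
    <h1>Example headline</h1>
    <!-- Visible date and time, positioned between headline and article text -->
    <time datetime="2019-02-05T08:00:00+02:00">February 5, 2019, 8:00 AM (GMT+2)</time>
    <p>Article text begins here...</p>
  </article>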

If an article has been substantially changed, it can make sense to give it a fresh date and time. However, don’t artificially freshen a story without adding significant information or some other compelling reason for the freshening. Also, do not create a very slightly updated story from one previously published, then delete the old story and redirect to the new one. That’s against our article URLs guidelines.

More best practices for dates on web pages

In addition to the most important requirements listed above, here are additional best practices to help Google determine the best date to consider showing for a web page:

  • Show when a page has been updated: If you update a page significantly, also update the visible date (and time, if you display that). If desired, you can show two dates: when a page was originally published and when it was updated. Just do so in a way that’s visually clear to your readers. If showing both dates, it’s also highly recommended to use datePublished and dateModified for AMP or non-AMP pages to make it easier for algorithms to recognize.
  • Use the right time zone: If specifying a time, make sure to provide the correct timezone, taking into account daylight saving time as appropriate.
  • Be consistent in usage. Within a page, make sure to use exactly the same date (and, potentially, time) in structured data as well as in the visible part of the page. Make sure to use the same timezone if you specify one on the page.
  • Don’t use future dates or dates related to what a page is about: Always use a date for when a page itself was published or updated, not a date linked to something like an event that the page is writing about, especially for events or other subjects that happen in the future (you may use Event markup separately, if appropriate).
  • Follow Google’s structured data guidelines: While Google doesn’t guarantee that a date (or structured data in general) specified on a page will be used, following our structured data guidelines does help our algorithms to have it available in a machine-readable way.
  • Troubleshoot by minimizing other dates on the page: If you’ve followed the best practices above and find incorrect dates are being selected, consider if you can remove or minimize other dates that may appear on the page, such as those that might be next to related stories.

We hope these guidelines help to make it easier to specify the right date on your website’s pages! For questions or comments on this, or other structured data topics, feel free to drop by our webmaster help forums.

Posted by John Mueller, Developer Advocate, Zurich

Notifying users of unclear subscription pages

Every month, millions of Chrome users encounter pages with insufficient mobile subscription information. Unexpected charges that come from unclear communication are a poor user experience. That’s why, starting with Chrome 71 (December 2018), Chrome will show a warning before these pages so that users can make informed decisions when signing up for mobile-based subscription services. Users will be offered the choice to proceed to the page or go back if they were unaware that they were entering a billing page.

Unclear mobile subscriptions

Picture this: Andrea is browsing the web on a mobile connection to access a gaming page and they’re presented with a page that asks them for their mobile phone details.
They fill in the blanks with their mobile number and press Continue, and get access to the content.
The next month, the phone bill arrives and they see a charge they were not expecting. Was the subscription to the online gaming service really that expensive? Did they really agree to pay that specific price for the service? How much did they agree to be charged to access the content?

Clearer billing information for Chrome users

We want to make sure Chrome users understand when they are going through a billing flow and trust that they’ll be able to make informed decisions while browsing the web.
To adequately inform users, it’s important to provide a sufficient level of detail within the billing page, as outlined by our new mobile billing charges best practices. Pages that answer the following questions positively generally provide sufficient information for users:
  • Is the billing information visible and obvious to users? For example, adding no subscription information on the subscription page or hiding the information is a bad start because users should have access to the information when agreeing to subscribe.
  • Can customers easily see the costs they’re going to incur before accepting the terms? For example, displaying the billing information in grey characters over a grey background, therefore making it less readable, is not considered a good user practice.
  • Is the fee structure easily understandable? For example, the formula presented to explain how the cost of the service will be determined should be as simple and straightforward as possible.
If Chrome detects pages that don’t provide sufficient billing information to users, a warning will be displayed before the page on Chrome mobile, Chrome desktop and Android’s WebView, letting users know they are entering an unclear billing page.
When we identify such pages, we will notify the webmaster through Search Console, where there will be an option to let us know about the changes they’ve made to clarify the billing process. For websites that aren’t verified in Search Console, we will do our best to get in touch with the affected webmasters and will be available to answer questions in our public support forum, available in 15 languages. Once an appeal has been sent via Search Console, we will review the changes and remove the warning accordingly.

If your billing service takes users through a clearly visible and understandable billing process as described in our best practices, you don’t need to make any changes. Also, the new warning in Chrome doesn’t impact your website’s ranking in Google Search.

If you have any questions, please come and have a chat with us in the Webmaster Help Forum.
Posted by Emily Schechter, Chrome Security, Giacomo Gnecchi Ruscone & Badr Salmi El Idrissi, Trust & Safety

How we fought webspam – Webspam Report 2017


We always want to make sure that when you use Google Search to find information, you get the highest quality results. But we’re aware of many bad actors who try to manipulate search ranking and profit from it, which is at odds with our core mission: to organize the world’s information and make it universally accessible and useful. Over the years, we’ve devoted a huge effort toward combating abuse and spam on Search. Here’s a look at how we fought abuse in 2017.

We call these various types of abuse that violate the webmaster guidelines “spam.” Our evaluations indicate that for many years, less than 1 percent of the search results users visited were spammy. In the last couple of years, we’ve managed to further reduce this by half.

Google webspam trends and how we fought webspam in 2017

As we continued to improve, spammers also evolved. One of the trends in 2017 was an increase in website hacking—both for spamming search ranking and for spreading malware. Hacked websites are serious threats to users because hackers can take complete control of a site, deface homepages, erase relevant content, or insert malware and harmful code. They may also record keystrokes, stealing login credentials for online banking or financial transactions. In 2017 we focused on reducing this threat, and were able to detect and remove from search results more than 80 percent of these sites. But hacking is not just a spam problem for search users—it affects the owners of websites as well. To help website owners keep their websites safe, we created a hands-on resource for strengthening website security and revamped our help resources for recovering from a hacked website. The guides are available in 19 languages.

We also recognize the importance of robust content management systems (CMSs). A large percentage of websites run on one of several popular CMSs, and spammers exploit them by finding ways to abuse their provisions for user-generated content, such as posting spam content in comment sections or forums. We’re working closely with many providers of popular content management systems like WordPress and Joomla to help them fight spammers that abuse their forums, comment sections and websites.

Another abuse vector is the manipulation of links, one of the foundational ranking signals for Search. In 2017 we doubled down on our efforts to remove unnatural links via ranking improvements and scalable manual actions. We have observed a year-over-year reduction of spam links by almost half.

Working with users and webmasters for a better web

We’re here to listen: Our automated systems are constantly working to detect and block spam. Still, we always welcome hearing from you when something seems … phishy. Last year, we were able to take action on nearly 90,000 user reports of search spam.

Reporting spam, malware and other issues you find helps us protect the site owner and other searchers from this abuse. You can file a spam report, a phishing report or a malware report. We very much appreciate these reports—a big THANK YOU to all of you who submitted them.

We also actively work with webmasters to maintain the health of the web ecosystem. Last year, we sent 45 million messages to registered website owners via Search Console letting them know about issues we identified with their websites. More than 6 million of these messages were related to manual actions, providing transparency to webmasters so they understand why their sites received manual actions and how to resolve the issues.

Last year, we released a beta version of a new Search Console to a limited number of users and, afterwards, to all users of Search Console. We listened to what matters most to users, and started with popular functionalities such as Search performance, Index Coverage and others. These can help webmasters optimize their websites’ Google Search presence more easily.

Through enhanced Safe Browsing protections, we continue to protect more users from bad actors online. In the last year, we made significant improvements to our Safe Browsing protection, such as broadening our protection of macOS devices, enabling predictive phishing protection in Chrome, cracking down on unwanted mobile software, and significantly improving our ability to protect users from deceptive Chrome extension installation.

We have a multitude of channels to engage directly with webmasters. We have dedicated team members who meet with webmasters regularly both online and in-person. We conducted more than 250 online office hours, online events and offline events around the world in more than 60 cities to audiences totaling over 220,000 website owners, webmasters and digital marketers. In addition, our official support forum has answered a high volume of questions in many languages. Last year, the forum had 63,000 threads generating over 280,000 contributing posts by 100+ Top Contributors globally. For more details, see this post. Apart from the forums, blogs and the SEO starter guide, the Google Webmaster YouTube channel is another channel to find more tips and insights. We launched a new SEO snippets video series to help with short and to-the-point answers to specific questions. Be sure to subscribe to the channel!

Despite all these improvements, we know we’re not yet done. We’re relentless in our pursuit of an abuse-free user experience, and will keep improving our collaboration with the ecosystem to make it happen.

Posted by Cody Kwok, Principal Engineer

Our goal: helping webmasters and content creators

Great websites are the result of the hard work of website owners who make their content and services accessible to the world. Even though it’s simpler now to run a website than it was years ago, it can still feel like a complex undertaking. This is why we invest a lot of time and effort in improving Google Search so that website owners can spend more time focusing on building the most useful content for their users, while we take care of helping users find that content. 

Most website owners find they don’t have to worry much about what Google is doing—they post their content, and then Googlebot discovers, crawls, indexes and understands that content, to point users to relevant pages on those sites. However, sometimes the technical details still matter, and sometimes a great deal.

For those times when site owners would like a bit of help from someone at Google, or an explanation for why something works a particular way, or why things appear in a particular way, or how to fix what looks like a technical glitch, we have a global team dedicated to making sure there are many places for a website owner to get help from Google and knowledgeable members of the community.

The first place to start for help is Google Webmasters, a place where all of our support resources (many of which are available in 40 languages) are within easy reach.

Our second path to getting help is through our Google Webmaster Central Help Forums. We have forums in 16 languages—in English, Spanish, Hindi, French, Italian, Portuguese, Japanese, German, Russian, Turkish, Polish, Bahasa Indonesia, Thai, Vietnamese, Chinese and Korean. The forums are staffed with dedicated Googlers who are there to make sure your questions get answered. Aside from the Googlers who monitor the forums, there is an amazing group of Product Experts who generously offer their time to help other members of the community—many times providing greater detail and analysis for a particular website’s content than we could. The forums allow for both a public discussion and, if the case requires it, for private follow-up replies in the forum.

A third path for support for website owners is our series of Online Webmaster Office Hours — in English, German, Japanese, Turkish, Hindi and French. Anyone who joins is welcome to ask us questions about website appearance in Google Search, which we will answer to the best of our abilities. All of our team members think that one of the best parts of speaking at conferences and events is the opportunity to answer questions from the audience, and the online office hours format creates that opportunity for many more people who might not be able to travel to a specialized event. You can always check the Google Webmaster calendar for upcoming webmaster office hours and live events.

While how a website behaves on the web is openly visible to all who can see it, we know that some website owners prefer not to make it known in a public forum that their website has a problem. There’s no shame in asking for support, but if you have an issue with your website that seems sensitive—for which you don’t think you can share all the details publicly—you can call out that you would prefer to share the necessary details only with someone experienced who is willing to help, using the forum’s “Private Reply” feature.

Are there other things you think we should be doing that would help your website get the most out of search? Please let us know — in our forums, our office hours, or via Twitter @googlewmc.

Posted by Juan Felipe Rincón from Google’s Webmaster Outreach & Support team

We updated our job posting guidelines

Last year, we launched job search on Google to connect more people with jobs. When you provide Job Posting structured data, it helps drive more relevant traffic to your page by connecting job seekers with your content. To ensure that job seekers are getting the best possible experience, it’s important to follow our Job Posting guidelines.

We’ve recently made some changes to our Job Posting guidelines to help improve the job seeker experience.

  • Remove expired jobs
  • Place structured data on the job’s detail page
  • Make sure all job details are present in the job description

Remove expired jobs

When job seekers put in effort to find a job and apply, it can be very discouraging to discover that the job that they wanted is no longer available. Sometimes, job seekers only discover that the job posting is expired after deciding to apply for the job. Removing expired jobs from your site may drive more traffic because job seekers are more confident when jobs that they visit on your site are still open for application. For more information on how to remove a job posting, see Remove a job posting.

Place structured data on the job’s detail page

Job seekers find it confusing when they land on a list of jobs instead of the specific job’s detail page. To fix this, put structured data on the most detailed leaf page possible. Don’t add structured data to pages intended to present a list of jobs (for example, search result pages) and only add it to the most specific page describing a single job with its relevant details.

Make sure all job details are present in the job description

We’ve also noticed that some sites include information in the JobPosting structured data that is not present anywhere in the job posting. Job seekers are confused when the job details they see in Google Search don’t match the job description page. Make sure that the information in the JobPosting structured data always matches what’s on the job posting page. Here are some examples:

  • If you add salary information to the structured data, then also add it to the job posting. Both salary figures should match.
  • The location in the structured data should match the location in the job posting.
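As a rough sketch of this consistency, here is what JobPosting markup might look like when the visible page shows the same title, location, and salary (all values are placeholders):

  <!-- The details below must match what the job posting page visibly shows -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Software Engineer",
    "description": "<p>The full job description, as shown on the page...</p>",
    "datePosted": "2019-01-18",
    "validThrough": "2019-03-18T00:00",
    "hiringOrganization": {
      "@type": "Organization",
      "name": "Example Corp"
    },
    "jobLocation": {
      "@type": "Place",
      "address": {
        "@type": "PostalAddress",
        "addressLocality": "Detroit",
        "addressRegion": "MI",
        "addressCountry": "US"
      }
    },
    "baseSalary": {
      "@type": "MonetaryAmount",
      "currency": "USD",
      "value": {
        "@type": "QuantitativeValue",
        "value": 40.00,
        "unitText": "HOUR"
      }
    }
  }
  </script>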

Providing structured data content that is consistent with the content of the job posting pages not only helps job seekers find the exact job that they were looking for, but may also drive more relevant traffic to your job postings and therefore increase the chances of finding the right candidates for your jobs.

If your site violates the Job Posting guidelines (including the guidelines in this blog post), we may take manual action against your site and it may not be eligible for display in the jobs experience on Google Search. You can submit a reconsideration request to let us know that you have fixed the problem(s) identified in the manual action notification. If your request is approved, the manual action will be removed from your site or page.

For more information, visit our Job Posting developer documentation and our JobPosting FAQ.

Posted by Anouar Bendahou, Trust & Safety Search Team

A reminder about links in large-scale article campaigns

Lately we’ve seen an increase in spammy links contained in articles referred to as contributor posts, guest posts, partner posts, or syndicated posts. These articles are generally written by or in the name of one website, and published on a different one.
Google does not discourage these types of articles in cases where they inform users, educate another site’s audience or bring awareness to your cause or company. However, what does violate Google’s guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site. Below are factors that, when taken to an extreme, can indicate when an article is in violation of these guidelines:
  • Stuffing keyword-rich links to your site in your articles
  • Having the articles published across many different sites; alternatively, having a large number of articles on a few large, different sites
  • Using or hiring article writers that aren’t knowledgeable about the topics they’re writing on
  • Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site (in which case use of rel=”canonical”, in addition to rel=”nofollow”, is advised)
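For the syndication case in the last point above, a minimal sketch of the markup on the duplicated copy might look like this (the URLs are placeholders):

  <!-- In the <head> of the syndicated copy, pointing at the original article -->
  <link rel="canonical" href="https://original-site.example/article">

  <!-- On links of questionable intent within the article body -->
  <a href="https://author-site.example" rel="nofollow">the author's site</a>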
When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking. Sites accepting and publishing such articles should carefully vet them, asking questions like: Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel=”nofollow” on them?

For websites creating articles made for links, Google takes action on this behavior because it’s bad for the Web as a whole. When link building comes first, the quality of the articles can suffer and create a bad experience for users. Also, webmasters generally prefer not to receive aggressive or repeated “Post my article!” requests, and we encourage such cases to be reported to our spam report form. And lastly, if a link is a form of endorsement, and you’re the one creating most of the endorsements for your own site, is this putting forth the best impression of your site? Our best advice in relation to link building is to focus on improving your site’s content and everything–including links–will follow (no pun intended).

Posted by the Google Webspam Team

Protect your site from user generated spam

As a website owner, you might have come across auto-generated content in comments sections or forum threads. When such content is created on your pages, not only does it disrupt those visiting your site, but it also shows Google and other search engines content that you may not want associated with your site.

In this blog post, we will give you tips to help you deal with this type of spam in your site and forum.

Some spammers abuse sites owned by others by posting deceptive content and links, in an attempt to get more traffic to their own sites.

Comments and forum threads can be a really good source of information and an efficient way of engaging a site’s users in discussions. This valuable content should not be buried by auto-generated keywords and links placed there by spammers.

There are many ways of securing your site’s forums and comment threads and making them unattractive to spammers:

  • Keep your forum software updated and patched. Take the time to keep your software up-to-date and pay special attention to important security updates. Spammers take advantage of security issues in older versions of blogs, bulletin boards, and other content management systems.

  • Add a CAPTCHA. CAPTCHAs require users to confirm that they are human beings and not automated scripts before posting. One way to do this is to use a service like reCAPTCHA, Securimage, or JCaptcha (see the sketch after this list).
  • Block suspicious behavior. Many forums allow you to set time limits between posts, and you can often find plugins to look for excessive traffic from individual IP addresses or proxies and other activity more common to bots than human beings. For example, phpBB, Simple Machines, myBB, and many other forum platforms enable such configurations.
  • Check your forum’s top posters on a daily basis. If a user joined recently and has an excessive number of posts, you should probably review their profile and make sure their posts and threads are not spammy.
  • Consider disabling some types of comments. For example, it’s a good practice to close very old forum threads that are unlikely to get legitimate replies.
    If you plan on not monitoring your forum going forward and users are no longer interacting with it, turning off posting completely may prevent spammers from abusing it.
  • Make good use of moderation capabilities. Consider enabling features in moderation that require users to have a certain reputation before links can be posted or where comments with links require moderation.
    If possible, change your settings so that you disallow anonymous posting and make posts from new users require approval before they’re publicly visible.
    Moderators, together with your friends, colleagues and other trusted users, can help you review and approve posts while spreading the workload. Keep an eye on your forum’s new users by looking at their posts and activities on your forum.
  • Consider blacklisting obviously spammy terms. Block obviously inappropriate comments with a blacklist of spammy terms (e.g., illegal streaming or pharma-related terms). Add inappropriate and off-topic terms that are only used by spammers, learning from the spam posts you see on your forum and on other forums. Built-in features or plugins can delete or mark comments as spam for you.
  • Use the “nofollow” attribute for links in the comment field. This will deter spammers from targeting your site. By default, many blogging sites (such as Blogger) automatically add this attribute to any posted comments.
  • Use automated systems to defend your site.  Comprehensive systems like Akismet, which has plugins for many blogs and forum systems are easy to install and do most of the work for you.
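Putting the CAPTCHA and “nofollow” suggestions above together, here’s a minimal comment-form sketch assuming reCAPTCHA v2 (the site key, endpoint and URLs are placeholders; server-side verification of the submitted token is also required but omitted here):

  <!-- Load the reCAPTCHA v2 script -->
  <script src="https://www.google.com/recaptcha/api.js" async defer></script>

  <form action="/submit-comment" method="POST">
    <textarea name="comment"></textarea>
    <!-- Challenge widget; replace with your own site key -->
    <div class="g-recaptcha" data-sitekey="YOUR_SITE_KEY"></div>
    <input type="submit" value="Post comment">
  </form>

  <!-- When rendering approved comments, qualify any commenter links -->
  <a href="https://commenter-site.example" rel="nofollow">commenter's link</a>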

For detailed information about these topics, check out our Help Center document on User Generated Spam and comment spam. You can also visit our Webmaster Central Help Forum if you need any help.

Posted by Anouar Bendahou, Search Quality Strategist, Google Ireland

Protect your site from user generated spam

As a website owner, you might have come across some auto-generated content in comments sections or forum threads. When such content is created on your pages, not only does it disrupt those visiting your site, but it also shows some content that you may not want to be associated with your site to Google and other search engines.

In this blog post, we will give you tips to help you deal with this type of spam in your site and forum.

Some spammers abuse sites owned by others by posting deceiving content and links, in an attempt to get more traffic to their sites. Here are a few examples:


Comments and forum threads can be a really good source of information and an efficient way of engaging a site’s users in discussions. This valuable content should not be buried by auto-generated keywords and links placed there by spammers.

There are many ways of securing your site’s forums and comment threads and making them unattractive to spammers:

  • Keep your forum software updated and patched. Take the time to keep your software up-to-date and pay special attention to important security updates. Spammers take advantage of security issues in older versions of blogs, bulletin boards, and other content management systems.

  • Add a CAPTCHA. CAPTCHAs require users to confirm they are human and not an automated script. One way to do this is to use a service such as reCAPTCHA, Securimage, or Jcaptcha.
  • Block suspicious behavior. Many forums allow you to set time limits between posts, and you can often find plugins that look for excessive traffic from individual IP addresses or proxies, and for other activity more common to bots than to humans. For example, phpBB, Simple Machines, myBB, and many other forum platforms support such configurations.
  • Check your forum’s top posters on a daily basis. If a user joined recently and has an excessive number of posts, review their profile and make sure their posts and threads are not spammy.
  • Consider disabling some types of comments. For example, it’s good practice to close very old forum threads that are unlikely to get legitimate replies.
    If you no longer plan to monitor your forum and users are no longer interacting with it, turning off posting completely can prevent spammers from abusing it.
  • Make good use of moderation capabilities. Consider enabling moderation features that require users to have a certain reputation before they can post links, or that hold comments containing links for review.
    If possible, change your settings to disallow anonymous posting and to require approval for posts from new users before they’re publicly visible.
    Moderators, together with friends, colleagues, and other trusted users, can help you review and approve posts while spreading the workload. Keep an eye on new users by reviewing their posts and activity on your forum.
  • Consider blacklisting obviously spammy terms. Block obviously inappropriate comments with a blacklist of spammy terms (e.g., illegal streaming or pharma-related terms). Add inappropriate and off-topic terms that are used only by spammers, learning from the spam posts you see on your own forum and on others. Built-in features or plugins can delete or mark comments as spam for you (see the sketch after this list).
  • Use the “nofollow” attribute for links in the comment field. This deters spammers from targeting your site. By default, many blogging platforms (such as Blogger) automatically add this attribute to posted comments.
  • Use automated systems to defend your site. Comprehensive systems like Akismet, which has plugins for many blog and forum systems, are easy to install and do most of the work for you.
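
To make the last two ideas concrete, here’s a minimal sketch of a comment filter in Python. The SPAMMY_TERMS list and the moderate helper are hypothetical names used for illustration, the example terms are placeholders you would replace with patterns from your own spam, and the link rewriting relies on the third-party BeautifulSoup library; treat it as a starting point, not a complete anti-spam system.

    from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

    # Hypothetical blacklist -- grow it from the spam you actually see.
    SPAMMY_TERMS = ["free streaming", "casino bonus", "cheap pills"]

    def is_spammy(text):
        """True if the comment text contains any blacklisted term."""
        lowered = text.lower()
        return any(term in lowered for term in SPAMMY_TERMS)

    def moderate(comment_html):
        """Return publishable HTML, or None to hold the comment as spam."""
        soup = BeautifulSoup(comment_html, "html.parser")
        if is_spammy(soup.get_text()):
            return None  # hold for manual review or delete
        for a in soup.find_all("a"):
            a["rel"] = "nofollow"  # deter link spammers
        return str(soup)

A comment like “Watch free streaming here” would be held, while a clean comment is published with any links carrying rel=”nofollow”.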

For detailed information about these topics, check out our Help Center document on User Generated Spam and comment spam. You can also visit our Webmaster Central Help Forum if you need any help.

Posted by Anouar Bendahou, Search Quality Strategist, Google Ireland

A reminder about widget links

Google has long taken a strong stance against links that manipulate a site’s PageRank. Today we would like to reiterate our policy on the creation of keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites.

Widgets can help website owners enrich the experience of their site and engage users. However, some widgets add links to a site that a webmaster did not editorially place and contain anchor text that the webmaster does not control. Because these links are not naturally placed, they’re considered a violation of Google Webmaster Guidelines.

Below are examples of widgets that contain links violating Google Webmaster Guidelines:

[Image: examples of widgets with keyword-rich, hidden, or low-quality embedded links]

Google’s webspam team may take manual actions on unnatural links. When a manual action is taken, Google will notify the site owners through Search Console. If you receive such a warning for unnatural links to your site and you use links in widgets to promote your site, we recommend resolving these issues and requesting reconsideration.

You can resolve issues with unnatural links by making sure they don’t pass PageRank: either add a rel=”nofollow” attribute to the widget links or remove the links entirely. After fixing or removing widget links and any other unnatural links to your site, let Google know about your change by submitting a reconsideration request in Search Console. Once the request has been reviewed, you’ll get a notification about whether it was successful.

Also, we would like to remind webmasters who use widgets on their sites to check those widgets for any unnatural links. Add a rel=”nofollow” attribute on those unnatural links or remove the links entirely from the widget.
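
If you embed widgets from third parties, a quick way to audit them is to list the outbound links on a page that don’t carry rel=”nofollow”. Below is a minimal sketch, assuming Python with the third-party requests and BeautifulSoup libraries; the example.com URL is a placeholder for one of your own pages.

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urlparse

    PAGE = "https://example.com/page-with-widget"  # placeholder URL

    html = requests.get(PAGE, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    my_host = urlparse(PAGE).netloc

    for a in soup.find_all("a", href=True):
        link_host = urlparse(a["href"]).netloc
        rel = a.get("rel") or []
        # Outbound links without nofollow deserve a manual look:
        # did you editorially place this link, or did a widget add it?
        if link_host and link_host != my_host and "nofollow" not in rel:
            print(a["href"], "-", a.get_text(strip=True))

Each line it prints is a link worth checking against the guidelines above.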

For more information, please watch our video about widget links and refer to our Webmaster Guidelines on Link Schemes. Additionally, feel free to ask questions in our Webmaster Help Forums, where a community of webmasters can help with their experience.

Posted by Agnieszka Łata, Trust & Safety Search Team and Eric Kuan, Webmaster Relations Specialist

Best practices for bloggers reviewing free products they receive from companies

As a form of online marketing, some companies today will send bloggers free products to review or give away in return for a mention in a blog post. Whether you’re the company supplying the product or the blogger writing the post, below are a few best practices to ensure that this content is both useful to users and compliant with Google Webmaster Guidelines.

  1. Use the nofollow tag where appropriate

    Links that pass PageRank in exchange for goods or services are against Google guidelines on link schemes. Companies sometimes urge bloggers to link back to:

    1. the company’s site
    2. the company’s social media accounts
    3. an online merchant’s page that sells the product
    4. a review service’s page featuring reviews of the product
    5. the company’s mobile app on an app store

    Bloggers should use the nofollow tag on all such links because these links didn’t come about organically (i.e., the links wouldn’t exist if the company hadn’t offered to provide a free good or service in exchange for a link). Companies, or the marketing firms they’re working with, can do their part by reminding bloggers to use nofollow on these links.

  2. Disclose the relationship

    Users want to know when they’re viewing sponsored content. Also, there are laws in some countries that make disclosure of sponsorship mandatory. A disclosure can appear anywhere in the post; however, the most useful placement is at the top in case users don’t read the entire post.

  3. Create compelling, unique content

    The most successful blogs offer their visitors a compelling reason to come back. If you’re a blogger you might try to become the go-to source of information in your topic area, cover a useful niche that few others are looking at, or provide exclusive content that only you can create due to your unique expertise or resources.

For more information, please drop by our Google Webmaster Central Help Forum.

Posted by the Google Webspam Team

Detect and get rid of unwanted sneaky mobile redirects

In many cases, it is OK to show slightly different content on different devices. For example, optimizing the smaller space of a smartphone screen can mean that some content, like images, will have to be modified. Or you might want to store your website’s menu in a navigation drawer (find documentation here) to make mobile browsing easier and more effective. When implemented properly, these user-centric modifications can be understood very well by Google.

The situation is similar when it comes to mobile-only redirects. Redirecting mobile users to improve their mobile experience (like redirecting mobile users from example.com/url1 to m.example.com/url1) is often beneficial to them. But sneakily redirecting mobile users to different content is bad for user experience and is against Google’s webmaster guidelines. A minimal sketch of the acceptable pattern follows.
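
This sketch assumes a Python/Flask server; the m.example.com host and the user-agent hints are placeholders, and most sites would handle this at the web-server or CDN layer instead. The key point is that the path is preserved, so mobile users land on the equivalent content rather than something unrelated.

    from flask import Flask, redirect, request

    app = Flask(__name__)

    MOBILE_HINTS = ("Mobile", "Android", "iPhone")  # crude UA check, for illustration only

    @app.route("/<path:page>")
    def serve(page):
        ua = request.headers.get("User-Agent", "")
        if any(hint in ua for hint in MOBILE_HINTS):
            # Same content on the mobile host, same path: not a sneaky redirect.
            return redirect(f"https://m.example.com/{page}", code=302)
        return f"Desktop version of /{page}"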


A frustrating experience: The same URL shows up in search results pages on desktop and on mobile. When a user clicks on this result on their desktop computer, the URL opens normally. However, when clicking on the same result on a smartphone, a redirect happens and an unrelated URL loads.

Who implements these mobile-only sneaky redirects?

There are cases where webmasters knowingly put redirection rules in place for their mobile users. This is typically a webmaster guidelines violation, and we do take manual action against it when it harms Google users’ experience (see the last section of this article).

But we’ve also observed situations where mobile-only sneaky redirects happen without site owners being aware of it:

  • Advertising schemes that redirect mobile users specifically
    A script/element installed to display ads and monetize content might be redirecting mobile users to a completely different site without the webmaster being aware of it.
  • Mobile redirect as a result of the site being a target of hacking
    In other cases, if your website has been hacked, a potential result can be redirects to spammy domains for mobile users only.

How do I detect if my site is doing sneaky mobile redirects?

  1. Check if you are redirected when you navigate to your site on your smartphone
    We recommend checking the mobile user experience of your site by visiting your pages from Google search results with a smartphone (see the sketch after this list for a scripted first pass). When debugging, mobile emulation in desktop browsers is handy, mostly because you can test many different devices. You can, for example, do it straight from your browser in Chrome, Firefox, or Safari (for the latter, make sure you have enabled the “Show Develop menu in menu bar” feature).
  2. Listen to your users
    Your users may see your site differently than you do. It’s always important to pay attention to user complaints, so you hear about any issues related to mobile UX.
  3. Monitor your users in your site’s analytics data
    Unusual mobile user activity can be detected by looking at your website’s analytics data. For example, the average time mobile users spend on your site is a good signal to watch: if your mobile users (and only they) suddenly start spending much less time on your site than they used to, there might be an issue related to mobile redirects.

    To be aware of big changes in mobile user activity as soon as they happen, you can set up Google Analytics alerts, for example one that warns you of a sharp drop in average time spent on your site by mobile users, or a drop in the number of mobile users (keep in mind that big changes in those metrics are not a clear, direct signal that your site is doing sneaky mobile redirects).
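
To complement the manual checks above, you can script a first-pass comparison of where a URL lands with desktop and mobile user agents. Below is a minimal sketch, assuming Python with the third-party requests library; the URL and user-agent strings are placeholders, and redirects triggered by JavaScript will only show up in a real browser or emulator, not in this HTTP-level check.

    import requests

    URL = "https://example.com/url1"  # placeholder: a page from your own site

    DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
    MOBILE_UA = ("Mozilla/5.0 (Linux; Android 10; Pixel 3) "
                 "AppleWebKit/537.36 (KHTML, like Gecko) Mobile Safari/537.36")

    def final_url(url, user_agent):
        """Follow HTTP redirects and report where the request lands."""
        resp = requests.get(url, headers={"User-Agent": user_agent},
                            allow_redirects=True, timeout=10)
        return resp.url

    desktop = final_url(URL, DESKTOP_UA)
    mobile = final_url(URL, MOBILE_UA)
    if desktop != mobile:
        print(f"Mobile lands on {mobile}, desktop on {desktop}: investigate.")
    else:
        print("No HTTP-level mobile-only redirect detected.")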

I’ve detected sneaky redirects for my mobile users, and I did not set them up: what do I do?

  1. Make sure that your site is not hacked.
    Check the Security Issues report in Search Console; if we have detected a hack, you should find information there.
    Review our additional resources on typical symptoms of hacked sites, and our case studies on hacked sites.
  2. Audit third-party scripts/elements on your site
    If your site is not hacked, we recommend taking the time to investigate whether third-party scripts/elements are causing the redirects (a script-inventory sketch follows this list). You can follow these steps:
    A. Remove, one by one, the third-party scripts/elements you do not control from the redirecting page(s).
    B. Check your site on a mobile device or through emulation between each script/element removal, and see when the redirect stops.
    C. If you think a particular script/element is responsible for the sneaky redirect, consider removing it from your site, and debugging the issue with the script/element provider.
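
For step A, it helps to first inventory the third-party scripts on the affected page. Below is a minimal sketch, again assuming Python with the third-party requests and BeautifulSoup libraries; the example.com URL is a placeholder. It prints every external script source so you can remove and re-test them one at a time.

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    PAGE = "https://example.com/redirecting-page"  # placeholder URL

    html = requests.get(PAGE, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    my_host = urlparse(PAGE).netloc

    for script in soup.find_all("script", src=True):
        src = urljoin(PAGE, script["src"])  # resolve relative paths
        # Scripts served from other hosts are the usual suspects.
        if urlparse(src).netloc != my_host:
            print(src)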

Last Thoughts on Sneaky Mobile Redirects

It’s a violation of the Google Webmaster Guidelines to redirect a user to a page with the intent of displaying content other than what was made available to the search engine crawler (more information on sneaky redirects). To ensure quality search results for our users, the Google Search Quality team can take action on such sites, including removing URLs from our index. When we take manual action, we send a message to the site owner via Search Console, so make sure you’ve set up a Search Console account.

Be sure to choose advertisers who are transparent about how they handle user traffic, so you avoid unknowingly redirecting your own users. If you are interested in trust-building in the online advertising space, check out industry-wide best practices for participating in ad networks; for example, the Trustworthy Accountability Group’s (Interactive Advertising Bureau) Inventory Quality Guidelines are a good place to start. There are many ways to monetize your content with mobile solutions that provide a high-quality user experience; be sure to use them.

If you have questions or comments about mobile-only redirects, join us in our Google Webmaster Support forum.

Written by Vincent Courson & Badr Salmi El Idrissi, Search Quality team

An update on how we tackle hacked spam

Recently we started rolling out a series of algorithmic changes that aim to tackle hacked spam in our search results. A huge number of legitimate sites are hacked by spammers and used for abusive behavior such as malware downloads, funneling traffic to low-quality sites and porn, and marketing counterfeit goods or illegal pharmaceutical drugs.

Website owners that don’t implement standard security best practices can leave their websites vulnerable to being easily hacked. This can include government sites, universities, small businesses, company websites, restaurants, hobby organizations, conferences, and more. Spammers and cyber-criminals purposely seek out those sites and inject pages with malicious content in an attempt to gain rank and traffic in search engines.

We are aggressively targeting hacked spam in order to protect users and webmasters.

The algorithmic changes will eventually impact roughly 5% of queries, depending on the language. As we roll out the new algorithms, users might notice that for certain queries, only the most relevant results are shown, reducing the total number of results displayed.

This is due to the large amount of hacked spam being removed, and should improve in the near future. We are continuing to tune our systems to weed out the bad content while retaining the organic, legitimate results. If you have any questions about these changes, or want to give us feedback on these algorithms, feel free to drop by our Webmaster Help Forums.

Posted by Ning Song, Software Engineer

First Click Free update

Around ten years ago, when we introduced a policy called “First Click Free” (FCF), it was hard to imagine that the always-on, multi-screen, multiple-device world we now live in would change content consumption so much and so fast. The spirit of the First Click Free effort was – and still is – to help users get access to high-quality news with a minimum of effort, while also ensuring that publishers with a paid subscription model get discovered in Google Search and via Google News.

In 2009, we updated the FCF policy to allow a limit of five articles per day, in order to protect publishers who felt some users were abusing the spirit of this policy. Recently we have heard from publishers about the need to revisit these policies to reflect the mobile, multiple-device world. Today we are announcing a change to the FCF policy, lowering the limit to three articles per day. This change applies to both Google Search and Google News.

Google wants to play its part in connecting users to quality news and in connecting publishers to users. We believe the FCF is important in helping achieve that goal, and we will periodically review and update these policies as needed so they continue to benefit users and publishers alike. We are listening and always welcome feedback.

Questions and answers about First Click Free

Q: Do the rest of the old guidelines still apply?
A: Yes, please check the guidelines for Google News as well as the guidelines for Web Search and the associated blog post for more information.

Q: Can I apply First Click Free to only a section of my site / only for Google News (or only for Web Search)?
A: Sure! Just make sure that both Googlebot and users from the appropriate search results can view the content as required. Keep in mind that showing Googlebot the full content of a page while showing users a registration page would be considered cloaking.

Q: Do I have to sign up to use First Click Free?
A: Please let us know about your decision to use First Click Free if you are using it for Google News. There’s no need to inform us of the First Click Free status for Google Web Search.

Q: What is the preferred way to count a user’s accesses?
A: Since there are many different site architectures, we believe it’s best to leave this up to the publisher to decide.
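
As one possibility (the choice is explicitly left to the publisher), here is a minimal sketch of a cookie-based daily counter, assuming a Python/Flask app; the cookie name, route, and paywall text are illustrative only, and a real implementation would also verify that the visit came from a search results page before counting it against the FCF allowance.

    from datetime import date
    from flask import Flask, make_response, request

    app = Flask(__name__)
    DAILY_LIMIT = 3  # the updated FCF limit: three articles per day

    @app.route("/article/<slug>")
    def article(slug):
        today = date.today().isoformat()
        # Cookie format "YYYY-MM-DD:count"; the count resets when the date changes.
        day, _, count = request.cookies.get("fcf", "").partition(":")
        reads = int(count) if day == today and count.isdigit() else 0

        if reads >= DAILY_LIMIT:
            return "Subscribe to keep reading."  # show the subscription page

        resp = make_response(f"Full text of {slug}")  # serve the complete article
        resp.set_cookie("fcf", f"{today}:{reads + 1}")
        return resp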

(Please see our related blog post for more information on First Click Free for Google News.)

Posted by John Mueller, Google Switzerland