Enabling more high quality content for users
In Google’s mission to organize the world’s information, we want to guide Google users to the highest quality content, a principle exemplified in our quality rater guidelines. Professional publishers provide the lion’s share of the quality content that benefits users, and we want to encourage their success.
The ecosystem is sustained via two main sources of revenue: ads and subscriptions, with the latter requiring a delicate balance to be effective in Search. Typically subscription content is hidden behind paywalls, so that users who don’t have a subscription don’t have access. Our evaluations have shown that users who are not familiar with the high quality content behind a paywall often turn to other sites offering free content. It is difficult to justify a subscription if one doesn’t already know how valuable the content is, and in fact, our experiments have shown that a portion of users shy away from subscription sites. Therefore, it is essential that sites provide some amount of free sampling of their content so that users can learn how valuable their content is.
The First Click Free (FCF) policy for both Google web search and News was designed to address this issue. It offers promotion and discovery opportunities for publishers with subscription content, while giving Google users an opportunity to discover that content. Over the past year, we have worked with publishers to investigate the effects of FCF on user satisfaction and on the sustainability of the publishing ecosystem. We found that while FCF is a reasonable sampling model, publishers are in a better position to determine what specific sampling strategy works best for them. Therefore, we are removing FCF as a requirement for Search, and we encourage publishers to experiment with different free sampling schemes, as long as they stay within the updated webmaster guidelines. We call this Flexible Sampling.
One of the original motivations for FCF was to address the issues surrounding cloaking, where the content served to Googlebot is different from the content served to users. Spammers often seek to game search engines by showing interesting content to the search engine, say healthy food recipes, but then showing users an offer for diet pills. This “bait and switch” scheme creates a bad user experience since users do not get the content they expected. Sites with paywalls are strongly encouraged to apply the new structured data to their pages, because without it, the paywall may be interpreted as a form of cloaking, and the pages would then be removed from search results.
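For sites adopting this approach, the markup in question flags the restricted section of the page so that crawlers can distinguish a paywall from cloaking. A sketch of what that JSON-LD looks like is below; the CSS class name and article values are illustrative, so consult the official structured data documentation for the exact required properties:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example subscriber-only article",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled-content"
  }
}
</script>
```

The `isAccessibleForFree` flag on the article, combined with a `hasPart` element pointing at the paywalled section, tells Googlebot that the difference between what it sees and what a non-subscriber sees is a declared paywall rather than bait-and-switch cloaking.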
Based on our investigations, we have created detailed best practices for implementing flexible sampling. There are two types of sampling we advise: metering, which provides users with a quota of free articles to consume, after which paywalls will start appearing; and lead-in, which offers the beginning of an article’s content without showing the article in full.
For metering, we think that monthly (rather than daily) metering provides more flexibility and a safer environment for testing. The user impact of changing from one integer value to the next is less significant at, say, 10 monthly samples than at 3 daily samples. All publishers and their audiences are different, so there is no single value for optimal free sampling across publishers. However, we recommend that publishers start by providing 10 free clicks per month to Google search users in order to preserve a good user experience for new potential subscribers. Publishers should then experiment to optimize the tradeoff between discovery and conversion that works best for their businesses.
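To make the metering recommendation concrete, here is a minimal server-side sketch of a monthly meter with a quota of free views per user per calendar month. The class name, storage, and quota handling are all illustrative; a real implementation would persist counts and identify users via cookies or accounts:

```python
from collections import defaultdict
from datetime import date

FREE_ARTICLES_PER_MONTH = 10  # starting point suggested in the post

class Meter:
    """Tracks free article views per user per calendar month."""

    def __init__(self, quota=FREE_ARTICLES_PER_MONTH):
        self.quota = quota
        # (user_id, year, month) -> number of free views consumed
        self.views = defaultdict(int)

    def allow_free_view(self, user_id, today=None):
        """Return True to serve the full article, False to show the paywall."""
        today = today or date.today()
        key = (user_id, today.year, today.month)
        if self.views[key] < self.quota:
            self.views[key] += 1
            return True
        return False

meter = Meter(quota=2)
print(meter.allow_free_view("u1", date(2017, 10, 1)))  # True
print(meter.allow_free_view("u1", date(2017, 10, 2)))  # True
print(meter.allow_free_view("u1", date(2017, 10, 3)))  # False: quota reached
print(meter.allow_free_view("u1", date(2017, 11, 1)))  # True: new month
```

Keying the counter on the calendar month, rather than a rolling daily window, is what makes tuning safer: moving the quota from 10 to 9 changes roughly one click per user per month, rather than one per day.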
Lead-in is generally implemented as truncated content, such as the first few sentences or 50-100 words of the article. Lead-in allows users a taste of how valuable the content may be. Compared to a page with completely blocked content, lead-in clearly provides more utility and added value to users.
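A lead-in is straightforward to generate from the full article text. The helper below is a hypothetical sketch that truncates to a word budget; 75 words sits in the middle of the 50-100 word range mentioned above:

```python
def lead_in(article_text, max_words=75):
    """Return the first max_words words of an article as a free preview."""
    words = article_text.split()
    preview = " ".join(words[:max_words])
    if len(words) > max_words:
        preview += "…"  # signal that the article continues behind the paywall
    return preview
```

In practice the truncation point is often chosen at a sentence or paragraph boundary rather than a hard word count, so the preview ends cleanly.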
We are excited by this change as it allows the growth of the premium content ecosystem, which ultimately benefits users. We look forward to the prospect of serving users more high quality content!
Posted by Cody Kwok, Principal Engineer
How to move from m-dot URLs to responsive site
With more sites moving towards responsive web design, many webmasters have questions about migrating from separate mobile URLs, also frequently known as “m-dot URLs”, to using responsive web design. Here are some recommendations on how to move from separate URLs to one responsive URL in a way that gives your sites the best chance of performing well on Google’s search results.
Moving to responsive sites in a Googlebot-friendly way
Once you have your responsive site ready, the move itself requires just a bit of forethought. Since your URLs stay the same for the desktop version, all you have to do is configure 301 redirects from the mobile URLs to the responsive URLs.
Here are the detailed steps:
- Get your responsive site ready
- Configure 301 redirects on the old mobile URLs to point to the responsive versions (the new pages). These redirects need to be done on a per-URL basis, individually from each mobile URL to the corresponding responsive URL.
- Remove any mobile-URL specific configuration your site might have, such as conditional redirects or a Vary HTTP header.
- As a good practice, set up rel=canonical on the responsive URLs pointing to themselves (self-referential canonicals).
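As an illustration of the redirect step: when the m-dot site mirrors the desktop URL paths one-to-one, the per-URL redirects can be expressed compactly on the m-dot host because each path maps to the same path on the responsive site. The following is a hypothetical Apache configuration; the domains and paths are placeholders:

```apache
# On m.example.com: 301-redirect every mobile URL to the
# corresponding responsive URL, preserving the path.
RewriteEngine On
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

If the mobile paths differ from the desktop paths, a rewrite map or an explicit list of per-URL rules is needed instead, since each mobile URL must land on its exact responsive equivalent.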
If you’re currently using dynamic serving and want to move to responsive design, you don’t need to add or change any redirects.
Some benefits for moving to responsive web design
Moving to a responsive site should make maintenance and reporting much easier for you down the road. Aside from no longer needing to manage separate URLs for all pages, it will also make it much easier to adopt practices and technologies such as hreflang for internationalization, AMP for speed, structured data for advanced search features and more.
As always, if you need more help you can ask a question in our webmaster forum.
Posted by Cherry Prommawin, Webmaster Relations
Introducing Our New International Webmaster Blogs!
Join us in welcoming the latest additions to the Webmasters community:
नमस्ते Webmasters in Hindi!
Добро Пожаловать Webmasters in Russian!
Hoşgeldiniz Webmasters in Turkish!
สวัสดีค่ะ Webmasters in Thai!
xin chào Webmasters in Vietnamese!
We will be sharing webmaster-related updates in our current and new blogs to make sure you have a place to follow the latest launches, updates and changes in Search in your languages! We will share links to relevant Help resources, educational content and events as they become available.
Just a reminder, here are some of the resources that we have available in multiple languages:
- Google.com/webmasters – documentation, support channels, tools (including a link to Search Console) and learning materials.
- Help Center – tips and tutorials on using Search Console, answers to frequently asked questions and step-by-step guides.
- Help forum – ask your questions and get advice from the Webmaster community
- YouTube Channel – recordings of our Hangouts on Air in different languages
- G+ community – another place we announce and share our Hangouts On Air
Testing tools:
- PageSpeed Insights – actionable insights on how to increase your site’s performance
- Mobile-Friendly Test – identify areas where you can improve your site’s performance on mobile devices
- Structured Data Testing Tool – preview and test your structured data markup
Some other valuable resources (English-only):
- Developer documentation on Search – a great resource where you can find feature guides, code labs, videos and links to more useful tools for webmasters.
If you have webmaster-specific questions, check our event calendar for the next hangout session or live event! Alternatively, you can post your questions to one of the local help forums, where our talented Product Experts from the TC program will try to answer your questions. Our Experts are product enthusiasts who have earned the distinction of “Top Contributor” or “Rising Star” by sharing their knowledge on the Google Help Forums.
If you have suggestions, please let us know in the comments below. We look forward to working with you in your language!
The new Search Console: a sneak peek at two experimental features
We have decided to embark on an extensive redesign of Search Console to better serve you, our users. Our hope is that this redesign will provide you with:
- More actionable insights – We will now group the identified issues by what we suspect is the common “root-cause” to help you find where you should fix your code. We organize these issues into tasks that have a state (similar to bug tracking systems) so you can easily see whether the issue is still open, whether Google has detected your fix, and track the progress of re-processing the affected pages.
- Better support of your organizational workflow – As we talked to many organizations, we’ve learned that multiple people are typically involved in implementing, diagnosing, and fixing issues. This is why we are introducing sharing functionality that allows you to pick up an action item and share it with other people in your group, like developers who will get references to the code in question.
- Faster feedback loops between you and Google – We’ve built a mechanism to allow you to iterate quickly on your fixes, and not waste time waiting for Google to recrawl your site, only to tell you later that it’s not fixed yet. Rather, we’ll provide on-the-spot testing of fixes and are automatically speeding up crawling once we see things are ok. Similarly, the testing tools will include code snippets and a search preview – so you can quickly see where your issues are, confirm you’ve fixed them, and see how the pages will look on Search.
In the next few weeks, we’re releasing two exciting BETA features from the new Search Console to a small set of users — Index Coverage report and AMP fixing flow.
The new Index Coverage report shows the count of indexed pages, information about why some pages could not be indexed, along with example pages and tips on how to fix indexing issues. It also enables a simple sitemap submission flow, and the capability to filter all Index Coverage data to any of the submitted sitemaps.
Here’s a peek at our new Index Coverage report:
The new AMP fixing flow
The new AMP fixing experience starts with the AMP Issues report. This report shows the current AMP issues affecting your site, grouped by the underlying error. Drill down into an issue to get more details, including sample affected pages. After you fix the underlying issue, click a button to verify your fix, and have Google recrawl the pages affected by that issue. Google will notify you of the progress of the recrawl, and will update the report as your fixes are validated.
As we start to experiment with these new features, some users will be introduced to the new redesign over the coming weeks.
Posted by John Mueller and the Search Console Team
Badges on Image Search help users find what they really want
When you want to bake cupcakes, but you don’t know what kind, Image Search can help you make a decision. Finding an image with a recipe can be challenging: you might end up on a page that has only pictures of these delicious things, or a cupcake fan si…
Connect to job seekers with Google Search
July 20, 2017 update: Starting today, impressions and clicks stats for job listing pages and job details pages are available in the Search Analytics report in Search Console. Read more about how Jobs impressions and clicks are counted in the help centre. If you have questions, head to the webmaster forums.
At Google I/O this year, we announced Google for Jobs, a new company-wide initiative focused on helping both job seekers and employers, through collaboration with the job matching industry. One major part of this effort is launching an improved experience for job seekers on Google Search. We’re happy to announce this new experience is now open for all developers and site owners.
For queries with clear intent like [head of catering jobs in nyc] or [entry level jobs in DC], we’ll show a job listings preview, and each job can expand to display comprehensive details about the listing:
For employers or site owners with job content, this feature brings many benefits:
- Prominent place in Search results: your postings are eligible to be displayed in the new job search feature on Google, featuring your logo, reviews, ratings, and job details.
- More, motivated applicants: job seekers can filter by various criteria like location or job title, meaning you’re more likely to get applicants who are looking exactly for that job.
- Increased chances of discovery and conversion: job seekers will have a new avenue to interact with your postings and click through to your site.
Get your job listings on Google
Implementation involves two steps:
- Mark up your job listings with Job Posting structured data.
- Submit a sitemap (or an RSS or Atom feed) with a <lastmod> date for each listing.
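As a rough sketch of step 1, the JSON-LD on a job listing page might look like the following. All values here are invented; see the Job Posting structured data documentation for the full list of required and recommended properties:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "JobPosting",
  "title": "Head of Catering",
  "datePosted": "2017-06-15",
  "validThrough": "2017-08-15T00:00",
  "employmentType": "FULL_TIME",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Restaurant Group"
  },
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "New York",
      "addressRegion": "NY",
      "addressCountry": "US"
    }
  },
  "description": "<p>Lead catering operations across three venues.</p>"
}
</script>
```

For step 2, the sitemap’s `<lastmod>` value for each listing URL lets Google prioritize recrawling postings that have changed, which matters for time-sensitive content like job openings.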
If you have more than 100,000 job postings or more than 10,000 changes per day, you can express interest in using the High Change Rate feature.
If you already publish your job openings on another site like LinkedIn, Monster, DirectEmployers, CareerBuilder, Glassdoor, and Facebook, they are eligible to appear in the feature as well.
Job search is an enriched search experience. We’ve created a dedicated guide to help you understand how Google ranking works for enriched search, along with practices for improving your presence.
Keep track of how you’re doing and fix issues
There’s a suite of tools to help you with the implementation:
- Validate your markup with the Structured Data Testing Tool
- Preview your listing in the Structured Data Testing Tool
- Keep track of your sitemap status in Search Console
- See aggregate stats and markup error examples in Search Console
In the coming weeks, we’ll add new job listings filters in the Search Analytics report in Search Console, so you can track clicks and impressions for your listings.
As always, if you have questions, ask in the forums or find us on Twitter!
Posted by Nick Zakrasek, Product Manager
Making the Internet safer and faster: Introducing reCAPTCHA Android API
When we launched reCAPTCHA ten years ago, we had a simple goal: enable users to visit the sites they love without worrying about spam and abuse. Over the years, reCAPTCHA has changed quite a bit. It evolved from the distorted text to street numbers and names, then No CAPTCHA reCAPTCHA in 2014 and Invisible reCAPTCHA in March this year.
By now, more than a billion users have benefited from reCAPTCHA and we continue to work to refine our protections.
reCAPTCHA protects users wherever they may be online. As the use of mobile devices has grown rapidly, it’s important to keep the mobile applications and data safe. Today, on reCAPTCHA’s tenth birthday, we’re glad to announce the first reCAPTCHA Android API as part of Google Play Services.
With this API, reCAPTCHA can better tell humans and bots apart to provide a streamlined user experience on mobile. It will use our newest Invisible reCAPTCHA technology, which runs risk analysis behind the scenes and has enabled millions of human users to pass through with zero clicks every day. Now mobile users can enjoy their apps without being interrupted, while still staying protected from spam and abuse.
reCAPTCHA Android API is included with Google SafetyNet, which provides services like device attestation and safe browsing to protect mobile apps. Mobile developers can do both the device and user attestations in the same API to mitigate security risks of their apps more efficiently. This adds to the diversity of security protections on Android: Google Play Protect to monitor for potentially harmful applications, device encryption, and regular security updates. Please visit our site to learn more about how to integrate with the reCAPTCHA Android API, and keep an eye out for our iOS library.
The journey of reCAPTCHA continues: we’ll make the Internet safer and easier to use for everyone (except bots).
Posted by Wei Liu, Product Manager, reCAPTCHA
Better Snippets for your Users
- The content of the page
- The meta description
- DMOZ listings
What makes a good meta description?
What are the most common problems with meta descriptions?
Is there a character limit for meta descriptions?
What will happen with the “NOODP” robots directive?
Can I prevent Google from using the page contents as snippet?
Posted by Gary, Search Team
A reminder about links in large-scale article campaigns
- Stuffing keyword-rich links to your site in your articles
- Having the articles published across many different sites; alternatively, having a large number of articles on a few large, different sites
- Using or hiring article writers that aren’t knowledgeable about the topics they’re writing on
- Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site (in which case use of rel=”canonical”, in addition to rel=”nofollow”, is advised)
How we fought webspam – Webspam Report 2016
With 2017 well underway, we wanted to take a moment and share some of the insights we gathered in 2016 in our fight against webspam. Over the past year, we continued to find new ways of keeping spam from creating a poor quality search experience, and worked with webmasters around the world to make the web better.
We do a lot behind the scenes to make sure that users can make full use of what today’s web has to offer, bringing relevant results to everyone around the globe, while fighting webspam that could potentially harm or simply annoy users.
Webspam trends in 2016
- Website security continues to be a major source of concern. Last year we saw more hacked sites than ever – a 32% increase compared to 2015. Because of this, we continued to invest in improving and creating more resources to help webmasters know what to do when their sites get hacked.
- We continued to see that sites are compromised not just to host webspam. We saw a lot of webmasters affected by social engineering, unwanted software, and unwanted ad injectors. We took a stronger stance in Safe Browsing to protect users from deceptive download buttons, made a strong effort to protect users from repeatedly dangerous sites, and we launched more detailed help text within the Search Console Security Issues Report.
- Since more people are searching on Google using a mobile device, we saw a significant increase in spam targeting mobile users. In particular, we saw a rise in spam that redirects users to other sites or pages without the webmaster’s knowledge, injected into webmasters’ pages via widgets or ad units from various advertising networks.
How we fought spam in 2016
- We continued to refine our algorithms to tackle webspam. We made multiple improvements to how we rank sites, including making Penguin (one of our core ranking algorithms) work in real-time.
- The spam that we didn’t identify algorithmically was handled manually. We sent over 9 million messages to webmasters to notify them of webspam issues on their sites. We also started providing more security notifications via Google Analytics.
- We performed algorithmic and manual quality checks to ensure that websites with structured data markup meet quality standards. We took manual action on more than 10,000 sites that did not meet the quality guidelines for inclusion in search features powered by structured data.
Working with users and webmasters for a better web
- In 2016 we received over 180,000 user-submitted spam reports from around the world. After carefully checking their validity, we considered 52% of those reported sites to be spam. Thanks to all who submitted reports and contributed towards a cleaner and safer web ecosystem!
- We conducted more than 170 online office hours and live events around the world to audiences totaling over 150,000 website owners, webmasters and digital marketers.
- We continued to provide support to website owners around the world through our Webmaster Help Forums in 15 languages. Through these forums we saw over 67,000 questions, with a majority of them being identified as having a Best Response by our community of Top contributors, Rising Stars and Googlers.
- We had 119 volunteer Webmaster Top Contributors and Rising Stars, whom we invited to join us at our local Top Contributor Meetups in 11 different locations across 4 continents (Asia, Europe, North America, South America).
We think everybody deserves high quality, spam-free search results. We hope that this report provides a glimpse of what we do to make that happen.
Posted by Michal Wicinski, Search Quality Strategist and Kiyotaka Tanaka, User Education & Outreach Specialist
How we fought webspam – Webspam Report 2016
With 2017 well underway, we wanted to take a moment and share some of the insights we gathered in 2016 in our fight against webspam. Over the past year, we continued to find new ways of keeping spam from creating a poor quality search experience, and worked with webmasters around the world to make the web better.
We do a lot behind the scenes to make sure that users can make full use of what today’s web has to offer, bringing relevant results to everyone around the globe, while fighting webspam that could potentially harm or simply annoy users.
Webspam trends in 2016
- Website security continues to be a major source of concern. Last year we saw more hacked sites than ever – a 32% increase compared to 2015. Because of this, we continued to invest in improving and creating more resources to help webmasters know what to do when their sites get hacked.
- We continued to see that sites are compromised not just to host webspam. We saw a lot of webmasters affected by social engineering, unwanted software, and unwanted ad injectors. We took a stronger stance in Safe Browsing to protect users from deceptive download buttons, made a strong effort to protect users from repeatedly dangerous sites, and we launched more detailed help text within the Search Console Security Issues Report.
- Since more people are searching on Google using a mobile device, we saw a significant increase in spam targeting mobile users. In particular, we saw a rise in redirect spam: code inserted into webmasters’ pages via widgets or ad units from various advertising networks that sends users to other sites or pages without the webmaster’s knowledge.
How we fought spam in 2016
- We continued to refine our algorithms to tackle webspam. We made multiple improvements to how we rank sites, including making Penguin (one of our core ranking algorithms) work in real-time.
- The spam that we didn’t identify algorithmically was handled manually. We sent over 9 million messages to webmasters to notify them of webspam issues on their sites. We also started providing more security notifications via Google Analytics.
- We performed algorithmic and manual quality checks to ensure that websites with structured data markup meet quality standards. We took manual action on more than 10,000 sites that did not meet the quality guidelines for inclusion in search features powered by structured data.
Working with users and webmasters for a better web
- In 2016 we received over 180,000 user-submitted spam reports from around the world. After carefully checking their validity, we considered 52% of those reported sites to be spam. Thanks to all who submitted reports and contributed towards a cleaner and safer web ecosystem!
- We conducted more than 170 online office hours and live events around the world to audiences totaling over 150,000 website owners, webmasters and digital marketers.
- We continued to provide support to website owners around the world through our Webmaster Help Forums in 15 languages. Through these forums we saw over 67,000 questions, the majority of which were marked as having a Best Response by our community of Top Contributors, Rising Stars and Googlers.
- We had 119 volunteer Webmaster Top Contributors and Rising Stars, whom we invited to join us at our local Top Contributor Meetups in 11 different locations across 4 continents (Asia, Europe, North America, South America).
We think everybody deserves high quality, spam-free search results. We hope that this report provides a glimpse of what we do to make that happen.
Posted by Michal Wicinski, Search Quality Strategist and Kiyotaka Tanaka, User Education & Outreach Specialist
Similar items: Rich products feature on Google Image Search
Image Search recently launched “Similar items” on mobile web and the Android Search app. The “Similar items” feature is designed to help users find products they love in photos that inspire them on Google Image Search. Using machine vision technology, the Similar items feature identifies products in lifestyle images and displays matching products to the user. Similar items supports handbags, sunglasses, and shoes and will cover other apparel and home & garden categories in the next few months.
The Similar items feature enables users to browse and shop inspirational fashion photography and find product info about items they’re interested in. Try it out by opening results from queries like [designer handbags].
Finding price and availability information was one of the top Image Search feature requests from our users. The Similar items carousel gets millions of impressions and clicks daily from all over the world.
To make your products eligible for Similar items, make sure to add and maintain schema.org product metadata on your pages. The schema.org/Product markup helps Google find product offerings on the web and give users an at-a-glance summary of product info.
To ensure that your products are eligible to appear in Similar items:
- Ensure that the product offerings on your pages have schema.org product markup, including an image reference. Products with name, image, price & currency, and availability metadata on their host page are eligible for Similar items.
- Test your pages with Google’s Structured Data Testing Tool to verify that the product markup is formatted correctly
- See your images on Image Search by issuing a query like [site:yourdomain.com]. For results with valid product markup, you may see product information appear once you tap on the images from your site. It can take up to a week for Googlebot to recrawl your website.
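As a rough sketch of the checklist above, the snippet below builds a schema.org/Product record as JSON-LD and checks it for the properties this post names as eligibility requirements (name, image, price & currency, availability). The product name, image URL, and prices are placeholders, and the small validator only covers the fields listed here, not Google’s full eligibility rules.

```python
import json

# Hypothetical product record; property names follow schema.org/Product.
# Price, currency, and availability live on a nested schema.org/Offer.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Leather Tote Bag",                    # placeholder product
    "image": "https://example.com/img/tote.jpg",   # placeholder image URL
    "offers": {
        "@type": "Offer",
        "price": "199.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Properties named in the eligibility list above.
REQUIRED = ["name", "image", "offers"]
REQUIRED_OFFER = ["price", "priceCurrency", "availability"]

def missing_fields(data):
    """Return the schema.org properties the markup is still missing."""
    missing = [f for f in REQUIRED if f not in data]
    offer = data.get("offers", {})
    missing += [f"offers.{f}" for f in REQUIRED_OFFER if f not in offer]
    return missing

# Serialize as the JSON-LD payload you would place in the page, and
# report any missing required properties.
print(json.dumps(product_jsonld, indent=2))
print("missing:", missing_fields(product_jsonld))
```

A check like this can run in a publishing pipeline before pages go live; Google’s Structured Data Testing Tool remains the authoritative validator for the markup itself.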
Right now, Similar items is available on mobile browsers and the Android Google Search App globally, and we plan to expand to more platforms in 2017.
If you have questions, find us in the dedicated Structured data section of our forum, on Twitter, or on Google+. To prevent your images from showing in Similar items, webmasters can opt out of Google Image Search.
We’re excited to help users find your products on the web by showcasing buyable items. Thanks for partnering with us to make the web more shoppable!
Posted by Julia E, Product Manager on Image Search