New reports for Special Announcements in Search Console
Last month we introduced a new way for sites to highlight COVID-19 announcements on Google Search. At first, we’re using this information to highlight announcements in Google Search from health and government agency sites, to cover important updates like school closures or stay-at-home directives.
Today we are announcing support for SpecialAnnouncement in Google Search Console, including new reports to help you find any issues with your implementation and monitor how this rich result type is performing. In addition, the Rich Results Test now supports the markup, so you can review your existing URLs or debug your markup code before moving it to production.
Special Announcements Enhancement report
A new report is now available in Search Console for sites that have implemented SpecialAnnouncement structured data. The report allows you to see errors, warnings, and valid pages for markup implemented on your site.
In addition, if you fix an issue, you can use the report to validate it, which will trigger a process where Google recrawls your affected pages. Learn more about the Rich result status reports.
Image: Special Announcements Enhancement report
Special Announcements appearance in Performance report
The Search Console Performance report now also allows you to see how your SpecialAnnouncement marked-up pages perform on Google Search. This means you can check the impressions, clicks, and CTR of your special announcement pages to understand how they are trending across any of the available dimensions. Learn more about the Search appearance tab in the Performance report.
Image: Special Announcements appearance in Performance report
Special Announcements in Rich Results Test
After adding SpecialAnnouncement structured data to your pages, you can test them using the Rich Results Test tool. If you haven’t published the markup on your site yet, you can also upload a piece of code to check the markup. The test shows any errors or suggestions for your structured data.
Image: Special Announcements in Rich Results Test
These new tools should make it easier to understand how your marked-up SpecialAnnouncement pages perform on Search and to identify and fix issues.
If you have any questions, check out the Google Webmasters community.
Posted by Daniel Waisberg, Search Advocate & Moshe Samet, Search Console PM.
Showcasing the value of SEO
Each year we attend dozens of events and reach thousands of people with our keynotes, talks, and Q&As. We go to conferences and meetups because we believe our talks can help online businesses flourish; we get to help people with their search-related problems, and sometimes we also get to listen to their success stories. It's really uplifting to hear that, by following our advice, they achieved something great!
We want people to hear about these success stories, so we're starting a new blog post series featuring case studies. For example, they may help you convince your boss's boss that investing in SEO or implementing structured data can be good for the business.
In this first blog post we’re going to start with the overall basics of investing in Search Engine Optimization (SEO), and how investing in it helped a company.
We hope you’ll find this blog post useful. If you’re interested in contributing a case study, submit a talk proposal when signing up for a Webmaster Conference near you and we will consider featuring it. For more case studies and help content, head over to our developer site, help center, or YouTube channel. If you want to get in touch with us, find us on Twitter.
Posted by Alice Kim and The Gary
Moon Tae Sung is an SEO Manager at Saramin, one of the largest job platforms in Korea. We had the opportunity to ask him a few questions about the effects of his team’s work on Google Search after a presentation he did at a Webmaster Conference in Seoul.
Saramin offers job posting recommendations, company and salary information, AI-based interviews, and AI-based headhunting services. According to Tae Sung, “people come to the Saramin site not only to look for jobs and submit applications, but to also gain a variety of information related to job searches and receive high-quality AI-based services for interview preparation.”
Saramin’s SEO process started with Google Search Console. In 2015 they verified the site in the tool and spent a year identifying and fixing crawling issues. “The task was simple, but still resulted in a 15% increase in organic traffic,” Tae Sung said. The ROI prompted Saramin to invest more in SEO with the aim of even greater success. But first they needed to learn more about what else makes a site search engine friendly so they could better look for help resources. “We studied the Google Search developer’s guide and Help Center articles. These resources continue to provide up-to-date information for issues that we run into,” he told us.
SEO is a process that may take time to bear fruit, so they “started following the SEO guidelines more closely and implemented more changes. The goal was to make changes to the site so that Google Search would better understand it”, Tae Sung shared. They removed meta tags that were cluttered with unnecessary and unhelpful keywords, they used rel-canonical and removed duplicate content, and they explored the search gallery and applied applicable structured data, starting with Job Posting, Breadcrumb, and Estimated salary.
In addition, they used the various Google tools offered as they worked on improving their site. “Errors on our structured data are dealt with by checking URLs on the Structured Data Testing Tool. Other tools like the Mobile-Friendly Test, AMP Test, and PageSpeed Insights provide us with valuable insights for making improvements and help us offer a better experience for our users,” said Tae Sung.
Over time, Saramin saw the red-colored errors on Search Console’s Index Coverage report gradually turning valid green, and they knew they were headed in the right direction. The incremental changes reached a tipping point and the traffic continued to rise at a more remarkable speed. In the peak hiring season of September 2019, traffic doubled compared to the previous year.
“We are very happy about the traffic increase, but what’s even more exciting is that it was accompanied by an improvement in the quality of the traffic. We saw a 93% increase in the number of new sign-ups and a 9% increase in conversions. We believe this means our users found Saramin’s optimization work delightful,” said Tae Sung.
Saramin continues to invest in achieving their SEO goals. They’re trying to enhance their users’ experience by implementing more technologies and features from Google, and Tae Sung is enthusiastic about their work ahead: “This is only the beginning of our story.”
Looking back at last year’s Webmaster Conference Product Summit
As a part of the Webmaster Conference series, last fall we held a Product Summit at Google’s headquarters in Mountain View, California. It was slightly different from our previous events, with a number of product managers and engineers from Google Search taking part. We recorded the talks held there, and are happy to be able to make these available to all of you now.
In the playlist you’ll find:
- Web deduplication – How does Google recognize duplicate content across the web? What happens once a duplicate is found? How is a canonical URL selected? How does localization play a role?
- Google Images best practices – Take a look at how Google Images has evolved over the years, and learn about some of the best practices that you can implement on your site when it comes to images.
- Rendering – Find out more about rendering, and what it takes to do rendering of the web at scale. Take a look behind the scenes, and learn about some things a site owner could watch out for with regards to rendering.
- Titles, snippets, and result previews – What’s the goal of titles, snippets, and previews in Search? How do Google’s systems pick and generate a preview for a page? What are some of the elements that help users decide which page to click in Search?
- Googlebot & web hosting – Starting with a look at the popularity of different web servers, and the growth of HTTPS, you’ll find out more about how Google’s crawling for Search works, and what you can do to control it.
- Claim your Knowledge Panel – Knowledge Panels are a great way for people and organizations to be visible in Search. Find out more about the ways you can claim and update them for yourself or for your business.
- Improving Search over the years – Are dogs the same as cats? Should pages about New York be shown when searching for York? How could algorithms ever figure this out? How many 😊’s does it take to get Google’s attention? Google’s Paul Haahr takes you on a tour of some changes in Search.
We hope you find these videos insightful, useful, and a bit entertaining! And if you are not subscribed to the Webmasters YouTube channel, here’s your chance!
Posted by John Mueller, Search Advocate, Google Switzerland
Introducing a new way for sites to highlight COVID-19 announcements on Google Search
Due to the COVID-19 outbreak, many organizations and groups are publishing important coronavirus-related announcements that affect our everyday lives.
In response, we’re introducing a new way for these special announcements to be highlighted on Google Search. Sites can add SpecialAnnouncement structured data to their web pages or submit a COVID-19 announcement in Search Console.
At first, we’re using this information to highlight announcements in Google Search from health and government agency sites, to cover important updates like school closures or stay-at-home directives.
We are actively developing this feature, and we hope to expand it to include more sites. While we might not immediately show announcements from other types of sites, seeing the markup will help us better understand how to expand this feature.
Please note: beyond special announcements, there are a range of other options that sites can use to highlight information such as canceled events or changes to business hours. You can learn more about these at the end of this post.
How COVID-19 announcements appear in Search
When SpecialAnnouncement structured data is added to a page, that content can be eligible to appear with a COVID-19 announcement rich result, in addition to the page’s regular snippet description. A COVID-19 announcement rich result can contain a short summary that can be expanded to view more details. Please note that the format may change over time, and you may not see results in Google Search right away.
How to implement your COVID-19 announcements
There are two ways that you can implement your COVID-19 announcements.
RECOMMENDED: Add structured data to your web page
Structured data is a standardized format for providing information about a page and classifying the page content. We recommend using this method because it is the easiest way for us to take in this information, it enables reporting through Search Console in the future, and enables you to make updates. Learn how to add structured data to COVID-19 announcements.
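As a rough sketch of what that structured data could look like, the snippet below generates SpecialAnnouncement JSON-LD server-side. The name, text, dates, and location are placeholders; the property names follow schema.org's SpecialAnnouncement type, and Google's guidance ties COVID-19 announcements to the Wikidata COVID-19 entity used as the category. Check the official documentation for the full list of required and recommended properties.

```python
import json

# Build a SpecialAnnouncement JSON-LD payload (illustrative values only).
announcement = {
    "@context": "https://schema.org",
    "@type": "SpecialAnnouncement",
    "name": "Stay-at-home directive for Example County",
    "text": "All residents are directed to stay at home except for essential activities.",
    "datePosted": "2020-03-30T08:00:00-07:00",
    "expires": "2020-04-30T23:59:59-07:00",
    # COVID-19 announcements use this Wikidata entity as the category.
    "category": "https://www.wikidata.org/wiki/Q81068910",
    "announcementLocation": {
        "@type": "AdministrativeArea",
        "name": "Example County",
    },
}

# Wrap in the script tag you would embed in the page's HTML.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(announcement, indent=2)
    + "\n</script>"
)
print(snippet)
```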
ALTERNATIVE: Submit announcements in Search Console
If you don’t have the technical ability or support to implement structured data, you can submit a COVID-19 announcement in Search Console. This tool is still in beta testing, and you may see changes.
This method is not preferred and is intended only as a short-term solution. With structured data, your announcement highlights can automatically update when your pages change. With the tool, you’ll have to manually update announcements. Also, announcements made this way cannot be monitored through special reporting that will be made available through Search Console in the future.
If you do need to submit this way, you’ll need to first be verified in Search Console. Then you can submit a COVID-19 announcement.
More COVID-19 resources for sites from Google Search
Beyond special announcements markup, there are other ways you can highlight other types of activities that may be impacted because of COVID-19:
- Best practices for health and government sites: If you are a representative of a health or government website, and you have important information about coronavirus for the general public, here are some recommendations for how to make this information more visible on Google Search.
- Surface your common FAQs: If your site has common FAQs, adding FAQ markup can help Google Search surface your answers.
- Pausing your business online: See our blog post on how to pause your business online in a way that minimizes impacts with Google Search.
- Business hours & temporary closures: Review the guidance from Google My Business on how to change your business hours or indicate temporary closures or how to create COVID-19 posts.
- Events: If you hold events, look over the new properties for marking them virtual, postponed, or canceled.
- Knowledge Panels: Understand how to recommend changes to your Google knowledge panel (or how to claim it, if you haven’t already).
- Fix an overloaded server: Learn how to determine a server’s bottleneck, quickly fix the bottleneck, improve server performance, and prevent regressions.
If you have any questions or comments, please let us know on Twitter.
Posted by Lizzi Harvey, Technical Writer, Search Relations, and Danny Sullivan, Public Liaison for Search
Helping health organizations make COVID-19 information more accessible
Health organizations are busier than ever providing information to help with the COVID-19 pandemic. To better assist them, Google has created a best practices article to guide health organizations to make COVID-19 information more accessible on Search. We’ve also created a new technical support group for eligible health organizations.
Best practices for search visibility
By default, Google tries to show the most relevant, authoritative information in response to any search. This process is more effective when content owners help Google understand their content in appropriate ways.
To better guide health-related organizations in this process (known as SEO, for “search engine optimization”), we have produced a new help center article with some important best practices, with emphasis on health information sites, including:
- How to help users access your content on the go
- The importance of good page content and titles
- Ways to check how your site appears for coronavirus-related queries
- How to analyze the top coronavirus-related user queries
- How to add structured data for FAQ content
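The FAQ structured data mentioned above can be sketched as a small JSON-LD payload. This is a minimal illustration: the questions and answers are placeholders, and the property names follow schema.org's FAQPage type; check Google's FAQ documentation for the full eligibility guidelines.

```python
import json

# A minimal FAQPage JSON-LD object (illustrative content only;
# each question/answer pair is a schema.org Question with an acceptedAnswer).
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What are the symptoms of COVID-19?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Common symptoms include fever, cough, and shortness of breath.",
            },
        },
        {
            "@type": "Question",
            "name": "How can I protect myself?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Wash your hands frequently and follow local health guidance.",
            },
        },
    ],
}

# This JSON would be embedded in a <script type="application/ld+json"> tag.
print(json.dumps(faq, indent=2))
```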
New support group for health organizations
In addition to our best practices help page, health organizations can take part in our new technical support group that’s focused on helping health organizations who publish COVID-19 information with Search related questions.
We’ll be approving requests for access on a case-by-case basis. At first, we’ll accept only domains under national health ministries and US state-level agencies. We’ll announce future expansions in this blog post and on our Twitter account. You’ll need to register using either an email address under those domains (e.g. name@health.gov) or have access to the website’s Search Console account.
Fill out this form to request access to the COVID-19 Google Search group.
The group was created to respond to the current needs of health organizations, and we intend to deprecate it once COVID-19 is no longer considered a Public Health Emergency by the WHO, or a similar de-escalation is widely in place.
Everyone is welcome to use our existing webmaster help forum, and if you have any questions or comments, please let us know on Twitter.
Posted by Daniel Waisberg, Search Advocate & Ofir Roval, Search Console Lead PM
How to pause your business online in Google Search
As the effects of the coronavirus grow, we’ve seen businesses around the world looking for ways to pause their activities online. With a view to coming back and being present for your customers later, here’s an overview of our recommendations for how to pause your online business while minimizing the impact on your presence in Google Search. These recommendations apply to any business with an online presence, but particularly to those who have paused the selling of their products or services online. For more detailed information, also check our developer documentation.
Recommended: limit site functionality
If your situation is temporary and you plan to reopen your online business, we recommend keeping your site online and limiting its functionality. For example, you might mark items as out of stock, or restrict the cart and checkout process. This is the recommended approach since it minimizes any negative effects on your site’s presence in Search. People can still find your products, read reviews, or add items to wishlists so they can purchase at a later time.
It’s also a good practice to:
- Disable the cart functionality: Disabling the cart functionality is the simplest approach, and doesn’t change anything for your site’s visibility in Search.
- Tell your customers what’s going on: Display a banner or popup div with appropriate information for your users, so that they’re aware of the business’s status. Mention any known and unusual delays, shipping times, pick-up or delivery options, etc. upfront, so that users continue with the right expectations. Make sure to follow our guidelines on popups and banners.
- Update your structured data: If your site uses structured data (such as Products, Books, Events), make sure to adjust it appropriately (reflecting the current product availability, or changing events to cancelled). If your business has a physical storefront, update Local Business structured data to reflect current opening hours.
- Check your Merchant Center feed: If you use Merchant Center, follow the best practices for the availability attribute.
- Tell Google about your updates: To ask Google to recrawl a limited number of pages (for example, the homepage), use Search Console. For a larger number of pages (for example, all of your product pages), use sitemaps.
For more information, check our developers documentation.
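As a sketch of the structured data adjustments above, the snippet below marks a product as out of stock and an event as cancelled. The names, prices, and dates are placeholders; the `availability` and `eventStatus` values are schema.org enumeration URLs.

```python
import json

# Mark a product as out of stock (schema.org ItemAvailability enumeration).
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        # Switched from https://schema.org/InStock while sales are paused.
        "availability": "https://schema.org/OutOfStock",
    },
}

# Mark an event as cancelled (schema.org EventStatusType enumeration).
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example Spring Fair",
    "startDate": "2020-04-18T10:00:00",
    "eventStatus": "https://schema.org/EventCancelled",
    "location": {"@type": "Place", "name": "Example Hall"},
}

print(json.dumps([product, event], indent=2))
```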
Not recommended: disabling the whole website
As a last resort, you may decide to disable the whole website. This is an extreme measure that should only be taken for a very short period of time (a few days at most), as it will otherwise have significant effects on the website in Search, even when implemented properly. That’s why it’s highly recommended to only limit your site’s functionality instead. Keep in mind that your customers may also want to find information about your products, your services, and your company, even if you’re not selling anything right now.
If you decide that you need to do this (again, which we don’t recommend), here are some options:
- If you need to urgently disable the site for 1-2 days, then return an informational error page with a 503 HTTP result code instead of all content. Make sure to follow the best practices for disabling a site.
- If you need to disable the site for a longer time, then provide an indexable homepage as a placeholder for users to find in Search by using the 200 HTTP status code.
- If you quickly need to hide your site in Search while you consider the options, you can temporarily remove it from Search.
For more information, check our developers documentation.
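The first option above (a short informational page served with a 503 status) can be sketched as a minimal WSGI app. The page text and the Retry-After value are placeholders; a real deployment would serve this from your web server or CDN configuration instead.

```python
# Minimal WSGI app that serves an informational page with a 503 status,
# signalling to crawlers that the outage is temporary (placeholder values).
MESSAGE = b"<html><body><h1>We are temporarily closed.</h1></body></html>"

def app(environ, start_response):
    start_response(
        "503 Service Unavailable",
        [
            ("Content-Type", "text/html; charset=utf-8"),
            # Hint at when crawlers should come back (seconds).
            ("Retry-After", "86400"),
            ("Content-Length", str(len(MESSAGE))),
        ],
    )
    return [MESSAGE]

# To run locally with the stdlib reference server:
# from wsgiref.simple_server import make_server
# make_server("", 8000, app).serve_forever()
```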
Proceed with caution: To elaborate why we don’t recommend disabling the whole website, here are some of the side effects:
- Your customers won’t know what’s happening with your business if they can’t find your business online at all.
- Your customers can’t find or read first-hand information about your business and its products & services. For example, reviews, specs, past orders, repair guides, or manuals won’t be findable. Third-party information may not be as correct or comprehensive as what you can provide. This often also affects future purchase decisions.
- Knowledge Panels may lose information, like contact phone numbers and your site’s logo.
- Search Console verification will fail, and you will lose all access to information about your business in Search. Aggregate reports in Search Console will lose data as pages are dropped from the index.
- Ramping back up after a prolonged period of time will be significantly harder if your website needs to be reindexed first. Additionally, it’s uncertain how long this would take, and whether the site would appear similarly in Search afterwards.
Other things to consider
Beyond the operation of your web site, there are other actions you might want to take to pause your online business in Google Search:
- If you hold events, look over the new properties for making them virtual, postponed or canceled.
- Review the guidance from Google My Business on how to change your business hours or indicate temporary closures.
- Review the resources from Google for Small Business on how to communicate with customers and employees, for working remotely and modifying advertising campaigns.
- Understand how to recommend changes to your Google knowledge panel (or how to claim it, if you haven’t already).
Also be sure to keep up with the latest by following updates on Twitter from Google Webmasters at @GoogleWMC and Google My Business at @GoogleMyBiz.
FAQs
What if I only close the site for a few weeks?
Completely closing a site even for just a few weeks can have negative consequences on Google’s indexing of your site. We recommend limiting the site functionality instead. Keep in mind that users may also want to find information about your products, your services, and your company, even if you’re currently not selling anything.
What if I want to exclude all non-essential products?
That’s fine. Make sure that people can’t buy the non-essential products by limiting the site functionality.
Can I ask Google to crawl less during this time?
Yes, you can limit crawling with Search Console, though it’s not recommended for most cases. This may have some impact on the freshness of your results in Search. For example, it may take longer for Search to reflect that all of your products are currently not available. On the other hand, if Googlebot’s crawling causes critical server resource issues, this is a valid approach. We recommend setting a reminder for yourself to reset the crawl rate once you start planning to go back in business.
How do I get a page indexed or updated quickly?
To ask Google to recrawl a limited number of pages (for example, the homepage), use Search Console. For a larger number of pages (for example, all of your product pages), use sitemaps.
What if I block a specific region from accessing my site?
Google generally crawls from the US, so if you block the US, Google Search generally won’t be able to access your site at all. We don’t recommend temporarily blocking an entire region from accessing your site; instead, limit your site’s functionality for that region.
Should I use the Removals Tool to remove out-of-stock products?
No. People won’t be able to find first-hand information about your products on Search, and there might still be third-party information for the product that may be incorrect or incomplete. It’s better to still allow that page, and mark it out of stock. That way people can still understand what’s going on, even if they can’t purchase the item. If you remove the product from Search, people don’t know why it’s not there.
——–
We realize that any business closure is a big and stressful step, and not everyone will know what to do. If you notice afterwards that you could have done something differently, everything’s not lost: we try to make our systems robust so that your site will be back in Search as quickly as possible. Like you, we’re hoping that this crisis finds an end as soon as possible. We hope that with this information, you’re able to have your online business up & running quickly when that time comes. Should you run into any problems or questions along the way, please don’t hesitate to use our public channels to get help.
Posted by John Mueller, working from home in Zurich, Switzerland
New properties for virtual, postponed, and canceled events
With the current status of COVID-19 around the world, many events are being canceled, postponed, or moved to an online-only format. Google wants to show users the latest, most accurate information about your events in this fast-changing environment.
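As a sketch of these new properties, the snippet below marks an event as moved to an online-only format. The name, date, and URL are placeholders; `eventStatus` and `eventAttendanceMode` use schema.org enumeration URLs, and the online venue is given as a `VirtualLocation`.

```python
import json

# An event moved to online-only (illustrative values; enumeration URLs
# follow schema.org's EventStatusType and EventAttendanceModeEnumeration).
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example Developer Summit",
    "startDate": "2020-05-12T09:00:00",
    "eventStatus": "https://schema.org/EventMovedOnline",
    "eventAttendanceMode": "https://schema.org/OnlineEventAttendanceMode",
    "location": {
        "@type": "VirtualLocation",
        "url": "https://example.com/summit/livestream",
    },
    # For a postponed event you would instead set
    # "eventStatus": "https://schema.org/EventPostponed" and keep the
    # original startDate until a new date is known.
}

print(json.dumps(event, indent=2))
```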
Coronavirus (COVID-19) and Webmaster Conference
Last year we organized Webmaster Conference events in over 15 countries. The spirit of WMConf is reflected in that number: we want to reach regions that otherwise don’t get much search conference love. Recently we have been monitoring the coronavirus (COVID-19) situation and its impact on planning this year’s events.
To that end, with the growing concern around coronavirus and in line with the travel guidelines published by the WHO, the CDC, and other organizations, we are postponing all Webmaster Conference events globally. While we hope to organize events later this year, we’re also exploring other ways to reach our audiences.
We’re very sorry to delay the opportunity to connect in person, but we feel strongly that the safety and health of all attendees is the priority at this time. If you want to get notified about future events in your region, you can sign up to receive updates on the Webmaster Conference site. If you have questions or comments, catch us on Twitter!
Posted by Cherry Prommawin and Gary
Announcing mobile-first indexing for the whole web
It’s been a few years since Google started working on mobile-first indexing, which is Google’s crawling of the web using a smartphone Googlebot. From our analysis, most sites shown in search results are ready for mobile-first indexing, and 70% of those shown in our search results have already shifted over. To simplify, we’ll be switching to mobile-first indexing for all websites starting September 2020. In the meantime, we’ll continue moving sites to mobile-first indexing when our systems recognize that they’re ready.
When we switch a domain to mobile-first indexing, it will see an increase in Googlebot’s crawling, while we update our index to your site’s mobile version. Depending on the domain, this change can take some time. Afterwards, we’ll still occasionally crawl with the traditional desktop Googlebot, but most crawling for Search will be done with our mobile smartphone user-agent. The exact user-agent name used will match the Chromium version used for rendering.
In Search Console, there are multiple ways to check for mobile-first indexing. The status is shown on the settings page, as well as in the URL Inspection Tool, when checking a specific URL with regards to its most recent crawling.
Our guidance on making all websites work well for mobile-first indexing continues to be relevant, for new and existing sites. In particular, we recommend making sure that the content shown is the same (including text, images, videos, and links), and that metadata (titles and descriptions, robots meta tags) and all structured data are the same. It’s good to double-check these when a website is launched or significantly redesigned. In the URL testing tools, you can easily check both desktop and mobile versions directly. If you use other tools to analyze your website, such as crawlers or monitoring tools, use a mobile user-agent if you want to match what Google Search sees.
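To approximate what the smartphone crawler sees with your own tooling, you can send a mobile user-agent. The string below follows the general pattern of Googlebot's smartphone user-agent, but the Chrome version token changes over time, so treat it as a sketch and check the current crawler documentation.

```python
import urllib.request

# A mobile user-agent in the style of Googlebot's smartphone crawler
# (the Chrome version token is a placeholder and changes over time).
MOBILE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.92 "
    "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

def mobile_request(url):
    """Build a request that identifies itself with a mobile user-agent."""
    return urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})

req = mobile_request("https://example.com/")
print(req.get_header("User-agent"))
```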
While we continue to support various ways of making mobile websites, we recommend responsive web design for new websites. We suggest not using separate mobile URLs (often called “m-dot”) because of issues and confusion we’ve seen over the years, both from search engines and users.
Mobile-first indexing has come a long way. It’s great to see how the web has evolved from desktop to mobile, and how webmasters have helped to allow crawling & indexing to match how users interact with the web! We appreciate all your work over the years, which has helped to make this transition fairly smooth. We’ll continue to monitor and evaluate these changes carefully. If you have any questions, please drop by our Webmaster forums or our public events.
Posted by John Mueller, Developer Advocate, Google Zurich
Best Practices for News coverage with Search
Having up-to-date information during large, public events is critical, as the landscape changes by the minute. This guide highlights some tools that news publishers can use to create a data rich and engaging experience for their users.
Add Article structured data to AMP pages
Adding Article structured data to your news, blog, and sports article AMP pages can make the content eligible for an enhanced appearance in Google Search results. Enhanced features may include placement in the Top stories carousel, host carousel, and Visual stories. Learn how to mark up your article.
You can now test and validate your AMP article markup in the Rich Results Test tool. Enter your page’s URL or a code snippet, and the Rich Results Test shows the AMP Articles that were found on the page (as well as other rich result types), and any errors or suggestions for your AMP Articles. You can also save the test history and share the test results.
We also recommend that you provide a publication date so that Google can expose this information in Search results, if this information is considered to be useful to the user.
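As a sketch, Article structured data for a news page (including the publication date recommended above) might be generated like this. The headline, URLs, names, and dates are placeholders; see the Article documentation for the full set of required and recommended fields.

```python
import json

# NewsArticle JSON-LD for an AMP news page (illustrative values only).
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example City Marathon Results",
    "image": ["https://example.com/photos/16x9/marathon.jpg"],
    # Publication date, so Search can surface it when useful to the user.
    "datePublished": "2020-04-05T08:00:00+00:00",
    "dateModified": "2020-04-05T09:20:00+00:00",
    "author": {"@type": "Person", "name": "Jane Reporter"},
    "publisher": {
        "@type": "Organization",
        "name": "Example News",
        "logo": {"@type": "ImageObject", "url": "https://example.com/logo.png"},
    },
}

print(json.dumps(article, indent=2))
```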
Mark up your live-streaming video content
If you are live-streaming a video during an event, you can be eligible for a LIVE badge by marking your video with BroadcastEvent. We strongly recommend that you use the Indexing API to ensure that your live-streaming video content gets crawled and indexed in a timely way. The Indexing API allows any site owner to directly notify Google when certain types of pages are added or removed. This allows Google to schedule pages for a fresh crawl, which can lead to more relevant user traffic as your content is updated. For websites with many short-lived pages like livestream videos, the Indexing API keeps content fresh in search results. Learn how to get started with the Indexing API.
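A sketch of the notification body the Indexing API expects is shown below. The endpoint is the documented v3 publish URL; authentication (an OAuth 2.0 token for a service account that owns the property) is omitted here, and the page URL is a placeholder.

```python
import json

# Documented publish endpoint for the Indexing API (v3).
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url, deleted=False):
    """Build the JSON body for a URL_UPDATED or URL_DELETED notification."""
    return {
        "url": url,
        "type": "URL_DELETED" if deleted else "URL_UPDATED",
    }

body = build_notification("https://example.com/live/election-night")
print(json.dumps(body))
# In a real integration, POST this body to ENDPOINT with an OAuth 2.0
# bearer token; repeat with deleted=True when the livestream page expires.
```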
For AMP pages: Update the cache and use components
Use the following to ensure your AMP content is published and up-to-date the moment news breaks.
Update the cache
When people click an AMP page, the Google AMP Cache automatically requests updates to serve fresh content for the next person once the content has been cached. However, if you want to force an update to the cache in response to a change in the content on the origin domain, you can send an update request to the Google AMP Cache. This is useful if your pages are changing in response to a live news event.
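The shape of such an update request can be sketched as follows. This is a simplified illustration: the subdomain encoding handles only the common case (the real scheme also covers IDNs and overly long domains), and a real update-cache request must additionally carry `amp_action`, `amp_ts`, and an RSA signature, per the AMP cache documentation.

```python
def amp_cache_subdomain(domain):
    """Approximate the AMP cache's encoded subdomain for a domain.

    Simplified sketch: hyphens double, dots become hyphens. The real
    encoding also handles IDNs and long domains.
    """
    return domain.replace("-", "--").replace(".", "-")

def update_cache_url(domain, path):
    """Build the base of an update-cache request (unsigned sketch)."""
    return "https://{}.cdn.ampproject.org/update-cache/c/s/{}{}".format(
        amp_cache_subdomain(domain), domain, path
    )

print(update_cache_url("example.com", "/article.amp.html"))
```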
Use news-related AMP components
- <amp-live-list>: Add live content to your article and have it updated based on a source document. This is a great choice if you just want content to reload easily, without having to set up or configure additional services on your backend. Learn how to implement <amp-live-list>.
- <amp-script>: Run your own JavaScript inside of AMP pages. This flexibility means that anything you are publishing on your desktop or non-AMP mobile pages, you can bring over to AMP. <amp-script> supports WebSockets, interactive SVGs, and more, which allows you to create engaging news pages such as election coverage maps, live graphs, and polls. As a newer feature, the AMP team is actively soliciting feedback on it; if it doesn’t work for your use case, let us know.
If you have any questions, let us know through the forum or on Twitter.
Posted by Patrick Kettner and Naina Raisinghani, AMP team
More & better data export in Search Console
We have heard users ask for better download capabilities in Search Console loud and clear – so we’re happy to let you know that more and better data is available to export.
You’ll now be able to download the complete information you see in almost all Search Console reports (instead of just specific table views). This should make the data much easier to read outside Search Console and to store for future reference. You’ll find a section at the end of this post describing other ways to use your Search Console data outside the tool.
Enhancement reports and more
When exporting data from a report, for example AMP status, you’ll now be able to export the data behind the charts, not just the details table (as before). This means that in addition to the list of issues and their affected pages, you’ll also see a daily breakdown of your pages, their status, and the impressions they received on Google Search results. If you are exporting data from a specific drill-down view, the exported file includes the details describing that view.
If you choose Google Sheets or Excel (new!), you’ll get a spreadsheet with two tabs; if you choose to download as CSV, you’ll get a ZIP file with two CSV files.
Here is a sample dataset downloaded from the AMP status report. We changed the titles of the spreadsheet to be descriptive for this post, but the original title includes the domain name, the report, and the date of the export.
Performance report
When it comes to Performance data, we have two improvements:
- You can now download the content of all tabs with one click. This means you’ll get the data on Queries, Pages, Countries, Devices, Search appearances, and Dates all together. The download output is the same as explained above: a Google Sheets or Excel spreadsheet with multiple tabs, or CSV files compressed in a ZIP file.
- Along with the performance data, you’ll have an extra tab (or CSV file) called “Filters”, which shows which filters were applied when the data was exported.
Here is a sample dataset downloaded from the Performance report.
Additional ways to use Search Console data outside the tool
Since we’re talking about exporting data, we thought we’d take the opportunity to talk about other ways you can currently use Search Console data outside the tool. You might want to do this if you have a specific use case that is important to your company, such as joining the data with another dataset, performing an advanced analysis, or visualizing the data in a different way.
There are two options, depending on the data you want and your technical level.
Search Console API
If you have a technical background, or a developer in your company can help you, you might consider using the Search Console API to view, add, or remove properties and sitemaps, and to run advanced queries for Google Search results data.
We have plenty of documentation on the subject, but here are some links that might be useful to you if you’re starting now:
- The Overview and prerequisites guide walks you through the things you should do before writing your first client application. You’ll also find more advanced guides in the sidebar of this section, for example a guide on how to query all your search data.
- The reference section provides details on query parameters, usage limits, and errors returned by the API.
- The API samples page provides links to sample code for several programming languages, a great way to get up and running.
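To make the query-all-your-search-data guide concrete, here is a hedged Python sketch of the request body the Search Analytics query method accepts. The site URL and dates are placeholders, and the commented-out call assumes the google-api-python-client library.

```python
import json

def search_analytics_query(start_date, end_date, dimensions,
                           row_limit=25000, start_row=0):
    """Request body accepted by the Search Analytics query method.

    Page through results by advancing start_row in row_limit steps;
    the API returns fewer rows than row_limit on the last page.
    """
    return {
        "startDate": start_date,   # YYYY-MM-DD
        "endDate": end_date,
        "dimensions": dimensions,  # e.g. ["query", "page", "country", "device"]
        "rowLimit": row_limit,
        "startRow": start_row,
    }

body = search_analytics_query("2020-01-01", "2020-01-31", ["query", "page"])
print(json.dumps(body, indent=2))
# With google-api-python-client, the call is roughly:
#   service.searchanalytics().query(
#       siteUrl="https://example.com/", body=body).execute()
```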
Google Data Studio
Google Data Studio is a dashboarding solution that helps you unify data from different data sources, explore it, and tell impactful data stories. The tool provides a Search Console connector to import various metrics and dimensions into your dashboard. This can be valuable if you’d like to see Search Console data side-by-side with data from other tools.
If you’d like to give it a try, you can use this template to visualize your data – click “use template” at the top right corner of the page to connect to your data. To learn more about how to use the report and which insights you might find in it, check this step-by-step guide. If you just want to play with it, here’s a report based on that template with sample data.
Let us know on Twitter if you have interesting use cases or comments about the new download capabilities, or about using Search Console data in general. And enjoy the enhanced data!
Posted by Sion Schori & Daniel Waisberg, Search Console team
How to showcase your events on Google Search
It’s officially 2020 and people are starting to make plans for the year ahead. If you produce any type of event, you can help people discover your events with the event search experience on Google.
Hosting a concert or a workshop? Event markup allows people to discover your event when they search for “concerts this weekend” or “workshops near me.” People can also discover your event when they search for venues, such as sports stadiums or a local pub. Events may surface in a given venue’s Knowledge Panel to better help people find out what’s happening at that location.
Launching in new regions and languages
We recently launched the event search experience in Germany and Spain, which brings the event search experience on Google to nine countries and regions around the world. For a full list of where the event search experience works, check out the list of available languages and regions.
How to get your events on Google
There are three options to make your events eligible to appear on Google:
- If you use a third-party website to post events (for example, you post events on ticketing websites or social platforms), check to see if your event publisher is already participating in the event search experience on Google. One way to check is to search for a popular event shown on the platform and see if the event listing is shown. If your event publisher is integrated with Google, continue to post your events on the third-party website.
- If you use a CMS (for example, WordPress) and you don’t have access to your HTML, check with your CMS to see if there’s a plugin that can add structured data to your site for you. Alternatively, you can use the Data Highlighter to tell Google about your events without editing the HTML of your site.
- If you’re comfortable editing your HTML, use structured data to directly integrate with Google. You’ll need to edit the HTML of the event pages.
Follow best practices
If you’ve already implemented event structured data, we recommend that you review your structured data to make sure it meets our guidelines. In particular, you should:
- Make sure you’re including the required and recommended properties that are outlined in our developer guidelines.
- Make sure your event details are high quality, as defined by our guidelines. For example, use the description field to describe the event itself in more detail instead of repeating attributes such as title, date, location, or highlighting other website functionality.
- Use the Rich Result Test to test and preview your structured data.
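As an illustration of the required and recommended properties above, here is a minimal Event object sketched in Python and serialized to the JSON-LD you would embed in a `<script type="application/ld+json">` tag. Every name, date, and address below is a made-up placeholder.

```python
import json

# Minimal schema.org Event markup; all values are placeholders.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Snickertown Summer Concert",
    "startDate": "2020-07-21T19:00",
    "endDate": "2020-07-21T23:00",
    "location": {
        "@type": "Place",
        "name": "Snickerpark Stadium",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "100 West Snickerpark Dr",
            "addressLocality": "Snickertown",
            "postalCode": "14619",
            "addressCountry": "US",
        },
    },
    # Describe the event itself rather than repeating the title or date.
    "description": "An outdoor concert featuring local bands.",
}

# Serialize to the JSON-LD payload for the script tag.
print(json.dumps(event, indent=2))
```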
Monitor your performance on Search
You can check how people are interacting with your event postings with Search Console:
- Use the Performance Report in Search Console to show event listing or detail view data for a given event posting in Search results. You can automatically pull these results with the Search Console API.
- Use the Rich result status report in Search Console to understand what Google could or could not read from your site, and troubleshoot rich result errors.
If you have any questions, please visit the Webmaster Central Help Forum.
Posted by Emily Fifer, Product Manager
New reports for review snippets in Search Console
When Google finds valid reviews or ratings markup, we may show a rich result that includes stars and other summary info. This rich result can appear directly on search results or as part of a Google Knowledge panel, as shown in the screenshots below.
Today we are announcing support for review snippets in Google Search Console, including new reports to help you find any issues with your implementation and monitor how this rich result type is improving your performance. You can also use the Rich Results Test to review your existing URLs or debug your markup code before moving it to production.
Review snippet Enhancement report
To help site owners make the most of their reviews, a new review snippet report is now available in Search Console for sites that have implemented reviews or ratings structured data. The report allows you to see errors, warnings, and valid pages for markup implemented on your site.
In addition, if you fix an issue, you can use the report to validate it, which will trigger a process where Google recrawls your affected pages. The report covers all content types currently supported as review snippets. Learn more about the Rich result status reports.
Review snippet appearance in Performance report
The Search Console Performance report now allows you to see the performance of your review or rating marked-up pages on Google Search and Discover using the new “Review snippet” search appearance filter.
This means that you can see the impressions, clicks, and CTR of your review snippet pages and understand how they are trending for any of the available dimensions. For example, you can filter your data to see which queries, pages, countries, and devices are bringing traffic to your review snippets.
Review snippet in Rich Results Test
These new tools should make it easier to understand how your marked-up review snippet pages perform on Search and to identify and fix review issues.
If you have any questions, check out the Google Webmasters community.
Posted by Tomer Hodadi and Yuval Kurtser, Search Console engineering team
Google Search News for January 2020
We hope the year 2020 has started off well for you, and we wanted to bring you a brief update on some of the changes around Google Search since our last episode. We do this in our YouTube series, Google Search News.
In the January 2020 episode, we cover:
- Updates in Search Console, a free tool from Google to help you succeed with your website in Google Search. Since the last episode, we celebrated the two-year anniversary of the new Search Console, updated the Discover report, and made the Index Coverage report more comprehensive. Another big change was the new messaging system, which integrates directly with various reports.
- Updated Mobile-First Indexing documentation and some tips, such as making sure your mobile site reflects your full content (we won’t use the desktop version at all, once we switch your site over). Also, if you use separate mobile URLs (commonly called m-dot URLs), make sure to use them consistently within your structured data too.
- The deprecation of support for data-vocabulary.org structured data was recently announced. This markup was mostly used for breadcrumb markup, so if you added that early on, you should double-check the breadcrumb report in Search Console.
- We make regular updates to Google Search; our website on How Search Works has more background, if you’re curious. In this episode we covered BERT (a modern way for computers to understand natural language) as well as various updates mentioned on our Search Liaison & Webmaster Central Twitter profiles.
- Chrome has posted about its handling of mixed content, and we started sending notices to sites using old HTTPS/TLS protocols.
- Googlebot’s rendering has continued to move forward with the new user-agent, which is being used more and more for crawling.
- and, last but not least, if you’d like to find out more about Search Console, check out our new Search Console training video series!
We hope you find these updates useful! Let us know in the video comments, or on Twitter, if there’s something we can improve on.
Posted by John Mueller, Google Search-Relations team, Zurich
New Removals report in Search Console
There are different tools available for you to report and remove information from Google. In this post we’ll focus on three areas covered by the new Search Console report: temporary removals, outdated content, and SafeSearch filtering requests.
Temporary removals
A temporary removal request is a way to remove specific content on your site from Google Search results. For example, if you have a URL that you need to take off Google Search quickly, you should use this tool. A successful request lasts about six months, which should be enough for you to find a permanent solution. You have two types of requests available:
- Temporarily remove URL will hide the URL from Google Search results for about six months and clear the cached copy of the page.
- Clear cached URL clears the cached page and wipes out the page description snippet in Search results until the page is crawled again.
Outdated content
The outdated content section provides information on removal requests made through the public Remove Outdated Content tool, which can be used by anyone (not just site owners) to update search results showing information that is no longer present on a page.
SafeSearch filtering
The SafeSearch filtering section in Search Console shows a history of pages on your site that were reported by Google users as adult content using the SafeSearch Suggestion tool. URLs submitted using this tool are reviewed, and if Google feels that this content should be filtered from SafeSearch results, these URLs are tagged as adult content.
We hope you will find the new report clear and useful. As always, please let us know if you have any comments, questions, or feedback, either through the Webmasters help community or Twitter.
Posted by Tali Pruss, Search Console Software Engineer
Sunsetting support for data-vocabulary
Structured data schemas such as schema.org and data-vocabulary.org are used to define shared meaningful structures for markup-based applications on the Web. With the increasing usage and popularity of schema.org we decided to focus our development on a…
Get Ready for New SameSite=None; Secure Cookie Settings
In May, Chrome announced a secure-by-default model for cookies, enabled by a new cookie classification system (spec). This initiative is part of our ongoing effort to improve privacy and security across the web.
Chrome plans to implement the new model with Chrome 80 in February 2020. Mozilla and Microsoft have also indicated intent to implement the new model in Firefox and Edge, on their own timelines. While the Chrome changes are still a few months away, it’s important that developers who manage cookies assess their readiness today. This blog post outlines high-level concepts; please see SameSite Cookies Explained on web.dev for developer guidance.
Understanding Cross-Site and Same-Site Cookie Context
Websites typically integrate external services for advertising, content recommendations, third party widgets, social embeds and other features. As you browse the web, these external services may store cookies in your browser and subsequently access those cookies to deliver personalized experiences or measure audience engagement. Every cookie has a domain associated with it. If the domain associated with a cookie matches an external service and not the website in the user’s address bar, this is considered a cross-site (or “third party”) context.
Less obvious cross-site use cases include situations where an entity that owns multiple websites uses a cookie across those properties. Although the same entity owns the cookie and the websites, this still counts as cross-site or “third party” context when the cookie’s domain does not match the site(s) from which the cookie is accessed.
When an external resource on a web page accesses a cookie that does not match the site domain, this is cross-site or “third-party” context.
In contrast, cookie access in a same-site (or “first party”) context occurs when a cookie’s domain matches the website domain in the user’s address bar. Same-site cookies are commonly used to keep people logged into individual websites, remember their preferences and support site analytics.
When a resource on a web page accesses a cookie that matches the site the user is visiting, this is same-site or “first party” context.
A New Model for Cookie Security and Transparency
Today, if a cookie is only intended to be accessed in a first party context, the developer has the option to apply one of two settings (SameSite=Lax or SameSite=Strict) to prevent external access. However, very few developers follow this recommended practice, leaving a large number of same-site cookies needlessly exposed to threats such as Cross-Site Request Forgery attacks.
To safeguard more websites and their users, the new secure-by-default model assumes all cookies should be protected from external access unless otherwise specified. Developers must use a new cookie setting, SameSite=None, to designate cookies for cross-site access. When the SameSite=None attribute is present, an additional Secure attribute must be used so cross-site cookies can only be accessed over HTTPS connections. This won’t mitigate all risks associated with cross-site access but it will provide protection against network attacks.
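As a minimal sketch of the new setting, here is how the resulting Set-Cookie header can be produced with Python's standard library (http.cookies supports the samesite attribute from Python 3.8); the cookie name and value are placeholders.

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session_id"] = "abc123"            # placeholder name and value
cookie["session_id"]["samesite"] = "None"  # explicitly allow cross-site access
cookie["session_id"]["secure"] = True      # required whenever SameSite=None

# Renders the header line to send in an HTTP response.
header = cookie.output()
print(header)
```

The framework you use may expose the same attributes through its own cookie API; the point is that both `SameSite=None` and `Secure` must appear on any cookie intended for cross-site use.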
Beyond the immediate security benefits, the explicit declaration of cross-site cookies enables greater transparency and user choice. For example, browsers could offer users fine-grained controls to manage cookies that are only accessed by a single site separately from cookies accessed across multiple sites.
Chrome Enforcement Starting in February 2020
Starting with Chrome 80 in February, Chrome will treat cookies that have no declared SameSite value as SameSite=Lax cookies. Only cookies with the SameSite=None; Secure setting will be available for external access, provided they are being accessed from secure connections. The Chrome Platform Status trackers for SameSite=None and Secure will continue to be updated with the latest launch information.
Mozilla has affirmed their support of the new cookie classification model with their intent to implement the SameSite=None; Secure requirements for cross-site cookies in Firefox. Microsoft recently announced plans to begin implementing the model starting as an experiment in Microsoft Edge 80.
How to Prepare; Known Complexities
If you manage cross-site cookies, you will need to apply the SameSite=None; Secure setting to those cookies. Implementation should be straightforward for most developers, but we strongly encourage you to begin testing now to identify complexities and special cases, such as the following:
- Not all languages and libraries support the None value yet, requiring developers to set the cookie header directly. This Github repository provides instructions for implementing SameSite=None; Secure in a variety of languages, libraries and frameworks.
- Some browsers, including some versions of Chrome, Safari and UC Browser, might handle the None value in unintended ways, requiring developers to code exceptions for those clients. This includes Android WebViews powered by older versions of Chrome. Here’s a list of known incompatible clients.
- App developers are advised to declare the appropriate SameSite cookie settings for Android WebViews based on versions of Chrome that are compatible with the None value, both for cookies accessed via HTTP(S) headers and via Android WebView’s CookieManager API. Note that the new model will not be enforced on Android WebView until later.
- Enterprise IT administrators may need to implement special policies to temporarily revert Chrome Browser to legacy behavior if some services such as single sign-on or internal applications are not ready for the February launch.
- If you have cookies that you access in both a first and third-party context, you might consider using separate cookies to get the security benefits of SameSite=Lax in the first-party context.
SameSite Cookies Explained offers specific guidance for the situations above, and channels for raising issues and questions.
To test the effect of the new Chrome behavior on your site or cookies you manage, you can go to chrome://flags in Chrome 76+ and enable the “SameSite by default cookies” and “Cookies without SameSite must be secure” experiments. In addition, these experiments will be automatically enabled for a subset of Chrome 79 Beta users. Some Beta users with the experiments enabled could experience incompatibility issues with services that do not yet support the new model; users can opt out of the Beta experiments by going to chrome://flags and disabling them.
If you manage cookies that are only accessed in a same-site context (same-site cookies) there is no required action on your part; Chrome will automatically prevent those cookies from being accessed by external entities, even if the SameSite attribute is missing or no value is set. However we strongly recommend you apply an appropriate SameSite value (Lax or Strict) and not rely on default browser behavior since not all browsers protect same-site cookies by default.
Finally, if you’re concerned about the readiness of vendors and others who provide services to your website, you can check for Developer Tools console warnings in Chrome 77+ when a page contains cross-site cookies that are missing the required settings:
Some providers (including some Google services) will implement the necessary changes in the months leading up to Chrome 80 in February; you may wish to reach out to your partners to confirm their readiness.
Posted by Barb Palser, Chrome and Web Platform Partnerships
The Search Console Training video series is rolling out
As we finished migrating to the new Search Console in 2019, we knew a detailed training video series would help users learn about the product and its many use cases. Below are the videos we have already released, and there are many more to come! Check the Search Console Training playlist for a new video every two weeks, and subscribe to the Webmasters YouTube channel to get notified about new video uploads.
I hope that by the end of the series you’ll agree with us that Search Console data is insightful, fun, and exciting! Let us know what you think by commenting on the videos or tagging us on Twitter.
Posted by Daniel Waisberg, Search Advocate
An update on 2019
With 2020 hanging above our heads much the same way that bricks don’t, people start reflecting on what they achieved this year, what went wrong, and how they could improve. We’re no different, but instead of choosing what went well or wrong ourselves, we picked the announcements on our @GoogleWMC Twitter account that users interacted with the most, and decided to reflect on those.
We had launches that you appreciated a lot. For example, we announced at Google I/O that Googlebot is becoming evergreen, meaning that it will always use an up-to-date version of Chromium for rendering. We hope this will make it easier for developers to create stunning, modern, and snappy JavaScript experiences by tapping into the power of over 1,000 new features and interfaces that are now supported.
Speaking of robots, together with the original author of the Robots Exclusion Protocol, other search engines, and input from webmasters, we submitted an Internet Draft to the IETF in order to start standardizing the 25-year-old protocol.
Like Twitter users, we also thought it’s an exciting project which lays down the rules of crawling for good, although it doesn’t change anything for most.
But we didn’t stop at one ancient protocol: we also rethought how we treat “nofollow” links to keep up with the evolution of the web. It was an announcement that seemed to be welcomed by most Twitter users, and for good reason: having a “hint” model for rel=”nofollow” may help us better reward those who create high-quality content, by serving even better results to our users.
One of the most tweeted – and also most humbling – moments this year was when we lost part of our index, which caused Search Console to misbehave, and we had rendering failures around the same time. Since Google Search works like a well-oiled machine most of the time, we didn’t have processes to quickly communicate issues to those who needed to know about them: webmasters. Lacking a proper process and channel to communicate these issues was a mistake, and we are still working hard to rectify it. One thing is clear: we need to do more on the critical communication side of things.
We do like to communicate, in general: we shoot videos, we go to conferences, big and small, where we reach thousands of webmasters and SEOs, and in 2019 we extended our reach with the Webmaster Conference, which landed in 35 locations around the world in 12 languages. Not to mention the weather reports on our YouTube channel.
We hope you had a fantastic year and the new year will bring you even more success. If you need help with the latter, you can follow our blogs, @googlewmc on Twitter, or you could join us at a Webmaster Conference near you!
Posted by John Mueller, Cheese Connoisseur, and Gary the house elf
Launching a new Publisher Center
Today we are announcing the launch of Publisher Center to help publishers more easily manage how their content appears across Google products. Publisher Center merges two existing tools, Google News Producer and Google News Publisher Center, improving their user experience and functionality.
Publisher Center’s new features include a simpler way to manage your publication’s identity, like updating light and dark theme logos. It also provides an easier way for those who own multiple publications to organize and switch between them, particularly with improved permission settings that make it easier to collaborate with colleagues. Additionally, publishers can now point to the URLs of their website’s sections instead of RSS feeds to configure sections in Google News. Content for News will now come directly from the web, just as it does for Search.
Publisher Center launches today in the existing four languages of the previous tools (English, Spanish, French, and German) and will expand to more languages soon. Learn more here.
Posted by Eric Silva, Product Manager