Dynamic Rendering with Rendertron
Which sites should consider dynamic rendering?
How does dynamic rendering work?
- Take a look at a sample web app
- Set up a small express.js server to serve the web app
- Install and configure Rendertron as a middleware for dynamic rendering
The sample web app
Cute cat images in a grid and a button to show more – this web app truly has it all! Here is the JavaScript:
const apiUrl = 'https://api.thecatapi.com/v1/images/search?limit=50';
The mobile-friendly test shows that the page is mobile-friendly, but the screenshot is missing all the cats! The headline and button appear but none of the cat pictures are there.
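The full client-side code isn't shown here, but its logic can be sketched roughly like this (a hypothetical reconstruction; the actual sample app may differ in detail):

```javascript
// Rough sketch of the sample app's client-side logic (hypothetical).
const apiUrl = 'https://api.thecatapi.com/v1/images/search?limit=50';

// Pure helper: turn a list of image URLs into the grid's HTML.
// Keeping this separate from the fetch/DOM code makes it easy to test.
function renderCats(urls) {
  return urls
    .map((url) => `<img class="cat" src="${url}" alt="A cat photo">`)
    .join('\n');
}

// Browser-only wiring: fetch the images and inject them into the grid.
async function init(container) {
  const response = await fetch(apiUrl);
  const cats = await response.json(); // shape: [{ url: '…' }, …]
  container.innerHTML = renderCats(cats.map((cat) => cat.url));
}
```

Because the images only appear after the fetch completes in the browser, a crawler that doesn't execute JavaScript sees an empty grid – which is exactly what the mobile-friendly test screenshot shows.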
Set up the server
Deploy a Rendertron instance
Rendertron runs a server that takes a URL and returns static HTML for the URL by using headless Chromium. We’ll follow the recommendation from the Rendertron project and use Google Cloud Platform.
The form to create a new Google Cloud Platform project.
- Create a new project in the Google Cloud console. Take note of the “Project ID” below the input field.
- Install the Google Cloud SDK as described in the documentation and log in.
- Clone the Rendertron repository from GitHub with:
git clone https://github.com/GoogleChrome/rendertron.git
cd rendertron
- Run the following commands to install dependencies and build Rendertron on your computer:
npm install && npm run build
- Enable Rendertron’s cache by creating a new file called config.json in the rendertron directory with the following content:
{ "datastoreCache": true }
- Run the following command from the rendertron directory. Substitute YOUR_PROJECT_ID with your project ID from step 1.
gcloud app deploy app.yaml --project YOUR_PROJECT_ID
- Select a region of your choice and confirm the deployment. Wait for it to finish.
- Enter the URL YOUR_PROJECT_ID.appspot.com (substitute YOUR_PROJECT_ID with your actual project ID from step 1) in your browser. You should see Rendertron’s interface with an input field and a few buttons.
Rendertron’s UI after deploying to Google Cloud Platform
Add Rendertron to the server
Configure the bot list
Add the middleware
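The bot-list and middleware steps can be sketched like this (the bot list below is an illustrative subset for demonstration; the rendertron-middleware package for express.js ships its own default list that you can extend, and the proxy URL is a placeholder):

```javascript
// Sketch: deciding which requests should get pre-rendered HTML.
// The bot list here is an illustrative subset, not an exhaustive one.
const BOT_USER_AGENTS = [
  'googlebot',
  'bingbot',
  'linkedinbot',
  'twitterbot',
];
const BOT_UA_PATTERN = new RegExp(BOT_USER_AGENTS.join('|'), 'i');

// Returns true if the request's user-agent matches a known bot.
function isBot(userAgent) {
  return BOT_UA_PATTERN.test(userAgent || '');
}

// In the express.js server, a pattern like this would be passed to
// rendertron-middleware, which proxies matching requests to the
// deployed Rendertron instance (URL is a placeholder), e.g.:
//   app.use(rendertron.makeMiddleware({
//     proxyUrl: 'https://YOUR_PROJECT_ID.appspot.com/render',
//     userAgentPattern: BOT_UA_PATTERN,
//   }));
//   app.use(express.static('app'));
```

Human visitors keep getting the regular client-rendered app; only requests whose user-agent matches the pattern are routed through Rendertron.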
Testing our setup
Conclusion
Focusing on the new Search Console
Over the last year, the new Search Console has been growing and growing, with the goal of making it easier for site owners to focus on the important tasks. For us, focus means putting all our work into the new Search Console, staying committed to our users, and, with that, being able to turn off some of the older, perhaps already-improved, aspects of the old Search Console. This gives us space to further build out the new Search Console, adding and improving features over time.
Here are some of the upcoming changes in Search Console that we’re planning to make towards the end of March 2019:
Crawl errors in the new Index Coverage report
One of the more common pieces of feedback we received was that the list of crawl errors in Search Console was not actionable when it came to setting priorities (it’s normal that Google crawls URLs which don’t exist; that’s not something that needs to be fixed on the website). By shifting the focus to the issues and patterns that affect site indexing, we believe that site owners will be able to find and fix problems much faster (and once issues are fixed, you can request reprocessing quickly too). With this, we’re going to remove the old Crawl Errors report – for desktop, smartphone, and site-wide errors. We’ll continue to improve the way issues are recognized and flagged, so if there’s something that would help you, please submit feedback in the tools.
Along with the Crawl Errors report, we’re also deprecating the crawl errors API that’s based on the same internal systems. At the moment, we don’t have a replacement for this API. We’ll inform API users of this change directly.
Sitemaps data in Index Coverage
As we move forward with the new Search Console, we’re turning the old sitemaps report off. The new sitemaps report has most of the functionality of the old report, and we’re aiming to bring the rest of the information – specifically for images & video – to the new reports over time. Moreover, to track URLs submitted in sitemap files, within the Index Coverage report you can select and filter using your sitemap files. This makes it easier to focus on URLs that you care about.
Using the URL inspection tool to fetch as Google
The new URL inspection tool offers many ways to check and review URLs on your website. It provides both a look into the current indexing, as well as a live check of URLs that you’ve recently changed. In the meantime, this tool shows even more information on URLs, such as the HTTP headers, page resource, the JavaScript console log, and a screenshot of the page. From there, you can also submit pages for re-processing, to have them added or updated in our search results as quickly as possible.
User-management is now in settings
We’ve improved the user management interface and decreased clutter from the tool by merging it with the Settings section of the new Search Console. This replaces the user-management features in the old Search Console.
Structured data dashboard to dedicated reports per vertical
To help you implement Rich Results for your site, we added several reports to the new Search Console last year, including Jobs, Recipes, Events, and Q&A. We are committed to adding more reports like these to the new Search Console. When Google encounters a syntax error while parsing structured data on a page, it will also be reported in aggregate to make sure you don’t miss anything critical.
Other structured data types that are not supported with Rich Results features will no longer be reported in Search Console. We hope this reduces distraction from non-critical issues and helps you focus on fixing problems which could be visible in Search.
Letting go of some old features
With the focus on features that we believe are critical to site owners, we’ve had to make a hard decision to drop some features in Search Console. In particular:
HTML suggestions – finding short and duplicated titles can be useful for site owners, but Google’s algorithms have gotten better at showing and improving titles over the years. We still believe this is something useful for sites to look into, and there are some really good tools that help you to crawl your website to extract titles & descriptions too.
Property Sets – while they’re loved by some site owners, the small number of users makes it hard to justify maintaining this feature. However, we did learn that users need a more comprehensive view of their website, so we will soon add the option of managing a Search Console account over an entire domain (regardless of scheme and sub-domains). Stay tuned!
Android Apps – most of the relevant functionality has been moved to the Firebase console over the years.
Blocked resources – we added this functionality several years back to help sites unblock CSS and JavaScript files for mobile-friendliness. Since then, these issues have become much rarer, usage of this tool has dropped significantly, and you can find blocked resources directly in the URL inspection tool.
Please send us feedback!
We realize some of these changes will affect your workflows, so we want to let you know about them as early as possible. Please send us your feedback directly in the new Search Console if there are aspects which are unclear, or which would ideally be different for your use case. For more detailed feedback, please use our help forums; feel free to include screenshots & ideas. In the long run, we believe the new Search Console will make things much easier, helping you focus on the issues affecting your site and the opportunities available to it with regard to Search.
We’re looking forward to an exciting year!
Posted by Hillel Maoz, Search Console Team
Ways to succeed in Google News
General advice
There is a lot of helpful information to consider within the Google News Publisher Help Center. Be sure to have read the material in this area, in particular the content and technical guidelines.
Headlines and dates
- Present clear headlines: Google News looks at a variety of signals to determine the headline of an article, including your HTML title tag and the most prominent text on the page. Review our headline tips.
- Provide accurate times and dates: Google News tries to determine the time and date to display for an article in a variety of ways. You can help ensure we get it right by using the following methods:
- Show one clear date and time: As per our date guidelines, show a clear, visible date and time between the headline and the article text. Prevent other dates from appearing on the page whenever possible, such as for related stories.
- Use structured data: Use the datePublished and dateModified schema properties, and use the correct time zone designator, for AMP or non-AMP pages.
- Avoid artificially freshening stories: If an article has been substantially changed, it can make sense to give it a fresh date and time. However, don’t artificially freshen a story without adding significant information or some other compelling reason for the freshening. Also, do not create a very slightly updated story from one previously published, then delete the old story and redirect to the new one. That’s against our article URLs guidelines.
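For example, the date guidance above might be marked up like this (the values are illustrative; note the time zone designator on each timestamp):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "datePublished": "2019-01-15T08:00:00-05:00",
  "dateModified": "2019-01-16T09:30:00-05:00"
}
</script>
```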
Duplicate content
Google News seeks to reward independent, original journalistic content by giving credit to the originating publisher, as both users and publishers would prefer. This means we try not to allow duplicate content—which includes scraped, rewritten, or republished material—to perform better than the original content. In line with this, these are guidelines publishers should follow:
- Block scraped content: Scraping commonly refers to taking material from another site, often on an automated basis. Sites that scrape content must block scraped content from Google News.
- Block rewritten content: Rewriting refers to taking material from another site, then rewriting that material so that it is not identical. Sites that rewrite content in a way that provides no substantial or clear added value must block that rewritten content from Google News. This includes, but is not limited to, rewrites that make only very slight changes or those that make many word replacements but still keep the original article’s overall meaning.
- Block or consider canonical for republished content: Republishing refers to when a publisher has permission from another publisher or author to republish an original work, such as material from wire services or in partnership with other publications.
Publishers that allow others to republish content can help ensure that their original versions perform better in Google News by asking those republishing to block or make use of canonical.
Google News also encourages those that republish material to consider proactively blocking such content or making use of the canonical, so that we can better identify the original content and credit it appropriately.
- Avoid duplicate content: If you operate a network of news sites that share content, the advice above about republishing applies to your network. Select what you consider to be the original article and consider blocking duplicates or making use of the canonical to point to the original.
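A republishing partner can point back at the original with a canonical link element in the page’s head (the URL below is a placeholder):

```html
<link rel="canonical" href="https://original-publisher.example.com/original-article">
```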
Transparency
- Be transparent: Visitors to your site want to trust and understand who publishes it and information about those who have written articles. That’s why our content guidelines stress that content should have posts with clear bylines, information about authors, and contact information for the publication.
- Don’t be deceptive: Our content policies do not allow sites or accounts that impersonate any person or organization, or that misrepresent or conceal their ownership or primary purpose. We do not allow sites or accounts that engage in coordinated activity to mislead users. This includes, but isn’t limited to, sites or accounts that misrepresent or conceal their country of origin or that direct content at users in another country under false premises.
More tips
- Avoid taking part in link schemes: Don’t participate in link schemes, which can include large-scale article marketing programs or selling links that pass PageRank. Review our page on link schemes for more information.
- Use structured data for rich presentation: Both AMP and non-AMP pages can make use of structured data to optimize your content for rich results or carousel-like presentations.
- Protect your users and their data: Consider securing every page of your website with HTTPS to protect the integrity and confidentiality of the data users exchange on your site. You can find more useful tips in our best practices on how to implement HTTPS.
Here’s to a great 2019!
We hope these tips help publishers succeed in Google News over the coming year. For those who have more questions about Google News, we are unable to do one-to-one support. However, we do monitor our Google News Publisher Forum—which has been newly-revamped—and try to provide guidance on questions that might help a number of publishers all at once. The forum is also a great resource where publishers share tips and advice with each other.
Posted by Danny Sullivan, Public Liaison for Search
An update on the Google Webmaster Central blog comments
For every train there’s a passenger, but it turns out comments are not our train. Over the years we read thousands of comments we’ve received on our blog posts on the Google Webmaster Central blog. Sometimes they were extremely thoughtful, other times t…
2018, celebrating our global Webmaster community
Gold Webmaster Product Experts at this year’s global summit in Sunnyvale.
Mobile-First indexing, structured data, images, and your site
It’s been two years since we started working on “mobile-first indexing” – crawling the web with smartphone Googlebot, similar to how most users access it. We’ve seen websites across the world embrace the mobile web, making fantastic websites that work on all kinds of devices. There’s still a lot to do, but today, we’re happy to announce that we now use mobile-first indexing for over half of the pages shown in search results globally.
Checking for mobile-first indexing
In general, we move sites to mobile-first indexing when our tests assure us that they’re ready. When we move sites over, we notify the site owner through a message in Search Console. It’s possible to confirm this by checking the server logs, where a majority of the requests should be from Googlebot Smartphone. Even easier, the URL inspection tool allows a site owner to check how a URL from the site (it’s usually enough to check the homepage) was last crawled and indexed.
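The server-log check can be sketched like this (a hypothetical helper, assuming common access-log formats; Googlebot Smartphone identifies itself with an Android mobile user-agent string in addition to the Googlebot token):

```javascript
// Sketch: estimate the share of Googlebot requests coming from
// Googlebot Smartphone by scanning access-log lines (hypothetical
// helper; adapt the matching to your own log format).
function googlebotSmartphoneShare(logLines) {
  const googlebot = logLines.filter((line) => line.includes('Googlebot'));
  if (googlebot.length === 0) return 0;
  // The smartphone crawler's user-agent includes Android mobile tokens.
  const smartphone = googlebot.filter(
    (line) => line.includes('Android') && line.includes('Mobile')
  );
  return smartphone.length / googlebot.length;
}
```

If a site has been moved to mobile-first indexing, the majority of Googlebot requests in the logs should come from the smartphone crawler.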
If your site uses responsive design techniques, you should be all set! For sites that aren’t using responsive web design, we’ve seen two kinds of issues come up more frequently in our evaluations:
Missing structured data on mobile pages
Structured data is very helpful to better understand the content on your pages, and allows us to highlight your pages in fancy ways in the search results. If you use structured data on the desktop versions of your pages, you should have the same structured data on the mobile versions of the pages. This is important because with mobile-first indexing, we’ll only use the mobile version of your page for indexing, and will otherwise miss the structured data.
Testing your pages in this regard can be tricky. We suggest testing for structured data in general, and then comparing that to the mobile version of the page. For the mobile version, check the source code when you simulate a mobile device, or use the HTML generated with the mobile-friendly testing tool. Note that a page does not need to be mobile-friendly in order to be considered for mobile-first indexing.
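One way to do that comparison is to extract the JSON-LD types from both versions’ HTML and diff them. A rough sketch (regex-based for brevity; a real check should use an HTML parser and compare properties, not just types):

```javascript
// Sketch: extract schema.org @type values from JSON-LD blocks in an
// HTML document, so desktop and mobile versions can be compared.
function extractJsonLdTypes(html) {
  const pattern =
    /<script type="application\/ld\+json">([\s\S]*?)<\/script>/g;
  const types = [];
  let match;
  while ((match = pattern.exec(html)) !== null) {
    try {
      types.push(JSON.parse(match[1])['@type']);
    } catch (e) {
      // Ignore malformed blocks; a real check should report them.
    }
  }
  return types.sort();
}

// Structured data types present on desktop but missing on mobile.
function missingOnMobile(desktopHtml, mobileHtml) {
  const mobileTypes = new Set(extractJsonLdTypes(mobileHtml));
  return extractJsonLdTypes(desktopHtml).filter((t) => !mobileTypes.has(t));
}
```

Any type reported by `missingOnMobile` would be invisible to Google once the site is on mobile-first indexing.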
Missing alt-text for images on mobile pages
Alt attributes on images (“alt-text”) are a great way to describe images to users with screen readers (which are used on mobile too!) and to search engine crawlers. Without alt-text for images, it’s a lot harder for Google Images to understand the context of images that you use on your pages.
Check “img” tags in the source code of the mobile version for representative pages of your website. As above, the source of the mobile version can be seen by either using the browser to simulate a mobile device, or by using the Mobile-Friendly test to check the Googlebot rendered version. Search the source code for “img” tags, and double-check that your page is providing appropriate alt-attributes for any that you want to have findable in Google Images.
For example, that might look like this:
With alt-text (good!): <img src="cute-puppies.png" alt="A photo of cute puppies on a blanket">
Without alt-text: <img src="sad-puppies.png">
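A quick scan for images missing alt text can be sketched as follows (a simple regex-based check for illustration; an HTML parser is more robust in practice):

```javascript
// Sketch: find <img> tags without an alt attribute in an HTML string.
// Regex-based, so treat the result as a hint, not a guarantee.
function imgsWithoutAlt(html) {
  const imgs = html.match(/<img\b[^>]*>/g) || [];
  return imgs.filter((tag) => !/\balt\s*=/.test(tag));
}
```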
It’s fantastic to see so many great websites that work well on mobile! We’re looking forward to being able to index more and more of the web using mobile-first indexing, helping more users to search the web in the same way that they access it: with a smartphone. We’ll continue to monitor and evaluate this change carefully. If you have any questions, please drop by our Webmaster forums or our public events.
Posted by John Mueller, wearer of many socks, Google Switzerland
Why & how to secure your website with the HTTPS protocol
You can find the whole session, about one hour long, in this video:
- What HTTPS encryption is, and why it is important to protect your visitors and yourself,
- How HTTPS enables a more modern web,
- What are the usual complaints about HTTPS, and are they still true today?
- “But HTTPS certificates cost so much money!”
- “But switching to HTTPS will destroy my SEO!”
- “But “mixed content” is such a headache!”
- “But my ad revenue will get destroyed!”
- “But HTTPS is sooooo sloooow!”
- Some practical advice for running the migration, aggregated from:
- The “site move with URL changes” documentation
- General advice on which HTTPS specifications to choose (HSTS, encryption key strength, etc.)
Introducing the Indexing API and structured data for livestreams
Over the past few years, it’s become easier than ever to stream live videos online, from celebrity updates to special events. But it’s not always easy for people to determine which videos are live and know when to tune in. Today, we’re introducing new …
Rich Results expands for Question & Answer pages
Frequently, the information users are looking for is on sites where they ask and answer each other’s questions. Popular social news sites, expert forums, and help and support message boards are all examples of this pattern.
A screenshot of an example search result for a page titled “Why do touchscreens sometimes register a touch when …” with a preview of the top answers from the page.
In order to help users better identify which search results may give the best information about their question, we have developed a new rich result type for question and answer sites. Search results for eligible Q&A pages display a preview of the top answers. This new presentation helps site owners reach the right users for their content and helps users get the relevant information about their questions faster.
To be eligible for this feature, add Q&A structured data to your pages with Q&A content. Be sure to use the Structured Data Testing Tool to see if your page is eligible and to preview the appearance in search results. You can also check out Search Console to see aggregate stats and markup error examples. The Performance report also tells you which queries show your Q&A Rich Result in Search results, and how these change over time.
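A minimal sketch of Q&A markup looks like this (all values are illustrative; see the Q&A structured data documentation for the required and recommended properties):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "QAPage",
  "mainEntity": {
    "@type": "Question",
    "name": "Why do touchscreens sometimes register a touch when nothing is touching them?",
    "answerCount": 2,
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Stray capacitance or moisture can trigger a phantom touch.",
      "upvoteCount": 12
    },
    "suggestedAnswer": [{
      "@type": "Answer",
      "text": "Try recalibrating or cleaning the screen.",
      "upvoteCount": 3
    }]
  }
}
</script>
```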
If you have any questions, ask us in the Webmaster Help Forum or reach out on Twitter!
Posted by Kayla Hanson, Software Engineer
PageSpeed Insights, now powered by Lighthouse
At Google, we know that speed matters and we provide a variety of tools to help everyone understand the performance of a page or site. Historically, these tools have used different analysis engines. Unfortunately, this caused some confusion because the…
Notifying users of unclear subscription pages
Unclear mobile subscriptions
Clearer billing information for Chrome users
- Is the billing information visible and obvious to users? For example, adding no subscription information on the subscription page or hiding the information is a bad start because users should have access to the information when agreeing to subscribe.
- Can customers easily see the costs they’re going to incur before accepting the terms? For example, displaying the billing information in grey characters over a grey background, therefore making it less readable, is not considered a good user practice.
- Is the fee structure easily understandable? For example, the formula presented to explain how the cost of the service will be determined should be as simple and straightforward as possible.
If your billing service takes users through a clearly visible and understandable billing process as described in our best practices, you don’t need to make any changes. Also, the new warning in Chrome doesn’t impact your website’s ranking in Google Search.
Introducing reCAPTCHA v3: the new way to stop bots
A Frictionless User Experience
Over the last decade, reCAPTCHA has continuously evolved its technology. In reCAPTCHA v1, every user was asked to pass a challenge by reading distorted text and typing into a box. To improve both user experience and security, we introduced reCAPTCHA v2 and began to use many other signals to determine whether a request came from a human or bot. This enabled reCAPTCHA challenges to move from a dominant to a secondary role in detecting abuse, letting about half of users pass with a single click. Now with reCAPTCHA v3, we are fundamentally changing how sites can test for human vs. bot activities by returning a score to tell you how suspicious an interaction is and eliminating the need to interrupt users with challenges at all. reCAPTCHA v3 runs adaptive risk analysis in the background to alert you of suspicious traffic while letting your human users enjoy a frictionless experience on your site.
More Accurate Bot Detection with “Actions”
In reCAPTCHA v3, we are introducing a new concept called “Action”—a tag that you can use to define the key steps of your user journey and enable reCAPTCHA to run its risk analysis in context. Since reCAPTCHA v3 doesn’t interrupt users, we recommend adding reCAPTCHA v3 to multiple pages. In this way, the reCAPTCHA adaptive risk analysis engine can identify the pattern of attackers more accurately by looking at the activities across different pages on your website. In the reCAPTCHA admin console, you can get a full overview of reCAPTCHA score distribution and a breakdown for the stats of the top 10 actions on your site, to help you identify which exact pages are being targeted by bots and how suspicious the traffic was on those pages.
Fighting Bots Your Way
Another big benefit that you’ll get from reCAPTCHA v3 is the flexibility to prevent spam and abuse in the way that best fits your website. Previously, the reCAPTCHA system mostly decided when and what CAPTCHAs to serve to users, leaving you with limited influence over your website’s user experience. Now, reCAPTCHA v3 will provide you with a score that tells you how suspicious an interaction is. There are three potential ways you can use the score. First, you can set a threshold that determines when a user is let through or when further verification needs to be done, for example, using two-factor authentication and phone verification. Second, you can combine the score with your own signals that reCAPTCHA can’t access—such as user profiles or transaction histories. Third, you can use the reCAPTCHA score as one of the signals to train your machine learning model to fight abuse. By providing you with these new ways to customize the actions that occur for different types of traffic, this new version lets you protect your site against bots and improve your user experience based on your website’s specific needs.
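The first of those three approaches, a score threshold, might be sketched like this (the thresholds and action names are illustrative assumptions, not reCAPTCHA defaults; tune them per action using the score distribution in the admin console):

```javascript
// Sketch: using a reCAPTCHA v3 score to decide how to treat a request.
// The thresholds below are illustrative, not recommended defaults.
const ALLOW_THRESHOLD = 0.5;
const CHALLENGE_THRESHOLD = 0.3;

function decideAction(score) {
  if (score >= ALLOW_THRESHOLD) return 'allow';
  // Borderline traffic gets further verification, e.g. 2FA.
  if (score >= CHALLENGE_THRESHOLD) return 'challenge';
  return 'block';
}

// Server-side, the score comes from verifying the page's token
// against Google's siteverify endpoint with your secret key; the
// JSON response includes the score and the action name, e.g.:
//   { "success": true, "score": 0.9, "action": "login" }
```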
In short, reCAPTCHA v3 helps to protect your sites without user friction and gives you more power to decide what to do in risky situations. As always, we are working every day to stay ahead of attackers and keep the Internet easy and safe to use (except for bots).
Ready to get started with reCAPTCHA v3? Visit our developer site for more details.
Posted by Wei Liu, Google Product Manager
Google is introducing its Product Experts Program!
Google’s Top Contributors and Rising Stars are some of our most active and helpful members on these forums. With over 100 members globally just for the Webmaster Forums (1000 members if you count all product forums), this community of experts helps thousands of people every year by sharing their knowledge and helping others get the most out of Google products.
Today, we’re excited to announce that we’re rebranding and relaunching the Top Contributor program as Google’s Product Experts program! Same community of experts, shiny new brand.
Over the following days, we’ll be updating our badges in the forums so you can recognize who our most passionate and dedicated Product Experts are:
Silver Product Expert: Newer members who are developing their product knowledge
Gold Product Expert: Trusted members who are knowledgeable and active contributors
Platinum Product Expert: Seasoned members who contribute beyond providing help through mentoring, creating content, and more
Product Expert Alumni: Past members who are no longer active, but were previously recognized for their helpfulness
More information about the new badges and names.
Those Product Experts are users who are passionate about Google products and enjoy helping other users. They also help us by giving feedback on the tools we all use, like Search Console, and by surfacing questions they think Google should answer better. Obtaining feedback from our users is one of Google’s core values, and Product Experts often have a great understanding of what affects a lot of our users. For example, here is a blog post detailing how Product Expert feedback about Search Console was used to build the new version of the tool.
Visit the new Product Experts program website to get information on how to become a Product Expert yourself, and come and join us on our Webmaster forums, we’d love to hear from you!
Written by Vincent Courson, Search Outreach team
The new Search Console is graduating out of Beta 🎓
Today we mark an important milestone in Search Console’s history: we are graduating the new Search Console out of beta! With this graduation we are also launching the Manual Actions report and a “Test Live” capability to the recently launched URL inspection tool, which are joining a stream of reports and features we launched in the new Search Console over the past few months.
Our journey to the new Search Console
We launched the new Search Console at the beginning of the year. Since then we have been busy hearing and responding to your feedback, adding new features such as the URL Inspection Tool, and migrating key reports and features. Here’s what the new Search Console gives you:
More data:
- Get an accurate view of your website content using the Index Coverage report.
- Review your Search Analytics data going back 16 months in the Performance report.
- See information on links pointing to your site and within your site using the Links report.
- Retrieve crawling, indexing, and serving information for any URL directly from the Google index using the URL Inspection Tool.
Better alerting and new “fixed it” flows:
- Get automatic alerts and see a listing of pages affected by Crawling, Indexing, AMP, Mobile Usability, Recipes, or Job posting issues.
- Reports now show the HTML code where we think a fix is necessary (if applicable).
- Share information quickly with the relevant people in your organization to drive the fix.
- Notify Google when you’ve fixed an issue. We will review your pages, validate whether the issue is fixed, and return a detailed log of the validation findings.
Simplified sitemaps and account settings management:
- Let Google know how your site is structured by submitting sitemaps.
- Submit individual URLs for indexing (see below).
- Add new sites to your account, invite and manage users.
Out of Beta
While the old Search Console still has some features that are not yet available in the new one, we believe that the most common use cases are supported, in an improved way, in the new Search Console. When an equivalent feature exists in both old and new Search Console, our messages will point users to the new version. We’ll also add a reminder link in the old report. After a reasonable period, we will remove the old report.
Read more about how to migrate from old to the new Search Console, including a list of improved reports and how to perform common tasks, in our help center.
Manual Actions and Security Issues alerts
To ensure that you don’t miss any critical alerts for your site, active manual actions and security issues will be shown directly on the Overview page in the new console. In addition, the Manual Actions report has gotten a fresher look in the new Search Console. From there, you can review the details for any pending Manual Action and, if needed, file a reconsideration request.
URL Inspection – Live mode and request indexing
The URL inspection tool that we launched a few months ago now enables you to run the inspection on the live version of the page. This is useful for debugging and fixing issues in a page or confirming whether a reported issue still exists in a page. If the issue is fixed on the live version of the page, you can ask Google to recrawl and index the page.
We’re not finished yet!
Your feedback is important to us! As we evolve Search Console, your feedback helps us to tune our efforts. You can still switch between the old and new products easily, so any missing functionality you need is just a few clicks away. We will continue working on moving more reports and tools as well as adding exciting new capabilities to the new Search Console.
Posted by Hillel Maoz and Yinnon Haviv, Engineering Leads, Search Console team
Collaboration and user management in the new Search Console
As part of our reinvention of Search Console, we have been rethinking the models of facilitating cooperation and accountability for our users. We decided to redesign the product around cooperative team usage and transparency of action history. The new Search Console will gradually provide better history tracking to show who performed which significant property-affecting modifications, such as changing a setting, validating an issue or submitting a new sitemap. In that spirit we also plan to enable all users to see critical site messages.
New features
- User management is now an integral part of Search Console.
- The new Search Console enables you to share a read-only view of many reports, including Index coverage, AMP, and Mobile Usability. Learn more.
- A new user management interface enables all users to see and (if appropriate) manage user roles for all property users.
New Role definition
- In order to provide a simpler permission model, we are planning to limit the “restricted” user role to read-only status. While being able to see all information, read-only users will no longer be able to perform any state-changing actions, including starting a fix validation or sharing an issue.
Best practices
As a reminder, here are some best practices for managing user permissions in Search Console:
- Grant users only the permission level that they need to do their work. See the permissions descriptions.
- If you need to share an issue details report, click the Share link on that page.
- Revoke permissions from users who no longer work on a property.
- When removing a previously verified owner, be sure to remove all verification tokens for that user.
- Regularly audit and update user permissions using the Users & Permissions page in the new Search Console.
User feedback
As part of our Beta exploration, we released visibility of the user management interface to all user roles. Some users reached out to request more time to prepare for the updated user management model, including the ability of restricted and full users to easily see a list of other collaborators on the site. We’ve taken that feedback and will hold off on that part of the launch. Stay tuned for more updates relating to collaboration tools and changes on our permission models.
As always, we love to hear feedback from our users. Feel free to use the feedback form within Search Console, and we welcome your discussions in our help forums as well!
Posted by John Mueller, Google Switzerland
Links, Mobile Usability, and site management in the new Search Console
More features are coming to the new Search Console. This time we’ve focused on importing existing popular features from the old Search Console to the new product.
Links Report
Search Console users value the ability to see links to and within their site, as Google Search sees them. Today, we are rolling out the new Links report, which combines the functionality of the “Links to your site” and “Internal Links” reports on the old Search Console. We hope you find this useful!
Mobile Usability report
Mobile Usability is an important priority for all site owners. To help site owners fix mobile usability issues, we launched the Mobile Usability report on the new Search Console. Issue names are the same as in the old report, but we now allow users to submit a validation and reindexing request when an issue is fixed, similar to other reports in the new Search Console.
Site and user management
To make the new Search Console feel more like home, we’ve added the ability to add and verify new sites, and manage your property’s users and permissions, directly in new Search Console using our newly added settings page.
Keep sending feedback
As always, we would love to get your feedback through the tools directly and our help forums so please share and let us know how we’re doing.
Posted by Ariel Kroszynski and Roman Kecher – Search Console engineers
Hey Google, what’s the latest news?
Since launching the Google Assistant in 2016, we have seen users ask questions about everything from weather to recipes and news. In order to fulfill news queries with results people can count on, we collaborated on a new schema.org structured data spe…
An update to referral source URLs for Google Images
Every day, hundreds of millions of people use Google Images to visually discover and explore content on the web. Whether it be finding ideas for your next baking project, or visual instructions on how to fix a flat tire, exploring image results can so…
How we fought webspam – Webspam Report 2017
We always want to make sure that when you use Google Search to find information, you get the highest quality results. But, we are aware of many bad actors who are trying to manipulate search ranking and profit from it, which is at odds with our core mission: to organize the world’s information and make it universally accessible and useful. Over the years, we’ve devoted a huge effort toward combating abuse and spam on Search. Here’s a look at how we fought abuse in 2017.
We call these various types of abuse that violate the webmaster guidelines “spam.” Our evaluation indicated that for many years, less than 1 percent of the search results users visited were spammy. In the last couple of years, we’ve managed to further reduce this by half.
Google webspam trends and how we fought webspam in 2017
Another abuse vector is the manipulation of links, which is one of the foundational ranking signals for Search. In 2017 we doubled down on our efforts to remove unnatural links via ranking improvements and scalable manual actions. We have observed a year-over-year reduction in spam links of almost half.
Working with users and webmasters for a better web
We also actively work with webmasters to maintain the health of the web ecosystem. Last year, we sent 45 million messages to registered website owners via Search Console letting them know about issues we identified with their websites. More than 6 million of these messages were related to manual actions, providing transparency to webmasters so they understand why their sites received manual actions and how to resolve the issues.
Last year, we released a beta version of a new Search Console to a limited number of users and, afterwards, to all users of Search Console. We listened to what matters most to users, and started with popular features such as Search performance, Index Coverage, and others. These can help webmasters optimize their websites’ Google Search presence more easily.
Through enhanced Safe Browsing protections, we continue to protect more users from bad actors online. In the last year, we have made significant improvements to our Safe Browsing protection, such as broadening our protection of macOS devices, enabling predictive phishing protection in Chrome, cracking down on unwanted mobile software, and launching significant improvements to our ability to protect users from deceptive Chrome extension installation.
We have a multitude of channels to engage directly with webmasters. We have dedicated team members who meet with webmasters regularly, both online and in person. We conducted more than 250 online office hours, online events, and offline events around the world in more than 60 cities, reaching audiences totaling over 220,000 website owners, webmasters, and digital marketers. In addition, our official support forum has answered a high volume of questions in many languages. Last year, the forum had 63,000 threads generating over 280,000 contributing posts by 100+ Top Contributors globally. For more details, see this post. Apart from the forums, blogs, and the SEO starter guide, the Google Webmaster YouTube channel is another place to find tips and insights. We launched a new SEO snippets video series to provide short, to-the-point answers to specific questions. Be sure to subscribe to the channel!
Despite all these improvements, we know we’re not yet done. We’re relentless in our pursuit of an abuse-free user experience, and we will keep improving our collaboration with the ecosystem to make it happen.
Introducing the Indexing API for job posting URLs
Last June we launched a job search experience that has since connected tens of millions of job seekers around the world with relevant job opportunities from third party providers across the web. Timely indexing of new job content is critical because m…