Notes from the Google Webmaster Hangout on the 11th of January 2019.
Google Will Deprecate Some Features From Old Version of GSC
Google will continue to migrate features from the old version of Search Console to the new version; however, some features, such as the Crawl Errors section, won't be migrated and will be deprecated.
Googlebot May Not be Able to Properly Crawl Server-side Rendered Pages Where Only the Content is Rendered
Google is able to crawl server-side rendered pages if all functionality is available in the static HTML. However, if only the content is rendered and the links and structured data aren't, then Google won't be able to crawl the site as effectively.
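As an illustrative sketch (the URLs and values below are placeholders, not from the hangout), a server-rendered page should include its links and structured data in the initial HTML response, not inject them later with JavaScript:

```html
<!-- Hypothetical server-rendered response: the links and structured data
     are present in the static HTML that Googlebot receives. -->
<!DOCTYPE html>
<html>
  <head>
    <title>Example Article</title>
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example Article"
    }
    </script>
  </head>
  <body>
    <article>Example article content rendered on the server.</article>
    <!-- Crawlable link rendered server-side, not added client-side -->
    <a href="/related-article">Related article</a>
  </body>
</html>
```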
Google Improving Warnings in GSC to Trigger Based on Stable Version of Site
Google is working on improving Search Console so that warnings are triggered based on the stable version of a website. John has seen instances where Google is temporarily unable to fetch resources, which triggers a warning in Search Console.
Google Treats CDN Hosted Sites Same as Non-CDN Hosted Sites
Sites hosted using a CDN aren't treated any differently from non-CDN sites, as Google sees a CDN as simply another way of hosting a site.
Google Treats XML Sitemaps Differently From HTML Pages
Google treats XML sitemaps differently from HTML pages, as they are a machine-readable file and not meant to be indexed by search engines.
Google Doesn’t Mind How Sitemaps Are Split up
Google combines separate sitemaps together so that they can be processed. This means it is up to webmasters to decide how they want to split up sitemaps.
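For example, split sitemaps can be listed in a single sitemap index file; Google fetches and combines them regardless of how the URLs are divided (the file names below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index referencing several split sitemaps. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```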
GSC Mobile Usability & Structured Data Reports Feature Representative Sample of URLs
The Mobile Usability and Structured Data reports in Search Console are based on a representative sample of the pages on your site and aren't meant to be a comprehensive list.
ccTLDs Can be Shown to Global Audience But Can’t Geotarget Other Countries
Sites with a ccTLD can be relevant to countries outside of the one associated with the TLD. However, a ccTLD won't be able to geotarget other specific countries.
Pages Blocking US Access Also Need to Block Googlebot to Avoid Cloaking
If you need to block content from being accessed in the US or California, then you would need to block Googlebot as well; otherwise, Google will see this as cloaking. One option might be to provide some general information that can be seen by visitors in the US.
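As a minimal sketch of this logic (the region detection and user-agent matching below are illustrative assumptions, not a Google-documented API), the key point is that Googlebot must receive the same restricted response as blocked US visitors:

```python
# Sketch: decide whether a request should get the restricted response.
# The region value and user-agent check are illustrative assumptions.

def should_block(region: str, user_agent: str) -> bool:
    """Block US visitors AND Googlebot, so the crawler sees the same
    restricted response as US users, avoiding cloaking."""
    blocked_regions = {"US"}
    if region in blocked_regions:
        return True
    # Googlebot crawls primarily from the US, so treat it the same way.
    if "Googlebot" in user_agent:
        return True
    return False

# A US visitor and Googlebot both get the restricted response;
# other visitors see the full content.
print(should_block("US", "Mozilla/5.0"))    # True
print(should_block("DE", "Googlebot/2.1"))  # True
print(should_block("DE", "Mozilla/5.0"))    # False
```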
Google May Filter Out Parts of Meta Descriptions if Misleading or Spammy
Google may filter out parts of meta descriptions if they are deemed to be spammy or misleading.