Notes from the Google Webmaster Hangout on the 12th of December 2017.

Google TLS Warnings Won't Impact Rankings

TLS warnings are sent in Google Search Console (GSC) to alert webmasters to common issues, including HTTPS or certificate configuration problems. Receiving one of these warnings doesn’t impact rankings.

Drop in Indexed Pages Could be Google Removing Duplicates

Google recognises duplicate pages within the GSC Index Status report. As URLs are reprocessed over time, Google realises that the duplicates don’t need to be kept, so this can be one reason behind a drop in indexed pages; it is commonly seen following HTTPS migrations.

Use Several Smaller Sitemaps to Locate Indexing Issues

Having several smaller sitemaps for the different sections of your site is recommended for diagnosing indexing issues.
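As an illustration, one way to do this is a sitemap index that references a separate sitemap per site section, so indexing counts can be compared section by section; the file names below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index: one child sitemap per site section -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-blog.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-categories.xml</loc></sitemap>
</sitemapindex>
```

Submitting the sitemaps this way lets you see which section’s URLs are failing to be indexed, rather than one aggregate figure for the whole site.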

New Search Console Will Show More Sitemap Data

The new Search Console will show more detailed sitemap information, with a greater level of detail per sitemap file.

Nofollow Obsolete When Noindex Already Present

When a page is noindexed, not only is it removed from the index, but over time all of the links associated with the page are also removed from the link graph, so a nofollow directive on that page is made obsolete.
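For reference, the noindex directive being discussed is the standard robots meta tag; per the advice above, noindex alone is sufficient:

```html
<!-- "noindex, nofollow" is effectively redundant here: once the page is
     noindexed, its links are dropped from the link graph over time anyway -->
<meta name="robots" content="noindex">
```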

Low Quality Page Signals Aren't Kept if Page is Blocked From Crawling

Negative signals from a low-quality page aren’t held against the site if Google has been blocked from crawling it. Once the page is dropped from the index, the site will be re-evaluated as it continues to be crawled.

Pages That Redirect to Interstitials Are Dropped from the Index

Make sure pages don’t redirect to interstitials and that their content is kept on the same URL, or Googlebot could drop these pages from the index.

Rewrite Longer Meta Descriptions to Showcase USPs & Best Fulfil Query

Google tries to use the meta description provided, but if more information or context is needed to fulfil the user’s query, then other sections of the page’s content will be included as well. There are no ranking benefits from tweaking or lengthening descriptions, but they should explain what your service is and showcase your USPs.
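As a sketch, a description that leads with the service and its USPs might look like this (the copy is invented for illustration):

```html
<!-- Hypothetical example: states what the service is, then the USPs -->
<meta name="description" content="Handmade leather bags with free UK delivery and a 10-year repair guarantee. Order by 2pm for next-day dispatch.">
```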

Geo-targeted Sites Should Feature Content Which Isn't Location-Specific

Googlebot crawls from California, so if you make all of your content geo-specific, it will only ever see content about California and won’t realise you have a global website, which can affect rankings for other locations. John recommends keeping a significant part of your site general. The same applies to personalisation.

Remove CSS Stretching Images to Max Viewport to See How Full Page Renders

If the CSS sets images to the maximum viewport size, this stretches them and stops you from seeing how the rest of the page renders in Fetch as Google. Note, however, that this view doesn’t represent what Googlebot actually renders.
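The kind of rule being described would look something like the first declaration below; a less disruptive alternative follows (the class name is hypothetical):

```css
/* Problematic: stretching the image to the full viewport pushes the rest
   of the page out of view in the Fetch as Google preview */
.hero img {
  width: 100vw;
  height: 100vh;
}

/* Alternative: let the image scale with its container instead */
.hero img {
  max-width: 100%;
  height: auto;
}
```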