
Google Webmaster Hangout Notes: October 19th 2018


Notes from the Google Webmaster Hangout on the 19th of October 2018.
 

If a Site Hasn’t Been Moved to Mobile-first Indexing Yet it Doesn’t Mean it’s Not Ready

Mobile-first indexing has been rolled out across many domains, but if a site hasn’t yet been moved over, that doesn’t necessarily mean it isn’t ready.


 

Homepages That Don’t Rank Well Might be Over-optimised

Instances where a homepage doesn’t rank well can sometimes be due to over-optimisation, with Google detecting keyword stuffing in the content.


 

Use Author Profile Pages & Article Markup to Inform Google About Content Authors

While authorship and Google Plus have been deprecated, you can let Google know about the authors of your content with author profile pages and by implementing article markup on your site.
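As a rough illustration, the Python sketch below renders schema.org Article markup as a JSON-LD script tag, with the author linked to their profile page. The headline, name and URLs are placeholders rather than values from the hangout.

```python
import json

# Minimal sketch: schema.org Article markup that names the content author
# and points to their profile page, output as a JSON-LD script tag.
# The headline, name and URLs below are placeholders.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example article headline",
    "author": {
        "@type": "Person",
        "name": "Jane Author",
        "url": "https://www.example.com/authors/jane-author",  # author profile page
    },
}

print('<script type="application/ld+json">')
print(json.dumps(article_markup, indent=2))
print("</script>")
```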


 

Geotargeting AMP is Possible But Difficult to Implement

Theoretically, you can geotarget AMP pages to countries with poor connection speeds. However, this wouldn’t be easy to implement, as Google tries to keep a global view by associating an AMP version with the normal version of the page.


 

Including a Physical Address on a Site is Not a Ranking Signal

Google doesn’t use the inclusion of a physical address as a sign of a good website; however, it can be important for users.


 

Check URLs Match Exactly When GSC Reports a URL is Not in a Sitemap

If the URL Inspection tool says a page is indexed but not submitted in the XML sitemap, John Mueller recommends checking that the exact URL seen in Search Console is present in the sitemap. For example, check that there are no differences in trailing slashes or the case used.
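A quick way to run this check is to compare the URL exactly as Search Console reports it against the <loc> entries in the sitemap. The Python sketch below does this and also flags near-misses that differ only by case or a trailing slash; the sitemap and page URLs are placeholders.

```python
from urllib.request import urlopen
import xml.etree.ElementTree as ET

# Placeholder URLs for illustration.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
GSC_URL = "https://www.example.com/Blog/post-1/"  # URL exactly as shown in Search Console

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

sitemap_urls = [loc.text.strip() for loc in tree.iterfind(".//sm:loc", NS)]

if GSC_URL in sitemap_urls:
    print("Exact match found in the sitemap.")
else:
    # Flag entries that differ only by case or a trailing slash.
    near_misses = [u for u in sitemap_urls
                   if u.lower().rstrip("/") == GSC_URL.lower().rstrip("/")]
    print("No exact match. Near-miss entries:", near_misses or "none")
```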


 

Invalid HTML Can Impact Google’s Ability to Understand Structured Data

Invalid HTML can make it difficult for Googlebot to understand which parts of a page belong together for structured data.


 

JavaScript Snippets Can Close Head & Cause Elements Below to be Overlooked

Be careful when adding JavaScript snippets to the head of a page, as this can cause the head to close prematurely, meaning elements further down, such as hreflang links, are treated as part of the body and not processed.
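One way to spot this is to parse the page with an HTML5-compliant parser and check where the hreflang links end up. The sketch below assumes BeautifulSoup with the html5lib parser, which applies browser-style parsing rules; the URL is a placeholder.

```python
import requests
from bs4 import BeautifulSoup  # assumes beautifulsoup4 and html5lib are installed

# Placeholder URL for illustration.
url = "https://www.example.com/"
html = requests.get(url, timeout=10).text

# html5lib follows HTML5 parsing rules, so elements that a browser would push
# out of a prematurely closed <head> will also end up in <body> here.
soup = BeautifulSoup(html, "html5lib")

for link in soup.find_all("link", hreflang=True):
    in_head = link.find_parent("head") is not None
    print(link.get("hreflang"), "in <head>" if in_head else "moved to <body>")
```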


 

Google Only Crawls Hashed URLs When Unique Content Detected

Google doesn’t usually crawl hashed URLs unless it has detected over time that there is something unique it needs to pick up from the hashed version of the page.
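For context, the fragment after the # is handled client-side and isn’t sent to the server, which is why Google generally treats a hashed URL as the same page as its base URL. A small Python illustration, with a placeholder URL:

```python
from urllib.parse import urlsplit

url = "https://www.example.com/guide#chapter-2"
parts = urlsplit(url)

# The fragment never reaches the server; the fetched resource is the base URL.
print(parts.fragment)                        # 'chapter-2'
print(parts._replace(fragment="").geturl())  # 'https://www.example.com/guide'
```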


 

Compressing Sitemaps Saves Bandwidth But Doesn’t Reduce Processing Time

Compressing sitemap files using Gzip can save bandwidth but doesn’t impact the speed that Googlebot processes these files.
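For example, an existing sitemap file can be gzipped with a few lines of Python (filenames are placeholders); the compressed file is what you would then reference in robots.txt or submit in Search Console.

```python
import gzip
import shutil

# Compress an existing sitemap to sitemap.xml.gz; filenames are placeholders.
with open("sitemap.xml", "rb") as src, gzip.open("sitemap.xml.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)
```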


 

Use X-Robots-Tag HTTP Header to Noindex Indexed Sitemap Files

If sitemap files are indexed and appear for normal search queries, then you can use the X-Robots-Tag HTTP header to apply noindex to all URLs ending in .xml or .gz.
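The header itself is normally added in the web server configuration for paths ending in .xml or .gz. As a rough check, the Python snippet below verifies that the noindex header is being returned for a couple of hypothetical sitemap URLs.

```python
import requests

# Placeholder sitemap URLs; checks whether the noindex header is returned.
urls = [
    "https://www.example.com/sitemap.xml",
    "https://www.example.com/sitemap.xml.gz",
]

for url in urls:
    headers = requests.head(url, allow_redirects=True, timeout=10).headers
    print(url, "->", headers.get("X-Robots-Tag", "header missing"))
```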


 


Sam Marsden

SEO & Content Manager

Sam Marsden is Lumar's former SEO & Content Manager and currently Head of SEO at Busuu. Sam speaks regularly at marketing conferences, like SMX and BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.
