
Google Webmaster Hangout Notes: June 11th 2019


Notes from the Google Webmaster Hangout on the 11th of June 2019.

 
 

Extending Structured Data With Additional Fields Helps Google Better Understand Entities

You won’t see direct performance increases from extending structured data with additional property fields from schema.org, but this can help to give Google more context around the entities on your site.
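For example, a product page's markup might add optional schema.org properties beyond those required for rich results. A minimal sketch with hypothetical values (every name and URL here is illustrative):

```html
<!-- Hypothetical example: a basic Product entity extended with optional
     schema.org properties (brand, material, countryOfOrigin) that give
     Google extra context about the entity without being required. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail Running Shoe",
  "description": "Lightweight trail running shoe with a grippy outsole.",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "material": "Recycled polyester",
  "countryOfOrigin": { "@type": "Country", "name": "Portugal" }
}
</script>
```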


 

No Need to Optimize Websites Specifically for Quality Raters

You don’t need to optimize your site specifically to be accessible to Google’s Quality Raters, because they won’t be reviewing individual sites. The Quality Raters are simply given two sets of SERPs, one generated with and one without an algorithm change, and they then decide which set of results is better.


 

Make Sure Hosting & Redirects Are Set Up Correctly After Migration so Google Doesn’t Think the Site Is Offline

If part of your website, such as the www. version of your pages, doesn’t respond when Google tries to access it, Google could assume that the site has gone offline. Ensure that redirects and hosting are set up correctly after a migration to prevent this from happening.
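As a quick post-migration check, you can request each hostname and confirm it returns a redirect rather than an error. A minimal sketch, assuming a hypothetical site that redirects www. to the bare domain:

```
$ curl -I http://www.example.com/

HTTP/1.1 301 Moved Permanently
Location: https://example.com/
```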


 

Google Has an Upper Limit of Around 5,000 Internal Links Per Page For Crawling

Sites don’t normally exceed Google’s upper crawl limit for links on a page, as it is quite high at around 5,000 links per page. However, John recommends having only necessary internal links so that PageRank isn’t diluted too much.


 

Look Into Server-side Rendering For Improved UX as Dynamic Rendering is a Temporary Workaround for Crawlers

Dynamic rendering is a temporary workaround that allows search engines and social media crawlers to access content even if they can’t render JavaScript. John foresees dynamic rendering becoming less useful in a few years as all crawlers get better at processing JavaScript, so look into server-side rendering, which also provides an improved experience for users.


 

Ensure Googlebot & Users Both See Same Primary Content to Avoid Cloaking Issues

Make sure that the primary content of a page is the same for both users and search engines, to mitigate the risk of Google seeing your website as implementing cloaking.


 

Use Structured Data Testing Tool, Search Console Help Forum & Schema.org Documentation For Local Business Markup

For implementing local business markup, use the schema.org documentation to see which attributes are required, the Structured Data Testing Tool to check that your implementation works, and the Google Search Console Help Forum to get advice from peers.
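As an illustration of what that process might produce, here is a minimal LocalBusiness sketch (all names, addresses and numbers are hypothetical); the schema.org documentation defines which of these attributes are required for your business type:

```html
<!-- Hypothetical example: LocalBusiness markup with the core name and
     address attributes plus recommended extras such as telephone and
     opening hours. Validate it in the Structured Data Testing Tool. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee House",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "EC1A 1AA",
    "addressCountry": "GB"
  },
  "telephone": "+44 20 7946 0000",
  "url": "https://example.com/",
  "openingHours": "Mo-Fr 08:00-18:00"
}
</script>
```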


 

Get Feedback From Real Users to Assess Your Site’s EAT

John recommends getting feedback from your users about the expertise, authoritativeness and trustworthiness (E-A-T) of the content on your site to find areas for improvement.


 

Self-referencing Canonical Tags Are Best Practice But Not Critical

It’s best practice to use self-referencing canonical tags, as the canonical tag is one of the signals Google uses to select the primary page from a group of detected duplicates. However, it is not essential.
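For example, a page would reference its own URL (hypothetical here) in the head:

```html
<!-- Hypothetical example: a self-referencing canonical tag placed in the
     <head> of https://example.com/blue-widgets/, pointing at its own URL. -->
<link rel="canonical" href="https://example.com/blue-widgets/" />
```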


 

Use the Rich Snippet Spam Result Form to Alert Google to Sites Misusing Structured Data

If you see sites that are misusing structured data and going against Google’s guidelines appearing with rich snippets, you can submit a Rich Snippet Spam Result form to alert Google.


 

Google Can Process Hreflang Tags Regardless of How They’re Implemented

It doesn’t matter to Google whether you implement hreflang tags in the HTML, in the HTTP header, in an XML sitemap or through a combination of these methods. Google will process the information in these tags regardless of the implementation method.
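As an illustration, the same annotation can be expressed in any of the three ways (all URLs are hypothetical):

```html
<!-- 1. In the HTML <head>: -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />

<!-- 2. As an HTTP Link response header (useful for non-HTML files):
     Link: <https://example.com/de/>; rel="alternate"; hreflang="de" -->

<!-- 3. In an XML sitemap (the urlset must declare the xhtml namespace):
     <url>
       <loc>https://example.com/en-gb/</loc>
       <xhtml:link rel="alternate" hreflang="de"
                   href="https://example.com/de/"/>
     </url> -->
```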


 

News Sites Shown in Forum Snippets Can Reformat Comment Sections or Block Comments From Being Crawled

If a news site is being shown in forum snippets and this is problematic for you, either reformat the comments section in a way that demotes the importance of this content, or block the comments from being crawled by Google.
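John didn’t prescribe a mechanism for this, but one way to make comments uncrawlable, sketched under the assumption of a hypothetical /comments/ endpoint, is to load them into an iframe from a URL that robots.txt disallows:

```html
<!-- Hypothetical example: comments are embedded from a separate URL that
     is disallowed for all crawlers in robots.txt:

       User-agent: *
       Disallow: /comments/
-->
<iframe src="/comments/article-123" title="Reader comments"></iframe>
```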


 

The Site Diversity Update Won’t Affect How Subdomains Are Crawled

The new change that was launched to show more diversity of sites in the search results won’t impact the way subdomains are currently crawled and processed; it will only impact how they are shown in the search results.


 

Rachel Costello

SEO & Content Manager

Rachel Costello is a former Technical SEO & Content Manager at Lumar. You'll most often find her writing and speaking about all things SEO.
