Google Webmaster Hangout Notes: June 2nd 2017

Sam Marsden

On 11th June 2017 • 3 min read

Notes from the Google Webmaster Hangout on the 2nd of June, 2017.

 

Report Sites Scraping Your Content to Google on a Page-by-Page Basis

If your site has been scraped, you can submit a DMCA takedown request to the website’s hosting service and to Google, so that Google’s legal team can investigate. This must be done at the page level and cannot be done at the site level.

 

Redirect Desktop Users From AMP Pages to Desktop Pages

It’s not a problem to redirect non-mobile users from an AMP page to its desktop canonical, but be careful when redirecting desktop Googlebot.
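As a rough illustration of the approach, the sketch below decides whether a request for an AMP URL should be redirected to its desktop canonical based on the user agent. The URL mapping, function name, and user-agent pattern are all hypothetical, not anything Google prescribes.

```python
import re
from typing import Optional

# Hypothetical mapping from AMP URLs to their desktop canonicals;
# in practice this would come from your CMS or routing layer.
AMP_TO_CANONICAL = {
    "/amp/article-1": "/article-1",
}

# Crude mobile detection for illustration only.
MOBILE_UA = re.compile(r"Mobile|Android|iPhone|iPad", re.IGNORECASE)

def desktop_redirect_target(path: str, user_agent: str) -> Optional[str]:
    """Return the desktop canonical URL if a non-mobile visitor requests
    an AMP page, or None if no redirect should happen."""
    canonical = AMP_TO_CANONICAL.get(path)
    if canonical is None:
        return None  # not an AMP page
    if MOBILE_UA.search(user_agent):
        return None  # mobile visitors stay on the AMP page
    # Non-mobile visitors are sent to the desktop canonical. Note that
    # desktop Googlebot also takes this branch, so per the hangout advice,
    # make sure the redirect target really is the page's canonical.
    return canonical
```

The key design point is that the redirect target is always the AMP page’s own canonical, so desktop Googlebot is never sent somewhere unrelated.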

 

Google Understands Synonyms Algorithmically Through Search Behaviour

Google tries to understand words that are synonyms of each other algorithmically, such as those using diacritic or accented characters, based on what people are searching for. Google doesn’t have a linguistic model that determines how characters and words map to each other.

 

Split Up Sitemaps to Identify Pages Indexed by Google

There is no report that shows which specific URLs are indexed in Google. To narrow down which URLs have been indexed, you can split the sitemap into smaller parts and compare the indexed counts for each. However, you shouldn’t focus on getting high numbers of URLs indexed, but rather on the relevance of the indexed pages and content.
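A minimal sketch of the splitting step: chunk a flat URL list into several small sitemap files, so that each sitemap’s indexed count in Search Console points at a specific slice of the site. The function name and chunk size are arbitrary choices for illustration.

```python
def split_sitemap(urls, chunk_size=1000):
    """Split a flat list of URLs into several small sitemap documents.
    Smaller sitemaps mean Search Console's per-sitemap indexed counts
    reveal which sections of the site are actually indexed."""
    sitemaps = []
    for i in range(0, len(urls), chunk_size):
        chunk = urls[i:i + chunk_size]
        entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>\n"
        )
    return sitemaps
```

In practice you would group URLs by site section (e.g. products vs. blog posts) rather than by position in the list, so a low indexed count maps to a meaningful part of the site.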

 

Ensure Dynamically Served Sites Show Full Content on Mobile for Mobile-First

With dynamic serving, Googlebot might not see the full content when Google moves to mobile-first indexing, which would mean less content available for indexing. Responsive design is less of an issue, as it serves the same HTML across devices.
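One way to check for this on a dynamically served site is to compare how much text the mobile variant carries relative to the desktop variant. The sketch below is a rough word-count heuristic with an arbitrary 90% threshold, not an official Google check.

```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collects visible text nodes from an HTML document."""
    def __init__(self):
        super().__init__()
        self.words = []

    def handle_data(self, data):
        self.words.extend(data.split())

def word_count(html: str) -> int:
    parser = _TextExtractor()
    parser.feed(html)
    return len(parser.words)

def mobile_parity_ok(desktop_html: str, mobile_html: str,
                     threshold: float = 0.9) -> bool:
    """Flag dynamically served pages whose mobile variant carries
    noticeably less text than the desktop variant -- a rough proxy
    for content that would be lost under mobile-first indexing."""
    desktop = word_count(desktop_html)
    if desktop == 0:
        return True
    return word_count(mobile_html) / desktop >= threshold
```

Pages that fail the check are candidates for serving the missing content to mobile user agents before mobile-first indexing applies.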

 

Google Algorithm Changes Are Applied Universally to All Languages

When Google makes algorithm changes, they try to apply them across all sites regardless of differences like language. There are exceptions: features such as rich snippets, structured data, the Knowledge Graph, business-related information, and streaming services can be trickier for legal reasons, so updates to these can’t always roll out everywhere at the same time.

 

Search Console Crawl Errors May Take a Year to Update

It can take some time for Search Console to report crawl errors and to detect when they have been fixed, potentially up to a year for some URLs.

 

Crawl Rate Increases After Site-Wide Changes

When Google detects site-wide changes, it crawls the site a bit faster than it normally would.

 

Google Recommends Pre-Rendered HTML With JavaScript Frameworks

Google recommends using isomorphic JavaScript, or Angular Universal with Angular, so that the first page is pre-rendered as HTML. This means the page loads very quickly and search engines don’t have to process JavaScript.

 

An Increase in 5XX Errors May Reduce the Crawl Rate

Google’s crawling systems will recognise an increase in server errors (5XX errors), assume they’re crawling too hard, and step back accordingly. If the URLs returning the errors are not valid pages, consider returning a 4XX status instead.
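The distinction can be sketched as a simple status-picking function: reserve 5XX for genuine server trouble, and answer requests for invalid URLs with a 404 so they don’t throttle crawling. The route table and function name here are hypothetical.

```python
# Hypothetical set of paths the site actually serves.
VALID_PATHS = {"/", "/about", "/products"}

def status_for(path: str, backend_up: bool = True) -> int:
    """Pick a response status with crawl behaviour in mind: a rise in
    5XX responses tells Googlebot to back off site-wide, so reserve
    5XX for real outages and answer unknown URLs with a 404."""
    if not backend_up:
        return 503  # genuine outage: a temporary 5XX is appropriate
    if path not in VALID_PATHS:
        return 404  # invalid page: a 4XX won't slow down crawling
    return 200
```

The practical takeaway is to audit which URLs are producing 5XX responses: if they were never real pages, the fix is a 404, not a server-side repair.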

 

Structured Data Penalties Don't Affect Rankings

Penalties for spammy structured data only affect rich snippets, not the ranking of the site.

Author

Sam Marsden

Sam Marsden is DeepCrawl's SEO & Content Manager. Sam speaks regularly at marketing conferences, like SMX and BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.

 
