
Google Webmaster Hangout Notes: December 21st 2018


Notes from the Google Webmaster Hangout on the 21st of December 2018.

Image Sitemaps Help Google Understand Which Images You Want to Be Indexed

Google can find images to index in a page's source code, but an image sitemap helps it confirm which images you want to be indexed.
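
As a minimal sketch, an image sitemap entry lists the images that belong to a page using Google's image sitemap extension (the URLs here are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/product-page</loc>
    <!-- Each image:image entry flags an image you want indexed -->
    <image:image>
      <image:loc>https://www.example.com/images/product-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```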

Google Chooses a Higher Resolution Image Where Multiple Versions Exist

If you have both high- and low-resolution versions of an image, Google will prefer the higher-resolution version.
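
For illustration, one common way both versions appear in markup is via srcset, which gives Google a higher-resolution candidate to choose from (file names are hypothetical):

```html
<!-- Both resolutions are exposed; Google prefers the higher one -->
<img src="/images/product-800.jpg"
     srcset="/images/product-800.jpg 1x,
             /images/product-1600.jpg 2x"
     alt="Product photo">
```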

Google Cache Shows Static HTML, Not JavaScript Rendered HTML

Google's cache shows the static HTML version of a page, so if the whole page is built with JavaScript, it may not appear in the cache.

Google Re-Evaluates Pages if the URL Changes

If you move a page to a new URL, Google will re-evaluate it and the rankings may change temporarily.

Expect to See Rankings Drop When Sites Are Merged

If you merge websites, you should expect some rankings to drop, as Google doesn't just add up all the signals; the pages need to be re-evaluated on the new site. Ensuring you have correctly redirected every page helps those signals move over.
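
As a minimal sketch, a permanent (301) redirect for each old URL signals where the page has moved (Apache .htaccess shown; domains and paths are illustrative):

```apache
# 301 redirects map each page on the old site to its new home
Redirect 301 /old-page https://www.newsite.example.com/new-page
Redirect 301 /about    https://www.newsite.example.com/about-us
```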

Unlinked Brand Mentions Don’t Pass Any Value

If a web page mentions your brand, but doesn’t link to your site, it won’t pass any value.

Block Staging Sites From Being Crawled by Google

You should block Google from crawling and indexing your staging site, as a staging site appearing in search results can cause problems. You can block access based on Googlebot's user agent, or disallow the site in robots.txt.
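
A minimal robots.txt on the staging host, for example, disallows all crawling (assuming the staging site is served from its own hostname):

```
# robots.txt on staging.example.com - blocks all crawlers
User-agent: *
Disallow: /
```

Note that robots.txt stops crawling, but a blocked URL can still end up indexed if other pages link to it, so server-side access control (for example, HTTP authentication) is the more watertight option.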

Structured Data Added via JavaScript Will Take Longer to Process

Google will pick up structured data markup added via JavaScript, such as through Google Tag Manager, but there may be a delay before Google renders the page with JavaScript. Because JavaScript-injected markup can make technical problems harder to diagnose, John recommends including structured data directly in the page.
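
For example, the same markup injected through a tag manager can instead be placed directly in the page's HTML as JSON-LD (the values here are illustrative):

```html
<!-- JSON-LD embedded in the static HTML, so Google can read it
     without waiting for JavaScript rendering -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```

Placed in the static HTML like this, the markup is available as soon as Google crawls the page, with no dependence on rendering.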

Unresolved Manual Action Penalties May Take Years to Expire

If you ignore a manual action penalty, it may take years to expire. Rather than waiting it out, you should fix the cause and submit a reconsideration request.

Crawl Budget Limitations May Delay JavaScript Rendering

Sometimes the delay in Google's JavaScript rendering is caused by crawl budget limitations. Google is actively working on reducing the gap between crawling pages and rendering them with JavaScript, but that will take some time, so they recommend dynamic rendering, hybrid rendering, or server-side rendering for sites with a lot of content.
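
As a rough sketch of dynamic rendering, a server can route bot user agents to a pre-rendering service while normal visitors get the JavaScript app. The nginx config below assumes a hypothetical prerender service listening on port 3000; this is one possible setup, not a prescribed one:

```nginx
# Send Googlebot to a pre-rendered HTML service; everyone else
# receives the client-side JavaScript application.
map $http_user_agent $is_bot {
    default        0;
    ~*googlebot    1;
}

server {
    listen 80;
    server_name www.example.com;

    location / {
        # Hypothetical prerender service on port 3000
        if ($is_bot) {
            proxy_pass http://127.0.0.1:3000;
        }
        try_files $uri /index.html;
    }
}
```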

Google Supports X-Robots-Tag Noindex to Block Images for Googlebot

Google respects an X-Robots-Tag: noindex directive in image response headers, so you can use it to keep specific images out of Google's index.
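
For instance, on Apache (with mod_headers enabled), a noindex header can be attached to image responses; the file extensions below are illustrative:

```apache
# Send "X-Robots-Tag: noindex" on image files so Google
# excludes them from its image index
<FilesMatch "\.(png|jpe?g|gif|webp)$">
    Header set X-Robots-Tag "noindex"
</FilesMatch>
```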

Rachel Costello

SEO & Content Manager

Rachel Costello is a former Technical SEO & Content Manager at Lumar. You'll most often find her writing and speaking about all things SEO.
