Google Can Crawl Different Parts of a Website at Different Speeds
Google is able to detect how frequently the different sections of a site are updated and crawl them at different speeds, so that the frequently changing pages are crawled more regularly.
Google Determines if Pages Need to be Rendered by Comparing Content Found in Initial HTML & Rendered DOM
Google compares the content of the raw HTML of a page from the initial crawl to the rendered DOM after rendering to see if there is new content and to determine if it needs to be rendered going forward.
URL Removal Tool Hides Pages But Doesn’t Impact Crawling or Indexing
The URL Removal Tool only hides a page from the search results. Nothing is changed with regard to the crawling and indexing of that page.
Using International IP Redirects Will Prevent Google From Finding Other Versions of A Site
If you redirect users based on their international IP addresses, Googlebot (which largely crawls from US IPs) is likely to only ever see the redirect to the English version, and Google would drop all of the other language versions.
External Links Are More Critical for Initial Content Discovery & Crawling
External links are useful for helping Google find and crawl new websites, but they become less important to Google once it has already discovered the site in question.
Images Implemented Via Lazy Loading Can be Used Like Any Other Image on a Page
Images implemented via a lazy loading script can be added to structured data and sitemaps without any issues, as long as they are embedded in a way that Googlebot is able to pick up.
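As an illustration, one lazy loading pattern Googlebot can pick up is the native `loading="lazy"` attribute, which keeps the image URL in the initial HTML, with the same URL reused in structured data. This is a minimal sketch, not a statement of John's recommended markup, and the URLs are hypothetical:

```
<!-- Native lazy loading: the URL stays in the src attribute of the initial HTML -->
<img src="https://example.com/images/product.jpg"
     loading="lazy" alt="Product photo">

<!-- The same image URL referenced in structured data (JSON-LD) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ImageObject",
  "contentUrl": "https://example.com/images/product.jpg"
}
</script>
```

Script-based approaches that only populate a `data-src` attribute are riskier, because the real URL may never appear in markup that Googlebot processes.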
Google Doesn’t Need To Be Able To Crawl The Add to Cart Pages of A Site
It is not essential for Google to crawl the add to cart pages on e-commerce sites, so this shouldn’t affect a site’s performance in search for purchase intent queries.
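Because Google does not need these URLs, some sites choose to block them from crawling entirely. A minimal robots.txt sketch, assuming a hypothetical /cart/ path for the add to cart pages:

```
# Hypothetical example: keep Googlebot out of add-to-cart URLs
User-agent: *
Disallow: /cart/
```

This is one possible setup, not something prescribed in the hangout; the right paths depend on how a given e-commerce platform structures its cart URLs.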
Googlebot Does Crawl From a Handful of Regional IPs
Googlebot does crawl from a small number of regional IPs, particularly in countries where Google knows it is hard to crawl from the US.
An Updated User Agent is Expected to Reflect The New Modern Rendering Infrastructure
Google has been experimenting with the current user agent settings and is re-thinking the setup. John expects some changes to be announced in the future around an updated user agent, so that it reflects the new, modern rendering infrastructure.