Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website in a day, and the concept was introduced to reduce server overload. Using sophisticated algorithms, Google is able to determine and set an optimal crawl budget for each individual site; this is covered in our Hangout Notes below, along with further best practice advice.

Crawl Frequency Attribute in XML Sitemaps Doesn’t Impact Crawl Rate

December 15, 2017 Source

Google takes no notice of the change frequency (changefreq) attribute in XML sitemaps, nor of any priority values that are set. Only the last modification (lastmod) timestamp will impact crawl rate.
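As a quick illustration, here is a minimal Python sketch of a sitemap entry built around lastmod; the URL and date are placeholders, not examples from the Hangout.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=NS)
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "https://www.example.com/blog/post-1"
# <lastmod> is the one field Google pays attention to when scheduling recrawls.
ET.SubElement(url, "lastmod").text = "2017-12-15"
# <changefreq> and <priority> could be added here as well, but since Google
# ignores them they are left out.

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```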


Google Uses Scheduler to Determine Recrawl Date

December 15, 2017 Source

Google uses a scheduler before crawling to work out when it needs to recrawl URLs. Google will increase the crawl rate if it gets signals that it needs to do so, e.g. an updated modification date in sitemaps or internal linking (especially from the homepage).


Setting A Higher Crawl Rate Doesn’t Guarantee Google Will Crawl More

November 28, 2017 Source

Setting your crawl rate to a higher value means Google can crawl more, but that doesn't mean it will.


Views of Cached AMP Pages Won’t Appear in Log Files

October 31, 2017 Source

AMP pages may make log files look unusual: Googlebot requests the page in order to update the AMP cache, but the page will appear to receive little traffic because users are served the cached version rather than requesting the page from your server.
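One way to see this pattern is to split requests for AMP URLs in your access log by user agent. A rough sketch, assuming a plain-text log named access.log and AMP pages served under an /amp/ path (both assumptions to adapt for your own setup):

```python
from collections import Counter

counts = Counter()
with open("access.log") as log:          # assumed log file name
    for line in log:
        if "/amp/" not in line:          # assumed AMP URL pattern
            continue
        # Googlebot fetches AMP pages to refresh the cache; real users are
        # served from the cache, so their views never reach this log.
        agent = "googlebot" if "Googlebot" in line else "other"
        counts[agent] += 1

print(counts)  # expect Googlebot requests to dominate for AMP URLs
```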


Moving to a CDN May Temporarily Reduce Crawl Rate

October 6, 2017 Source

If you move your site to a CDN, Google sees it as a new hosting setup and may temporarily reduce the crawl rate.


A Noindex Reduces Crawl Rate

October 6, 2017 Source

A page with a noindex tag will be crawled less frequently.
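For reference, noindex can be applied either as a robots meta tag in the page's head or as an X-Robots-Tag response header. A minimal sketch of the header approach using Python's built-in HTTP server (the port and page content are placeholders):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoindexHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Equivalent to <meta name="robots" content="noindex"> in the HTML head.
        self.send_header("X-Robots-Tag", "noindex")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<html><body>Page kept out of the index</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), NoindexHandler).serve_forever()
```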


Parameters Are a Hint to Google Not to Crawl as Frequently

September 22, 2017 Source

URL parameter settings in Search Console aren't a directive; they are more of a hint to Google not to crawl those URLs as frequently. Blocking URLs in robots.txt is a strong directive saying that Google should not be looking at these pages at all, e.g. if Google's crawling is crashing your site.
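The contrast is easiest to see with robots.txt itself, where a Disallow rule is an outright instruction not to fetch matching URLs. A small sketch using Python's built-in robots.txt parser (the rules and URL are illustrative only):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules blocking faceted search pages outright.
rules = [
    "User-agent: *",
    "Disallow: /search",
]

parser = RobotFileParser()
parser.parse(rules)

# A Disallow rule is a directive: Googlebot will not fetch matching URLs at all,
# whereas a parameter setting in Search Console only hints at crawling them less.
print(parser.can_fetch("Googlebot", "https://www.example.com/search?colour=red"))  # False
```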


Submit Sitemap With Updated Last Modification Date For Faster Crawling of Updated Pages

August 11, 2017 Source

Submit a sitemap file with an updated last modification date to speed up the crawling and indexing of pages that have been changed.
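As a rough sketch of keeping that date accurate (the file paths and URLs are assumptions), the lastmod value can be taken from the time each underlying page file was actually modified, so the sitemap only signals a change when the content really changed:

```python
import os
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Hypothetical mapping of page files on disk to their public URLs.
pages = {
    "site/blog/post-1.html": "https://www.example.com/blog/post-1",
    "site/blog/post-2.html": "https://www.example.com/blog/post-2",
}

urlset = ET.Element("urlset", xmlns=NS)
for path, loc in pages.items():
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    # Take <lastmod> from the file's real modification time, so it only
    # moves forward when the page actually changes.
    modified = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
    ET.SubElement(url, "lastmod").text = modified.strftime("%Y-%m-%d")

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```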


Google Recommends Keeping Redirects For At Least One Year

August 11, 2017 Source

Google recommends keeping redirects in place for at least one year. This is because, in the worst-case scenario, a URL may only be crawled by Googlebot once every six months, so having the redirect in place for a year gives Google the chance to see that redirect at least twice.


Related Topics

Crawling, Indexing, Crawl Budget, Crawl Errors, Disallow, Sitemaps, Last Modified, Nofollow, Noindex, RSS, Canonicalization, Fetch and Render