Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website in a day, and crawl rate limits were introduced to reduce server overload. Google's algorithms are able to determine and set an optimal crawl budget for individual sites; this is covered in our Hangout Notes below, along with further best practice advice.

Google Increases Crawl Rate For Frequently Changing Pages

June 15, 2018 Source

Google tries to recognise when the content on a page changes, as this helps it understand how quickly the page needs to be recrawled over time. If a page changes significantly on a regular basis, Google will crawl that page more frequently.


Googlebot Can Recognise Faceted Navigation & Slow Down Crawling

April 3, 2018 Source

Googlebot understands URL structures well and can recognise faceted navigation, and it will slow down crawling once it works out where the primary content is and where it has strayed from it. This is aided by parameter handling settings in Google Search Console.
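
As an illustration (the domain and parameters here are hypothetical), a category page and the parameterised variants generated by its faceted navigation might look like the following, with Googlebot learning to focus its crawling on the first URL:

    https://example.com/shoes                               (primary content)
    https://example.com/shoes?colour=red                    (facet: colour)
    https://example.com/shoes?colour=red&size=9             (facets: colour + size)
    https://example.com/shoes?colour=red&size=9&sort=price  (facets + sort order)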


Googlebot Crawls Known URLs Faster Following Significant Site Changes

March 9, 2018 Source

Googlebot will crawl known URLs faster if it detects that significant changes have been made across the site to things like structured data, rel=canonical tags or redirects.


Crawl Rate Issues Can Be Reported For Manual Review

February 1, 2018 Source

If a site is being crawled too much or too little, you can report a problem with Googlebot in the Help Center, and the crawl rate can then be adjusted by a Google engineer.


Google Temporarily Reduces Crawl Rate After A Platform Change

February 1, 2018 Source

When a site changes its hosting platform, Google will be more conservative about how much it crawls, as it doesn't know how much load the new server setup will support.


Images Crawled Less Frequently Than Pages

February 1, 2018 Source

Images are crawled less frequently than page content, so if they are migrated, to a CDN for example, it will take Googlebot longer to recrawl them.


Crawl Frequency Attribute in XML Sitemaps Doesn’t Impact Crawl Rate

December 15, 2017 Source

Google takes no notice of the change frequency attribute in XML sitemaps, or of any priority values set. Only the last modification timestamp will impact crawl rate.
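
For illustration, a sitemap entry for a hypothetical URL might look like the following; based on the advice above, the changefreq and priority elements would be ignored, while the lastmod timestamp can influence when the URL is recrawled:

    <url>
      <loc>https://www.example.com/page</loc>
      <lastmod>2017-12-01</lastmod>       <!-- used as a recrawl signal -->
      <changefreq>daily</changefreq>      <!-- ignored by Google -->
      <priority>0.8</priority>            <!-- ignored by Google -->
    </url>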


Google Uses Scheduler to Determine Recrawl Date

December 15, 2017 Source

Google uses a scheduler before crawling to work out when it needs to recrawl URLs. Google will increase crawl rate if it gets signals that it needs to do so, e.g. an updated modification date in sitemaps or internal linking (especially from the homepage).


Setting A Higher Crawl Rate Doesn’t Guarantee Google Will Crawl More

November 28, 2017 Source

Setting your crawl rate to a higher value means Google can crawl more, but that doesn't mean it will.


Related Topics

Crawling, Indexing, Crawl Budget, Crawl Errors, Disallow, Sitemaps, Last Modified, Nofollow, Noindex, RSS, Canonicalization, Fetch and Render