Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website in a day, and was introduced to reduce server overload. Using sophisticated algorithms, Google is able to determine and set an optimal crawl rate for individual sites. This is covered in our Hangout Notes below, along with further best practice advice.
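One practical way to observe your own crawl rate is to count Googlebot requests per day in your server access logs. The sketch below assumes combined-format log lines and matches on the User-Agent string (hypothetical sample lines; real Googlebot traffic should also be verified via reverse DNS, since the UA can be spoofed):

```python
import re
from collections import Counter

# Hypothetical combined-format access log lines for illustration.
LOG_LINES = [
    '66.249.66.1 - - [01/Feb/2018:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [01/Feb/2018:10:05:00 +0000] "GET /about HTTP/1.1" 200 734 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [01/Feb/2018:10:06:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

DATE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(lines):
    """Count requests per day where the User-Agent claims to be Googlebot."""
    counts = Counter()
    for line in lines:
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                counts[match.group(1)] += 1
    return dict(counts)

print(googlebot_hits_per_day(LOG_LINES))  # {'01/Feb/2018': 2}
```

Comparing these daily counts over time is a simple way to spot the crawl-rate changes discussed in the notes below.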

Google May Take Longer to Process Redirects For Noindexed Pages

July 27, 2018 Source

Google crawls noindexed pages less frequently. If a redirect is set up on a noindexed page, Google may take longer to process it, because the page is being crawled less often.
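If you are redirecting pages and want to know which ones were previously noindexed (and so may be reprocessed more slowly), you can scan their old HTML for a robots meta tag. A minimal sketch using only the standard library:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Sets .noindex when a <meta name="robots"> tag contains 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            name = (attr_map.get("name") or "").lower()
            content = (attr_map.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(html):
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex

print(has_noindex('<meta name="robots" content="noindex, follow">'))  # True
print(has_noindex('<meta name="robots" content="index, follow">'))    # False
```

This only covers the meta tag; a noindex delivered via the `X-Robots-Tag` HTTP header would need a separate check of the response headers.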

Crawl Rate Has Nothing to do With Rankings

July 24, 2018 Source

If Googlebot is able to crawl your site frequently, this doesn’t mean your site will rank any higher. Crawling and ranking are two separate things.

503 Errors Reduce Crawl Rate and Crawling Stops if Robots.txt 503s

July 13, 2018 Source

Search Console reports a site as temporarily unreachable when it returns 503 errors. Googlebot will temporarily slow crawling if the site returns 503s, and will stop crawling entirely if the robots.txt file itself returns a 503.
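The distinction between the two cases can be summarised as a simple decision rule. This is an illustrative simplification of the behaviour described above, not Google's actual logic:

```python
def crawl_decision(page_status, robots_status):
    """Simplified model of the stated behaviour: a 503 on regular pages
    slows crawling, while a 503 on robots.txt halts crawling altogether.
    (Assumption: this mirrors the described policy only.)"""
    if robots_status == 503:
        return "stop crawling"
    if page_status == 503:
        return "slow crawling"
    return "crawl normally"

print(crawl_decision(200, 503))  # stop crawling
print(crawl_decision(503, 200))  # slow crawling
print(crawl_decision(200, 200))  # crawl normally
```

The asymmetry exists because a 503 on robots.txt leaves Googlebot unable to tell what it is allowed to fetch, so it fetches nothing rather than risk crawling disallowed URLs.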

Google Increases Crawl Rate For Frequently Changing Pages

June 15, 2018 Source

Google tries to recognise when content changes on a page, because this helps them understand how quickly they need to recrawl it over time. If a page changes significantly on a regular basis, Google will crawl it more frequently.

Googlebot Can Recognise Faceted Navigation & Slow Down Crawling

April 3, 2018 Source

Googlebot understands URL structures well and can recognise faceted navigation: it will slow down crawling when it realises where the primary content is and where it has strayed from it. This is aided by parameter handling in Google Search Console.
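You can apply the same idea when auditing your own crawl data: flag URLs whose query strings carry facet or filter parameters. The parameter names below are hypothetical; the real list comes from your site's URL structure (and is what you would declare in GSC parameter handling):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical facet/filter parameters for an example ecommerce site.
FACET_PARAMS = {"colour", "size", "sort", "page"}

def is_faceted(url):
    """True if the URL's query string contains any known facet parameter."""
    params = parse_qs(urlparse(url).query)
    return any(name in FACET_PARAMS for name in params)

print(is_faceted("https://example.com/shoes?colour=red&size=9"))  # True
print(is_faceted("https://example.com/shoes"))                    # False
```

Segmenting log-file data with a check like this shows how much of Googlebot's activity is being spent on faceted URLs rather than primary content.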

Googlebot Crawls Known URLs Faster Following Significant Site Changes

March 9, 2018 Source

Googlebot will crawl known URLs faster if it detects that significant changes have been made across the site to things like structured data, rel=canonical tags or redirects.

Images Crawled Less Frequently Than Pages

February 1, 2018 Source

Images are crawled less frequently than pages, so if they are migrated (to a CDN, for example) it will take Googlebot longer to recrawl them.

Crawl Rate Issues Can be Reported For Manual Review

February 1, 2018 Source

If a site is being crawled too much or too little, you can report a problem with Googlebot in the Help Center, and the crawl rate can be adjusted by a Google engineer.

Google Temporarily Reduces Crawl Rate After A Platform Change

February 1, 2018 Source

When a site changes its hosting platform, Google will be more conservative about how much they crawl, as they don't know how much load the new server setup will support.

Related Topics

Crawling Indexing Crawl Budget Crawl Errors Disallow Sitemaps Last Modified Nofollow Noindex RSS Canonicalization Fetch and Render