Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website in a day, and crawl rate limits were introduced to prevent crawling from overloading a site's server. Using sophisticated algorithms, Google is able to determine and set an optimal crawl budget for each individual site. This is covered within our Hangout Notes below, along with further best practice advice.

Crawl Rate Only an Issue if Less Frequent Than How Often Content is Updated

March 5, 2019 Source

Crawl rate is only a negative factor if Google crawls pages less frequently than the rate at which the pages are updated. Google only needs to crawl pages to pick up changes.
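One way to sanity-check this is to compare how recently Googlebot fetched each URL (from your server access logs) with the lastmod dates declared in your XML sitemap. The sketch below assumes an nginx/Apache combined-format log, a sitemap that includes lastmod values, and hypothetical file paths and URLs; adjust these for your own setup.

```python
# Sketch: flag URLs whose sitemap <lastmod> is newer than Googlebot's last fetch.
# The log path, sitemap URL, and log format below are assumptions.
import re
import urllib.request
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

ACCESS_LOG = "/var/log/nginx/access.log"             # assumption: combined log format
SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical sitemap URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# e.g. 66.249.66.1 - - [05/Mar/2019:10:15:32 +0000] "GET /blog/post HTTP/1.1" 200 ... "Googlebot/2.1"
LOG_RE = re.compile(r'\[(?P<ts>[^\]]+)\] "GET (?P<path>\S+)[^"]*" \d+ .*Googlebot')

def last_googlebot_hits(log_path):
    """Return {path: most recent Googlebot fetch time} seen in the access log."""
    hits = {}
    with open(log_path, encoding="utf-8", errors="ignore") as fh:
        for line in fh:
            m = LOG_RE.search(line)
            if not m:
                continue
            ts = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
            path = m.group("path")
            if path not in hits or ts > hits[path]:
                hits[path] = ts
    return hits

def sitemap_lastmod(sitemap_url):
    """Return {path: lastmod datetime} for sitemap URLs that declare <lastmod>."""
    with urllib.request.urlopen(sitemap_url) as resp:
        root = ET.fromstring(resp.read())
    out = {}
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", namespaces=NS)
        if not (loc and lastmod):
            continue
        dt = datetime.fromisoformat(lastmod.replace("Z", "+00:00"))
        if dt.tzinfo is None:
            dt = dt.replace(tzinfo=timezone.utc)      # date-only lastmod: assume UTC
        out["/" + loc.split("/", 3)[-1]] = dt
    return out

if __name__ == "__main__":
    crawled = last_googlebot_hits(ACCESS_LOG)
    for path, modified in sitemap_lastmod(SITEMAP_URL).items():
        last_crawl = crawled.get(path)
        if last_crawl is None:
            print(f"{path}: updated {modified:%Y-%m-%d}, no Googlebot fetch seen in log")
        elif last_crawl < modified:
            print(f"{path}: updated {modified:%Y-%m-%d}, last crawled {last_crawl:%Y-%m-%d}")
```

Any URL flagged by this check has been changed more recently than Googlebot last fetched it, which is the only situation in which crawl rate becomes a concern here.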


Having a Section of a Site That’s Hard to Crawl Can Reduce Crawling of Entire Site

February 22, 2019 Source

Google determines crawl rate at a host level, so if two sections of a website (e.g. an ecommerce shop and a blog) sit on the same host and one of them is slow or difficult to crawl, Google will reduce the crawl rate for both sections.


Encourage Google to Crawl Pages Faster Using URL Inspection Tool

December 11, 2018 Source

It is normal for Google to take a couple of months to reprocess changes to a page that isn’t a homepage, as these pages might not be crawled frequently. You can encourage Google to crawl a page a bit faster by testing the live URL and then requesting indexing in the URL Inspection tool.
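The "Request Indexing" action itself is only available in the Search Console UI, but you can check when Google last crawled a page programmatically via the URL Inspection API before deciding whether a request is worthwhile. The sketch below uses the google-api-python-client library; the property URL, page URL, and key file are hypothetical, and the response field names reflect the API as I understand it, so treat them as assumptions to verify against the official documentation.

```python
# Sketch: inspect a URL's index status and last crawl time via the Search Console
# URL Inspection API, assuming a service account with access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
CREDS_FILE = "service-account.json"                      # assumption: your own key file
SITE_URL = "https://www.example.com/"                    # hypothetical GSC property
PAGE_URL = "https://www.example.com/blog/updated-post"   # hypothetical page to check

creds = service_account.Credentials.from_service_account_file(CREDS_FILE, scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(
    body={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL}
).execute()

status = result.get("inspectionResult", {}).get("indexStatusResult", {})
print("Coverage state:", status.get("coverageState"))
print("Last crawl time:", status.get("lastCrawlTime"))
```

If the last crawl time is well before your most recent significant change, testing the live URL and requesting indexing in the GSC UI is the way to nudge Google along.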


Google Crawl Rate May Take Time to Build for New Websites

November 30, 2018 Source

Googlebot’s maximum crawl rate is limited to prevent it from slowing a site down, but Google doesn’t crawl more than it needs to. New websites take time to build up the trust required for Google to crawl and index new content as quickly as possible.


Average Response Times >1,000ms Can Limit Googlebot’s Crawling of a Site

September 7, 2018 Source

John recommends that the average response time in GSC crawl stats should be around 100ms. A response time closer to 1,000ms could mean Googlebot isn’t able to crawl as much of a site as it ideally would.
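A quick way to get a feel for where your site sits against those figures is to time full page fetches from a script and compare the average to the ~100ms target and the ~1,000ms warning level. This is only a rough approximation, since timing from one location won't match what Googlebot records in GSC, and the sample URLs below are hypothetical.

```python
# Sketch: rough average response time check against the levels mentioned above.
import time
import urllib.request

SAMPLE_URLS = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/blog/latest-post",
]

def average_response_ms(urls, attempts=3):
    """Fetch each URL several times and return the mean response time in ms."""
    timings = []
    for url in urls:
        for _ in range(attempts):
            start = time.perf_counter()
            with urllib.request.urlopen(url) as resp:
                resp.read()                       # include body download, as a crawler would
            timings.append((time.perf_counter() - start) * 1000)
    return sum(timings) / len(timings)

if __name__ == "__main__":
    avg = average_response_ms(SAMPLE_URLS)
    print(f"Average response time: {avg:.0f}ms")
    if avg >= 1000:
        print("Close to or above 1,000ms: Googlebot may limit how much of the site it crawls.")
    elif avg > 100:
        print("Above the ~100ms level suggested as a target.")
```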


Crawl Stats in GSC Can Indicate if Googlebot is Crawling Cautiously

September 7, 2018 Source

Googlebot is more cautious about crawling sites if it detects the site’s server is struggling. The ‘number of pages crawled’ graph in GSC is a good way of diagnosing this crawl behaviour.
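If you want a second view alongside the GSC graph, you can build a similar pages-crawled-per-day trend from your own server access logs. The sketch below assumes a combined-format log at a hypothetical path, and matches on the Googlebot user-agent string only; genuine Googlebot traffic should ideally be confirmed via reverse DNS as well.

```python
# Sketch: count Googlebot requests per day from an access log as a rough
# stand-in for the GSC "pages crawled per day" graph.
import re
from collections import Counter
from datetime import datetime

ACCESS_LOG = "/var/log/nginx/access.log"   # assumption: your own log path
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4}):")

def googlebot_hits_per_day(log_path):
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="ignore") as fh:
        for line in fh:
            if "Googlebot" not in line:
                continue
            m = DATE_RE.search(line)
            if m:
                day = datetime.strptime(m.group(1), "%d/%b/%Y").date()
                counts[day] += 1
    return counts

if __name__ == "__main__":
    for day, hits in sorted(googlebot_hits_per_day(ACCESS_LOG).items()):
        print(f"{day}: {hits} Googlebot requests")
```

A sustained drop in this count, alongside slow response times, is a sign Googlebot may be crawling cautiously because it thinks the server is struggling.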


Google May Take Longer to Process Redirects For Noindexed Pages

July 27, 2018 Source

Google crawls noindexed pages less frequently. If a redirect is set up for a noindexed page, Google may take longer to process this because it is being crawled less frequently.


Crawl Rate Has Nothing to do With Rankings

July 24, 2018 Source

If Googlebot is able to crawl your site frequently, this doesn’t mean your site will rank any higher. Crawling and ranking are two separate things.


503 Errors Reduce Crawl Rate and Crawling Stops if Robots.txt 503s

July 13, 2018 Source

Search Console reporting a site as temporarily unreachable means the site is returning 503 errors. Googlebot will temporarily slow crawling if the site returns 503 errors, and will stop crawling altogether if the robots.txt file returns a 503.
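For planned downtime, serving 503s with a Retry-After header is the standard way to signal a temporary outage so Googlebot backs off rather than dropping pages. The minimal Flask sketch below is one possible way to do this; the maintenance flag and retry interval are assumptions, and note the point above that a 503 on robots.txt pauses crawling of the whole site until it recovers.

```python
# Sketch: return 503 + Retry-After for all requests while in maintenance mode.
from flask import Flask, Response

app = Flask(__name__)
MAINTENANCE_MODE = True      # assumption: toggled by your deploy/ops tooling
RETRY_AFTER_SECONDS = 3600   # hint to crawlers to come back in an hour

@app.before_request
def maintenance_gate():
    if MAINTENANCE_MODE:
        # Every path, including /robots.txt, returns 503 here. Per the note
        # above, Google slows crawling on 503s and stops crawling entirely
        # while robots.txt itself returns a 503.
        return Response(
            "Service temporarily unavailable",
            status=503,
            headers={"Retry-After": str(RETRY_AFTER_SECONDS)},
        )

@app.route("/")
def home():
    return "Site content"

if __name__ == "__main__":
    app.run()
```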


Related Topics

Crawling, Indexing, Crawl Budget, Crawl Errors, Disallow, Sitemaps, Last Modified, Nofollow, Noindex, RSS, Canonicalization, Fetch and Render