Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website per day; the limit on crawl rate exists to reduce server overload. Google's algorithms determine and set an optimal crawl rate and crawl budget for each individual site. This is covered within our Hangout Notes below, along with further best practice advice.
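As a rough illustration, you can estimate crawl rate from your own server access logs by counting Googlebot requests per day. The sketch below assumes a combined-format Apache/Nginx log at a hypothetical path; the filename and the simple user-agent filter are assumptions, and a production check should also verify Googlebot via reverse DNS.

```python
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "access.log"  # assumed path to a combined-format access log

# Pulls the date out of lines like: ... [03/Sep/2019:10:12:01 +0000] "GET /page HTTP/1.1" ...
date_pattern = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

requests_per_day = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" in line:  # naive filter; verify via reverse DNS in practice
            match = date_pattern.search(line)
            if match:
                day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
                requests_per_day[day] += 1

# Approximate daily crawl rate
for day, count in sorted(requests_per_day.items()):
    print(f"{day}: {count} Googlebot requests")
```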

Google Can Crawl Different Parts of a Website at Different Speeds

September 3, 2019 Source

Google is able to detect how frequently the different sections of a site are updated and crawl them at different speeds, so that the frequently changing pages are crawled more regularly.
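One way to observe this on your own site (not Google's method, just a hedged sketch) is to group Googlebot requests in your access logs by top-level site section and compare how often each section is hit. The log path and format below are assumptions.

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # assumed combined-format access log

# Captures the request path from lines such as: "GET /blog/post-1 HTTP/1.1"
path_pattern = re.compile(r'"(?:GET|HEAD) (\S+)')

hits_per_section = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = path_pattern.search(line)
        if match:
            # The first path segment stands in for the "section", e.g. /blog/... -> blog
            section = match.group(1).lstrip("/").split("/")[0] or "(root)"
            hits_per_section[section] += 1

for section, hits in hits_per_section.most_common():
    print(f"/{section}: {hits} Googlebot requests")
```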


Fluctuations in Googlebot’s Download Time Caused by Changes In Server Speed, Page Size and Page Complexity

May 1, 2019 Source

If Google Search Console shows that Googlebot is spending more time downloading pages on a site, the pages may have become larger or more complex, or the server may have become slower.
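To help narrow down whether rising download times are driven by page size or by server speed, you can sample a few URLs yourself. The sketch below uses the requests library against hypothetical URLs and reports response time and transferred size, roughly mirroring the factors mentioned above.

```python
import time
import requests

# Hypothetical sample; replace with representative pages from your site
URLS = [
    "https://example.com/",
    "https://example.com/category/widgets",
]

for url in URLS:
    start = time.monotonic()
    response = requests.get(url, timeout=30)
    elapsed_ms = (time.monotonic() - start) * 1000
    size_kb = len(response.content) / 1024
    print(f"{url}: {elapsed_ms:.0f} ms, {size_kb:.1f} KB, status {response.status_code}")
```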


There is No Way to Reduce Google’s Crawl Rate of Certain Pages

March 5, 2019 Source

You can't instruct Google to crawl specific pages less frequently. Instead, you can use internal linking to place more emphasis on the important pages you want crawled, as in the sketch below.
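As an illustration of how internal linking emphasises pages, the sketch below counts how many internal links point at each URL within a small set of pages. It uses requests and BeautifulSoup against a hypothetical domain; a real audit would use a full site crawl rather than a handful of pages.

```python
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SITE = "https://example.com"           # assumed domain
PAGES = [SITE + "/", SITE + "/blog/"]  # assumed pages to sample

inbound_links = Counter()
for page in PAGES:
    html = requests.get(page, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for anchor in soup.find_all("a", href=True):
        target = urljoin(page, anchor["href"])
        # Only count links that stay on the same host (internal links)
        if urlparse(target).netloc == urlparse(SITE).netloc:
            inbound_links[target.split("#")[0]] += 1

# Pages with more internal links pointing at them receive more emphasis
for url, count in inbound_links.most_common(10):
    print(f"{count:3d} internal links -> {url}")
```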


Crawl Rate Only an Issue if Less Frequent Than How Often Content is Updated

March 5, 2019 Source

Crawl rate is only a negative factor if Google crawls pages less frequently than the rate at which the pages are updated. Google only needs to crawl pages to pick up changes.
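A hedged way of checking whether crawl rate is keeping up with content updates is to compare, per URL, the most recent Googlebot hit in your logs with the page's last-modified date from your CMS. The sketch below assumes you already have both values to hand, for example exported as dictionaries; the URLs and dates are placeholders.

```python
from datetime import date

# Assumed inputs: last Googlebot crawl per URL (from access logs)
# and last content update per URL (from your CMS or sitemap lastmod)
last_crawled = {
    "/products/widget": date(2019, 3, 1),
    "/blog/news": date(2019, 2, 10),
}
last_updated = {
    "/products/widget": date(2019, 2, 20),
    "/blog/news": date(2019, 3, 4),
}

for url, updated in last_updated.items():
    crawled = last_crawled.get(url)
    if crawled is None or crawled < updated:
        # Google has not crawled the page since it last changed
        print(f"{url}: updated {updated}, last crawled {crawled} -> changes not yet picked up")
    else:
        print(f"{url}: crawl rate is keeping up with updates")
```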


Having a Section of a Site That’s Hard to Crawl Can Reduce Crawling of Entire Site

February 22, 2019 Source

Google determines crawl rate at a host level, so if two sections of a website (e.g. an ecommerce shop and a blog) sit on the same host and one of them is slow or difficult to crawl, Google will reduce the crawl rate for both sections.


Encourage Google to Crawl Pages Faster Using URL Inspection Tool

December 11, 2018 Source

It is normal for Google to take a couple of months to reprocess changes to a page that isn’t a homepage, as these pages might not be crawled frequently. You can encourage Google to crawl a page a bit faster by testing the live URL and then requesting indexing in the URL Inspection tool.
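The URL Inspection tool itself is part of the Search Console interface, and requesting indexing is a manual step there. For checking programmatically when a page was last crawled, the sketch below calls the Search Console URL Inspection API endpoint with a placeholder OAuth token; the endpoint and response fields are based on the public API but should be treated as assumptions, and the API only reports index status rather than requesting indexing.

```python
import requests

# Assumed: an OAuth 2.0 access token with the Search Console scope
ACCESS_TOKEN = "placeholder-token"
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

payload = {
    "inspectionUrl": "https://example.com/some-page",  # hypothetical page
    "siteUrl": "https://example.com/",                 # property as registered in Search Console
}

response = requests.post(
    ENDPOINT,
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
result = response.json().get("inspectionResult", {}).get("indexStatusResult", {})
print("Last crawl time:", result.get("lastCrawlTime"))
print("Coverage state:", result.get("coverageState"))
```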


Google Crawl Rate May Take Time to Build for New Websites

November 30, 2018 Source

Googlebot’s maximum crawl rate is limited to prevent it from slowing a site down, but Google doesn’t crawl more than it needs to. New websites take time to build up the trust that allows Google to crawl and index new content as quickly as possible.


Average Response Times >1,000ms Can Limit Googlebot’s Crawling of a Site

September 7, 2018 Source

John recommends that the average response time in GSC crawl stats should be around 100ms. A response time closer to 1,000ms could mean Googlebot isn’t able to crawl as much of a site as it ideally would.
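To sanity-check a site against those figures outside of Search Console, you can time a handful of responses yourself. The sketch below averages response times for a few hypothetical URLs and compares the result against the ~100ms and 1,000ms figures; it measures full response time from a single location, so treat the output only as a rough proxy for what Googlebot sees.

```python
import time
import requests

URLS = [  # hypothetical sample of URLs
    "https://example.com/",
    "https://example.com/category/widgets",
    "https://example.com/blog/",
]

timings_ms = []
for url in URLS:
    start = time.monotonic()
    requests.get(url, timeout=30)
    timings_ms.append((time.monotonic() - start) * 1000)

average = sum(timings_ms) / len(timings_ms)
print(f"Average response time: {average:.0f} ms")

if average >= 1000:
    print("Responses near or above 1,000 ms may limit how much Googlebot crawls.")
elif average > 100:
    print("Above the ~100 ms guideline, but not necessarily a problem.")
else:
    print("Within the ~100 ms guideline.")
```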


Crawl Stats in GSC Can Indicate if Googlebot is Crawling Cautiously

September 7, 2018 Source

Googlebot is more cautious about crawling sites if it detects the site’s server is struggling. The ‘number of pages crawled’ graph in GSC is a good way of diagnosing this crawl behaviour.


Related Topics

Crawling, Indexing, Crawl Budget, Crawl Errors, Disallow, Sitemaps, Last Modified, Nofollow, Noindex, RSS, Canonicalization, Fetch and Render