Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website in a day, and rate limits exist to prevent crawlers from overloading servers. Google's algorithms are able to determine and set an optimal crawl budget for individual sites. This is covered within our Hangout Notes below, along with further best practice advice.

Average Fetch Time May be Affected by Groups of Slower Pages

March 20, 2020 Source

If Google is spending more time crawling a particular group of slow pages, this can skew the site-wide average fetch time and make the crawl stats data look worse overall.
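
As a rough worked example (the sections, fetch counts and timings below are invented for illustration), a relatively small group of slow pages can dominate the site-wide average:

```python
# Toy example: how one slow section skews the site-wide average fetch time.
# The sections and timings below are made up for illustration.
fetches = {
    "/products/": [120] * 900,   # 900 fetches of fast pages at 120 ms each
    "/archive/":  [2400] * 100,  # 100 fetches of slow archive pages at 2400 ms
}

all_times = [t for times in fetches.values() for t in times]
overall_avg = sum(all_times) / len(all_times)

print(f"Overall average fetch time: {overall_avg:.0f} ms")  # 348 ms
for section, times in fetches.items():
    print(f"{section} average: {sum(times) / len(times):.0f} ms")
```

Here 90% of fetches take 120 ms, but the slow 10% nearly triple the reported average.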


Rendered Page Resources Are Included in Google’s Crawl Rate

March 20, 2020 Source

The resources that Google fetches when they render a page are included in Google’s crawling budget and reported in the Crawl Stats data in Search Console.
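
One way to see this in your own data is to break down Googlebot requests in your server logs by resource type. A minimal sketch, assuming a standard combined log format and a placeholder `access.log` path:

```python
import re
from collections import Counter

# Break down Googlebot requests in a server access log by resource type,
# to see how much of the crawl goes to page resources (JS, CSS, images)
# rather than HTML. The log path and format are assumptions.
request_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP')

counts = Counter()
with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = request_re.search(line)
        if not match:
            continue
        path = match.group(1).split("?")[0]
        last_segment = path.rsplit("/", 1)[-1]
        ext = last_segment.rsplit(".", 1)[-1].lower() if "." in last_segment else "html"
        counts[ext] += 1

for ext, n in counts.most_common():
    print(f"{ext}: {n} fetches")
```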


Sites With Slow Response Pages are Crawled Less

March 20, 2020 Source

If a site's pages respond slowly, Google will throttle its crawling and won't recrawl pages as often as it would like to.


Search Console Crawl Stats Include URLs Fetched by Other Google Products

March 6, 2020 Source

Google’s crawl stats in Search Console are an accurate reflection of Google’s own crawling logs, but they include URLs fetched by other Google services that use the same infrastructure as Googlebot, including Google Ads landing page checks and product search crawling.
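
If you want to separate these requests in your own server logs, they can be segmented by user-agent token. A minimal sketch (the log path is a placeholder, and the tokens shown should be verified against Google's current published crawler list):

```python
from collections import Counter

# Tally requests per Google crawler so Search Console crawl stats can be
# compared against your own logs. Tokens follow Google's published crawler
# user agents; check the current list before relying on them.
CRAWLERS = ["AdsBot-Google", "Storebot-Google", "Googlebot-Image", "Googlebot"]

counts = Counter()
with open("access.log") as log:  # placeholder path
    for line in log:
        for token in CRAWLERS:
            if token in line:
                counts[token] += 1
                break  # order matters: match the most specific token first

for crawler, n in counts.most_common():
    print(f"{crawler}: {n} requests")
```

Note that "Googlebot-Image" contains "Googlebot", so the more specific tokens are checked first.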


Upper Limit For Recrawling Pages is Six Months

January 22, 2020 Source

Google tends to recrawl pages at least once every six months, making six months the rough upper limit on how long a page will go without being recrawled.


Google Adjusts Crawl Rate Over Time Based on a Site’s Performance and Content

December 27, 2019 Source

When determining the crawl rate for a website, Google will try to adjust it automatically over time, taking into account how fast the site returns results and how much content there is to crawl.
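
Google has not published the algorithm behind this, but the behaviour described resembles an adaptive feedback loop. Purely as an illustrative toy model, not Google's actual implementation:

```python
# Illustrative toy model of an adaptive crawl rate (NOT Google's algorithm):
# ramp up while the server responds quickly, back off sharply when it slows.
def adjust_rate(rate, response_ms, target_ms=500, min_rate=0.1, max_rate=10.0):
    """Return a new crawl rate (requests/sec) based on the last response time."""
    if response_ms <= target_ms:
        rate *= 1.1  # server is healthy: increase gradually
    else:
        rate *= 0.5  # server is slow: back off quickly
    return max(min_rate, min(rate, max_rate))

rate = 1.0
for response in [200, 250, 300, 1200, 1500, 400, 350]:
    rate = adjust_rate(rate, response)
    print(f"response={response} ms -> crawl rate {rate:.2f} req/s")
```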


Google Can Crawl Different Parts of a Website at Different Speeds

September 3, 2019 Source

Google is able to detect how frequently the different sections of a site are updated and crawl them at different speeds, so that the frequently changing pages are crawled more regularly.
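
This can be observed in your own logs by counting Googlebot fetches per site section over a period. A short sketch using made-up sample paths:

```python
from collections import Counter

# Count Googlebot fetches per top-level site section over a log period,
# to see which areas Google crawls most often. The paths below are
# made-up sample data; in practice they would come from your access logs.
crawled_paths = [
    "/blog/post-1", "/blog/post-2", "/blog/post-1", "/blog/post-3",
    "/products/widget", "/products/widget",
    "/about/team",
]

per_section = Counter(
    "/" + path.strip("/").split("/")[0] + "/" for path in crawled_paths
)

for section, n in per_section.most_common():
    print(f"{section} {n} Googlebot fetches")
```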


Fluctuations in Googlebot’s Download Time Caused by Changes in Server Speed, Page Size and Page Complexity

May 1, 2019 Source

If Google Search Console is showing that Googlebot is spending more time downloading pages on a site, the pages may have become larger or more complicated, or the server may have become slower.


Crawl Rate Only an Issue if Less Frequent Than How Often Content is Updated

March 5, 2019 Source

Crawl rate is only a negative factor if Google crawls pages less frequently than the rate at which the pages are updated. Google only needs to crawl pages to pick up changes.
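
A simple way to apply this check to your own site is to compare each page's update interval with Googlebot's recrawl interval. A sketch with invented dates and URLs:

```python
from datetime import date

# Flag pages only where Googlebot's recrawl interval is longer than the
# page's update interval. All dates and URLs below are illustrative.
pages = {
    "/news/":  {"updated_every_days": 1,
                "last_crawls": [date(2019, 3, 1), date(2019, 3, 4)]},
    "/terms/": {"updated_every_days": 365,
                "last_crawls": [date(2019, 1, 1), date(2019, 2, 15)]},
}

for url, info in pages.items():
    first, second = info["last_crawls"]
    crawl_interval = (second - first).days
    if crawl_interval > info["updated_every_days"]:
        print(f"{url}: crawled every {crawl_interval} days but updated every "
              f"{info['updated_every_days']} days -> changes may be picked up late")
    else:
        print(f"{url}: crawl rate keeps up with updates")
```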


Related Topics

Crawling, Indexing, Crawl Budget, Crawl Errors, Disallow, Sitemaps, Last Modified, Nofollow, Noindex, RSS, Canonicalization, Fetch and Render