Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website per day, a limit introduced to reduce server overload. Thanks to sophisticated algorithms, Google is able to determine and set an optimal crawl budget for individual sites; this is covered in our SEO Office Hours notes below, along with further best practice advice.

The Crawl Rate System Is Quicker to Slow Down Than to Ramp Back Up

A site owner mentioned to John that they saw a drop in crawl rate, which was attributed to high response times. They have since fixed the issue, but the crawl rate has not recovered. John explained that it may just need longer to catch up, but there is also a form in the Google Search Console Help Center where you can request that someone take a look at the crawling of your website; they might be able to adjust it manually.
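
To illustrate the asymmetry described above, here is a toy sketch of a rate controller that backs off sharply when responses are slow but ramps back up only gradually. This is purely illustrative and not Google's actual algorithm; the threshold, multipliers, and ceiling are invented for the example.

```python
# Toy asymmetric crawl-rate controller (illustrative only, not Google's system):
# the rate is cut sharply when responses are slow, but recovers only gradually.

def adjust_crawl_rate(current_rate: float, avg_response_ms: float,
                      target_ms: float = 500.0) -> float:
    """Return a new requests-per-second rate based on average response time."""
    if avg_response_ms > target_ms:
        # Back off aggressively to protect the server.
        return max(current_rate * 0.5, 0.1)
    # Ramp back up cautiously, capped at an assumed ceiling of 10 req/s.
    return min(current_rate * 1.05, 10.0)

rate = 10.0
for ms in [400, 900, 950, 450, 420, 410, 400]:  # hypothetical daily averages
    rate = adjust_crawl_rate(rate, ms)
    print(f"avg response {ms} ms -> crawl rate {rate:.2f} req/s")
```

Notice that two slow days halve the rate twice, while each good day only restores 5%, so recovery takes far longer than the slowdown did.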

17 Nov 2021

Average Fetch Time May Be Affected by Groups of Slower Pages

If Google is spending more time crawling a particular group of slow pages, this can make the average fetch time and the crawl stats data as a whole look worse.
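
A quick back-of-the-envelope example (with made-up page counts and timings) shows how a single slow section can drag the site-wide average up:

```python
# Hypothetical fetch times: most pages respond quickly, one section is slow.
fast_section = [200] * 900   # 900 pages at 200 ms each
slow_section = [2000] * 100  # 100 pages at 2,000 ms each

all_pages = fast_section + slow_section
average = sum(all_pages) / len(all_pages)
print(f"Average fetch time: {average:.0f} ms")  # 380 ms, vs 200 ms without the slow group
```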

20 Mar 2020

Rendered Page Resources Are Included in Google’s Crawl Rate

The resources that Google fetches when rendering a page are included in Google's crawl budget and reported in the Crawl Stats data in Search Console.

20 Mar 2020

Sites With Slow Response Pages Are Crawled Less

If Google can’t crawl pages quickly enough due to poor page response times, then it won’t recrawl them as often as it would like to.

20 Mar 2020

Search Console Crawl Stats Include URLs Fetched by Other Google Products

Google’s crawl stats in Search Console are an accurate reflection of Google’s own crawling logs, but they include URLs fetched by other Google services that use the same infrastructure as Googlebot, including Google Ads landing page checks and product search crawling.

6 Mar 2020

Upper Limit For Recrawling Pages is Six Months

As an upper limit, Google tends to recrawl pages at least once every six months.

22 Jan 2020

Google Adjusts Crawl Rate Over Time Based on a Site’s Performance and Content

When determining the crawl rate for a website, Google will try to adjust it automatically over time, taking into account how fast the site returns results and how much content there is to crawl.

27 Dec 2019

Google Can Crawl Different Parts of a Website at Different Speeds

Google is able to detect how frequently the different sections of a site are updated and crawl them at different speeds, so that the frequently changing pages are crawled more regularly.

3 Sep 2019

Fluctuations in Googlebot’s Download Time Caused by Changes in Server Speed, Page Size, and Page Complexity

If GSC is showing that Googlebot is spending more time downloading pages on a site, the pages may have become larger or more complicated, or the server may have become slower.
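
To narrow down which factor changed, it can help to spot-check a few URLs yourself and log both download time and response size. A minimal sketch using the third-party requests library (the URLs are placeholders) might look like this:

```python
import requests

# Placeholder URLs; swap in real pages from the site being investigated.
urls = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
]

for url in urls:
    response = requests.get(url, timeout=30)
    seconds = response.elapsed.total_seconds()  # time from sending the request to parsing the response headers
    size_kb = len(response.content) / 1024      # size of the downloaded body
    print(f"{url}: {seconds:.2f}s, {size_kb:.0f} KB")
```

If times are stable but sizes have grown, the pages themselves are the likely cause; if small pages are also slow, the server is the more likely culprit.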

1 May 2019

There is No Way to Reduce Google’s Crawl Rate of Certain Pages

You can’t instruct Google to crawl specific pages less frequently, but you can use internal linking to put more emphasis on the more important pages you want crawled.

5 Mar 2019
