Google Adjusts Crawl Rate Over Time Based on a Site’s Performance and Content
When determining the crawl rate for a website, Google adjusts it automatically over time, taking into account how quickly the site responds and how much content there is to crawl.
Google Can Crawl Different Parts of a Website at Different Speeds
Google is able to detect how frequently the different sections of a site are updated and crawl them at different speeds, so that the frequently changing pages are crawled more regularly.
Fluctuations in Googlebot’s Download Time Caused by Changes In Server Speed, Page Size and Page Complexity
If GSC is showing that Googlebot is spending more time downloading pages on a site, the pages may have become larger or more complex, or the server may have become slower.
There is No Way to Reduce Google’s Crawl Rate of Certain Pages
You can’t instruct Google to crawl certain pages less frequently, but you can use internal linking to place more emphasis on the important pages you want crawled.
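One way to review how internal linking distributes emphasis is to extract the internal links on a page. A minimal sketch using only the standard library, with a hypothetical page and the example.com domain standing in for a real site:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class InternalLinkParser(HTMLParser):
    """Collects <a href> targets that resolve to the same host as the base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.base_host = urlparse(base_url).netloc
        self.internal_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base_url, href)  # resolve relative links
        if urlparse(absolute).netloc == self.base_host:
            self.internal_links.append(absolute)

# Hypothetical markup used only for illustration.
html = """
<a href="/products/widget">Widget</a>
<a href="https://example.com/blog/post">Post</a>
<a href="https://other-site.com/page">External</a>
"""
parser = InternalLinkParser("https://example.com/")
parser.feed(html)
print(parser.internal_links)
# → ['https://example.com/products/widget', 'https://example.com/blog/post']
```

Run across a crawl of the site, counts of internal links per target URL give a rough picture of which pages receive the most emphasis.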
Crawl Rate Only an Issue if Less Frequent Than How Often Content is Updated
Crawl rate is only a negative factor if Google crawls pages less frequently than the rate at which the pages are updated. Google only needs to crawl pages to pick up changes.
Having a Section of a Site That’s Hard to Crawl Can Reduce Crawling of Entire Site
Google determines crawl rate at the host level, so if two sections of a website (e.g. an ecommerce shop and a blog) sit on the same host and one of them is slow or difficult to crawl, Google will reduce the crawl rate for both sections.
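"Host level" here means the hostname portion of the URL, not the path. A small illustration with hypothetical URLs: two sections under the same domain share one host (and therefore one crawl rate), while a subdomain is a separate host:

```python
from urllib.parse import urlparse

# Hypothetical URLs used only for illustration.
urls = [
    "https://example.com/shop/product-1",    # ecommerce section
    "https://example.com/blog/latest-post",  # blog section, same host
    "https://blog.example.com/other-post",   # subdomain: a separate host
]
hosts = [urlparse(u).netloc for u in urls]
print(hosts)
# → ['example.com', 'example.com', 'blog.example.com']
```

So a slow shop at example.com/shop can drag down crawling of example.com/blog, but not of blog.example.com.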
Encourage Google to Crawl Pages Faster Using URL Inspection Tool
It is normal for Google to take a couple of months to reprocess changes to a page that isn’t a homepage, as these pages might not be crawled frequently. You can encourage Google to crawl a page a bit faster by testing the live URL and then requesting indexing in the URL Inspection tool.
Google Crawl Rate May Take Time to Build for New Websites
Googlebot’s maximum crawl rate is limited to prevent a site slowing down, but Google doesn’t crawl more than it needs to. New websites take time to build up the trust needed for Google to crawl and index new content as quickly as possible.
Average Response Times >1,000ms Can Limit Googlebot’s Crawling of a Site
John recommends that the average response time shown in GSC crawl stats should be around 100ms. A response time closer to 1,000ms could mean Googlebot isn’t able to crawl as much of the site as it ideally would.
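To sanity-check a site against that guideline before looking at GSC, you can average fetch times yourself. A rough standard-library sketch (wall-clock fetch time is only an approximation of the server response time GSC reports, and the URL is a placeholder):

```python
import time
from urllib.request import urlopen

def measure_response_ms(url, samples=5):
    """Average wall-clock time in milliseconds to fetch a URL,
    a rough stand-in for the response time GSC reports."""
    timings = []
    for _ in range(samples):
        start = time.monotonic()
        with urlopen(url) as resp:
            resp.read()
        timings.append((time.monotonic() - start) * 1000)
    return sum(timings) / len(timings)

# Example (requires network access; example.com is a placeholder):
# avg = measure_response_ms("https://example.com/")
# print(f"average response: {avg:.0f}ms")
```

Averages well above 100ms, and especially near 1,000ms, suggest server performance is worth investigating before worrying about crawl frequency.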