Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website in a day, a limit introduced to reduce server overload. Google's algorithms are able to determine and set an optimal crawl budget for individual sites. This is covered in our SEO Office Hours notes below, along with further best practice advice.

Crawl Rate Only an Issue if Less Frequent Than Content Updates

Crawl rate is only a negative factor if Google crawls pages less frequently than the rate at which the pages are updated. Google only needs to crawl pages to pick up changes.

5 Mar 2019

Having a Section of a Site That’s Hard to Crawl Can Reduce Crawling of Entire Site

Google determines crawl rate at a host level, so if two sections of a website (e.g. an e-commerce shop and a blog) sit on the same host and one of them is slow or difficult to crawl, Google will reduce the crawl rate of both sections.

22 Feb 2019

Encourage Google to Crawl Pages Faster Using URL Inspection Tool

It is normal for Google to take a couple of months to reprocess changes to a page that isn’t a homepage, as these pages might not be crawled frequently. You can encourage Google to crawl a page a bit faster by testing the live URL and then requesting indexing in the URL Inspection tool.

11 Dec 2018

Google Crawl Rate May Take Time to Build for New Websites

Googlebot’s maximum crawl rate is limited to prevent a site from slowing down, but Google doesn’t crawl more than it needs to. New websites take time to build up the trust required for Google to crawl and index new content as quickly as possible.

30 Nov 2018

Average Response Times >1,000ms Can Limit Googlebot’s Crawling of a Site

John Mueller recommends that the average response time in GSC crawl stats should be around 100ms. A response time closer to 1,000ms could mean Googlebot isn’t able to crawl as much of a site as it ideally would.
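
As a rough way to check where a site stands against that benchmark, the sketch below times full page fetches, which approximates the download times GSC reports. This is not from the office hours notes; the URLs and sample count are placeholders.

```python
import time
import urllib.request

# Placeholder URLs - swap in pages from the site being checked.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

def average_response_ms(url: str, samples: int = 5) -> float:
    """Fetch `url` several times and return the mean fetch time in ms."""
    total = 0.0
    for _ in range(samples):
        start = time.monotonic()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # read the full body, as a crawler would
        total += (time.monotonic() - start) * 1000
    return total / samples

for url in URLS:
    ms = average_response_ms(url)
    verdict = "fine" if ms < 1000 else "slow - may limit Googlebot's crawling"
    print(f"{url}: {ms:.0f}ms average ({verdict})")
```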

7 Sep 2018

Crawl Stats in GSC Can Indicate if Googlebot is Crawling Cautiously

Googlebot is more cautious about crawling sites if it detects the site’s server is struggling. The ‘number of pages crawled’ graph in GSC is a good way of diagnosing this crawl behaviour.

7 Sep 2018

Google May Take Longer to Process Redirects For Noindexed Pages

Google crawls noindexed pages less frequently. If a redirect is set up on a noindexed page, Google may take longer to notice and process it for that reason.
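
To check what a crawler currently sees on such a URL, a small diagnostic along these lines can help. This is a sketch rather than anything from the note itself, and the URL is a placeholder.

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # surface 3xx responses instead of following them

def inspect(url: str) -> None:
    """Report a URL's redirect status and any noindex directives."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        status, headers = resp.status, resp.headers
        body = resp.read(65536).decode("utf-8", errors="replace")
    except urllib.error.HTTPError as err:  # 3xx/4xx/5xx land here
        status, headers, body = err.code, err.headers, ""
    print(f"{url} -> HTTP {status}")
    if 300 <= status < 400:
        print("  redirects to:", headers.get("Location"))
    if "noindex" in (headers.get("X-Robots-Tag") or "").lower():
        print("  noindex set via X-Robots-Tag header")
    elif "noindex" in body.lower():
        print("  noindex likely set via robots meta tag")

# Placeholder URL for a page that was noindexed and now redirects.
inspect("https://www.example.com/old-page")
```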

27 Jul 2018

Crawl Rate Has Nothing to do With Rankings

If Googlebot is able to crawl your site frequently, this doesn’t mean your site will rank any higher. Crawling and ranking are two separate things.

24 Jul 2018

503 Errors Reduce Crawl Rate and Crawling Stops if Robots.txt 503s

Search Console reporting a site as temporarily unreachable means the site is returning 503 errors. Googlebot will temporarily slow crawling if the site returns 503 errors, and will stop crawling altogether if the robots.txt file returns a 503.
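
One practical implication: during planned downtime, serve 503s for pages (ideally with a Retry-After header) while keeping robots.txt on a 200 so crawling of the site isn’t halted entirely. A minimal sketch, assuming Flask purely for illustration:

```python
from flask import Flask, Response

app = Flask(__name__)
MAINTENANCE = True  # toggle during planned downtime

@app.route("/robots.txt")
def robots():
    # Keep robots.txt on a 200: a 503 here makes Googlebot stop
    # crawling the site entirely until it recovers.
    return Response("User-agent: *\nAllow: /\n", mimetype="text/plain")

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def page(path):
    if MAINTENANCE:
        # A 503 tells Googlebot to slow down and retry later without
        # the URL being treated as permanently gone.
        return Response(
            "Service temporarily unavailable",
            status=503,
            headers={"Retry-After": "3600"},
        )
    return "Normal page content"
```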

13 Jul 2018

Google Increases Crawl Rate For Frequently Changing Pages

Google tries to recognise when content changes on a page because this helps it understand how quickly the page needs to be recrawled over time. If a page changes significantly on a regular basis, Google will crawl it more frequently.

15 Jun 2018
