Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website in a day, and the concept was introduced to reduce server overload. Using sophisticated algorithms, Google is able to determine and set an optimal crawl budget for individual sites. This is covered in our Hangout Notes below, along with further best practice advice.

Depth of Content Affects Crawl Rates

June 28, 2016 Source

If content is buried deep within a site, it may take Google longer to discover it or to pick up changes, so improving internal linking from higher-level pages will help those pages be crawled faster.
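One quick way to find pages that are buried deep is to measure each URL's click depth from the homepage. Below is a minimal breadth-first crawl sketch using only the Python standard library; `https://example.com/` and the page limit are placeholders, and a real audit tool would also respect robots.txt and set a proper user agent.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_depths(start_url, max_pages=200):
    """Breadth-first crawl recording the click depth of each
    internal URL relative to start_url."""
    host = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue  # skip unreachable or non-HTML pages
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            target = urljoin(url, href).split("#")[0]
            if urlparse(target).netloc == host and target not in depths:
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

if __name__ == "__main__":
    # Print the 20 deepest URLs found, deepest first
    results = crawl_depths("https://example.com/")  # placeholder URL
    for url, depth in sorted(results.items(), key=lambda item: -item[1])[:20]:
        print(depth, url)
```

Pages reported with a high depth are the candidates for extra internal links from higher levels.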


Crawl Rate is Based on Pages Google Wants to Update

May 17, 2016 Source

Crawl rate sits somewhere between the minimum set of pages Google wants to update and the maximum number of pages it thinks it's safe to crawl without impacting server performance. Any newly discovered pages can be crawled provided there is some remaining budget, but they might get queued up for the next day.
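As a purely illustrative toy model (not Google's actual scheduler), the behaviour described amounts to: refresh crawls come first, newly discovered URLs fill any remaining budget, and the overflow stays queued for the following day. The function and page names below are hypothetical.

```python
from collections import deque

def plan_day(must_update, discovered_queue, safe_limit):
    """Toy model: refresh crawls come first, then newly discovered
    URLs fill whatever budget remains; anything left over stays
    queued for the next day. Illustrative only."""
    today = list(must_update[:safe_limit])
    while discovered_queue and len(today) < safe_limit:
        today.append(discovered_queue.popleft())
    return today

discovered = deque(["/new-1", "/new-2", "/new-3"])
print(plan_day(["/home", "/news"], discovered, safe_limit=4))
# -> ['/home', '/news', '/new-1', '/new-2']
print("queued for tomorrow:", list(discovered))
# -> queued for tomorrow: ['/new-3']
```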


Crawl Rate Doesn’t Affect Rankings

February 26, 2016 Source

Assuming Google is able to pick up your content, there’s no ranking benefit for a page being crawled more frequently.


PDFs Are Crawled Less Frequently Than Pages

February 23, 2016 Source

PDFs won’t be crawled as often as HTML pages because the content is generally more stable.


Google Checks Robots.txt Once per Day

January 29, 2016 Source

Most sites have their robots.txt file checked once a day. You can trigger a refresh manually by submitting it within the robots.txt Tester tool in Search Console.
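If you want to verify how often Googlebot actually fetches your robots.txt, you can count the requests in your server access logs. The sketch below assumes a standard combined log format and a hypothetical `access.log` path; it matches on the "Googlebot" user-agent string, which a stricter check would confirm via reverse DNS.

```python
import re
from collections import Counter

# Matches combined-format access-log lines, e.g.:
# 66.249.66.1 - - [29/Jan/2016:06:25:14 +0000] "GET /robots.txt HTTP/1.1" 200 ...
LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):.*?"GET /robots\.txt ')

def robots_fetches_per_day(log_path):
    """Count requests for /robots.txt per day from clients
    identifying as Googlebot."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            match = LINE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    for day, count in sorted(robots_fetches_per_day("access.log").items()):
        print(day, count)
```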


Site Traffic Doesn’t Impact Crawl Rate

December 1, 2015 Source

Traffic volume doesn’t affect crawl frequency; Google doesn’t know how much traffic a site receives. If Google can recognise important pages which frequently contain links to new pages with unique content, those will be crawled more quickly. But crawl rate, or the rate of change of a page, doesn’t have any direct relationship to ranking.


Fetch Time Impacts Crawl Rate

October 16, 2015 Source

Google recommends a page fetch time of less than one second, as sites which respond more quickly will get a higher crawl rate. This won’t directly help you rank better, but it might help new content get ranked more quickly.
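A simple way to sanity-check this is to time how long your server takes to return the full HTML of a page. The sketch below uses only the Python standard library and averages a few samples; `https://example.com/` is a placeholder, and this measures raw fetch time only, not rendering.

```python
import time
from urllib.request import urlopen

def average_fetch_time(url, samples=5):
    """Average wall-clock time to download the full HTML response.
    Approximates 'page fetch time' as discussed above; it does not
    include rendering or subresource loading."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        urlopen(url, timeout=10).read()
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

if __name__ == "__main__":
    seconds = average_fetch_time("https://example.com/")  # placeholder URL
    verdict = "OK" if seconds < 1.0 else "slower than the recommended 1s"
    print(f"average fetch time: {seconds:.2f}s ({verdict})")
```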


500 Error Pages May Impact Crawl Rate and Will Eventually Be Treated as 404s

October 16, 2015 Source

500 errors can impact rankings and might result in a lower crawl rate. If they are persistent, the affected URLs will eventually be treated like 404s and dropped. Also, Google won’t see the content of a page returning a 500, so you can’t use a meta noindex to get Google to drop those pages if that’s what you want.
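Since persistent 500s can depress crawl rate, it is worth checking your access logs for URLs serving 5xx responses to Googlebot. The sketch below assumes a combined log format and a hypothetical `access.log` path.

```python
import re
from collections import Counter

# Extracts the request path and a 5xx status code, e.g.:
# ... "GET /page HTTP/1.1" 500 ...
REQUEST = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (5\d\d) ')

def googlebot_5xx(log_path, top=20):
    """Tally URLs that returned a 5xx status to requests
    identifying as Googlebot."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            match = REQUEST.search(line)
            if match:
                counts[(match.group(1), match.group(2))] += 1
    return counts.most_common(top)

if __name__ == "__main__":
    for (url, status), count in googlebot_5xx("access.log"):
        print(status, count, url)
```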


Fresh Links Increase Crawling of Old Pages

August 14, 2015 Source

If Google sees a new link to an old page, it’s more likely to crawl that page more frequently, so adding new links to re-activated pages should get them discovered more quickly.


Related Topics

Crawling, Indexing, Crawl Budget, Crawl Errors, Disallow, Sitemaps, Last Modified, Nofollow, Noindex, RSS, Canonicalization, Fetch and Render