Crawl Rate

Crawl rate is the number of requests a search engine crawler makes to a website in a day, and rate limiting was introduced to reduce server overload. Using sophisticated algorithms, Google is able to determine and set an optimal crawl budget for individual sites. This is covered in the Hangout Notes below, along with further best practice advice.

Fetch Time Impacts Crawl Rate

October 16, 2015 Source

Google recommends a page fetch time of less than 1 second, as sites that respond more quickly are crawled at a higher rate. Faster fetch times won't directly help you rank better, but they may help get new or updated pages ranked more quickly.
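
As a rough check against the one-second guideline, here is a minimal sketch that times page fetches with Python's requests library. The URLs and threshold are illustrative placeholders, and response.elapsed only approximates what Googlebot actually measures:

```python
import requests

# Placeholder URLs to audit; swap in your own pages.
URLS = [
    "https://example.com/",
    "https://example.com/blog/",
]

THRESHOLD = 1.0  # Google's recommended fetch time, in seconds

for url in URLS:
    response = requests.get(url, timeout=10)
    # elapsed covers the time from sending the request to receiving the response headers
    fetch_time = response.elapsed.total_seconds()
    status = "OK" if fetch_time < THRESHOLD else "SLOW"
    print(f"{status}  {fetch_time:.2f}s  {url}")
```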


500 Error Pages May Impact Crawl Rate and Will Eventually Be Treated as 404s

October 16, 2015 Source

500 errors can impact rankings and may result in a lower crawl rate. If they persist, Google will eventually treat them like 404s and drop the affected pages from the index. Also, Google won't see the content of a 500 page, so you can't use a meta noindex to get Google to drop those pages if that's what you want.
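
To spot persistent 5xx responses before Google starts dropping pages, a simple monitoring sketch, again assuming a hand-maintained URL list (in practice you might pull URLs from a sitemap or crawl export):

```python
import requests

# Placeholder URLs; replace with the pages you want to monitor.
URLS = [
    "https://example.com/",
    "https://example.com/old-page/",
]

for url in URLS:
    try:
        response = requests.head(url, allow_redirects=True, timeout=10)
        if 500 <= response.status_code < 600:
            print(f"SERVER ERROR {response.status_code}: {url}")
        elif response.status_code == 404:
            print(f"NOT FOUND: {url}")
    except requests.RequestException as exc:
        print(f"REQUEST FAILED: {url} ({exc})")
```

HEAD requests are used here to avoid downloading full page bodies; some servers handle HEAD differently from GET, so a GET-based check is a safer, if heavier, alternative.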


Fresh Links Increase Crawling of Old Pages

August 14, 2015 Source

If Google sees a new link to an old page, it is more likely to crawl that page more frequently. Adding new links to reactivated pages should therefore get them discovered more quickly.


Related Topics

Crawling, Indexing, Crawl Budget, Crawl Errors, Disallow, Sitemaps, Last Modified, Nofollow, Noindex, RSS, Canonicalization, Fetch and Render