Google May Take Longer to Process Redirects For Noindexed Pages
Google crawls noindexed pages less frequently, so if a redirect is set up on a noindexed URL, Google may take longer to discover and process it.
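While waiting for Google to pick up such a redirect, it can be useful to confirm which of your pages actually carry a noindex. A minimal sketch of a meta-robots check, assuming the tag is written in the common `name`-before-`content` order (a fuller implementation would use an HTML parser):

```python
import re

# Matches <meta name="robots" content="..."> and captures the content value.
# Assumes name appears before content, which covers the common case only.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def is_noindexed(html: str) -> bool:
    """Return True if the page's robots meta tag contains a noindex directive."""
    match = META_ROBOTS.search(html)
    return bool(match) and "noindex" in match.group(1).lower()
```

Pages flagged by a check like this are the ones where a new redirect may sit unprocessed for longer.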
Crawl Rate Has Nothing to do With Rankings
Crawling and ranking are separate processes: if Googlebot is able to crawl your site frequently, this doesn't mean your site will rank any higher.
503 Errors Reduce Crawl Rate and Crawling Stops if Robots.txt 503s
When Search Console reports a site as temporarily unreachable, the site is returning 503 errors. Googlebot will temporarily slow crawling if the site returns 503s, and will stop crawling altogether if the robots.txt file itself returns a 503.
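The robots.txt behaviour described above can be summarised as a simple status-to-behaviour mapping. This is a simplified model of the handling described here (the 404 case reflects Google's documented treatment of a missing robots.txt; exact behaviour is up to Google and may change):

```python
def robots_txt_crawl_effect(status: int) -> str:
    """Map a robots.txt HTTP status code to Googlebot's likely crawl
    behaviour. A simplified sketch, not an official specification."""
    if status == 200:
        return "crawl normally, following the rules in the file"
    if status in (404, 410):
        return "crawl normally, as though no robots.txt exists"
    if 500 <= status < 600:
        # Includes 503: crawling of the whole site stops until
        # robots.txt is reachable again.
        return "stop crawling until robots.txt is reachable again"
    return "behaviour not covered by this sketch"
```

The practical takeaway: during maintenance, a 503 on ordinary pages merely slows crawling, but a 503 on robots.txt halts it.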
Google Increases Crawl Rate For Frequently Changing Pages
Google tries to recognise when content changes on a page, as this helps it understand how quickly the page needs to be recrawled over time. If a page changes significantly on a regular basis, Google will crawl it more frequently.
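One way to signal genuine changes is an accurate `<lastmod>` value in the XML sitemap, which Google can use when it reflects real updates. A minimal sitemap entry (the domain is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- example.com is a placeholder domain -->
    <loc>https://www.example.com/news/</loc>
    <!-- only update lastmod when the page genuinely changes -->
    <lastmod>2020-01-15</lastmod>
  </url>
</urlset>
```

Updating `<lastmod>` on every request without real content changes teaches Google to ignore the signal, which works against the recrawl behaviour described above.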
Googlebot Can Recognise Faceted Navigation & Slow Down Crawling
Googlebot understands URL structures well and can recognise faceted navigation; it will slow down crawling once it works out where the primary content is and where it has strayed from it. This is aided by parameter handling in Google Search Console.
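Alongside Search Console's parameter handling, faceted URLs can also be kept out of the crawl with robots.txt wildcard rules, which Googlebot supports. A sketch, where the parameter names are hypothetical and would need adjusting to the site's actual facets:

```
# color, size and sort are example facet parameters — adjust to your site
User-agent: Googlebot
Disallow: /*?*color=
Disallow: /*?*size=
Disallow: /*?*sort=
```

This stops Googlebot spending crawl budget on facet combinations rather than relying solely on it detecting the pattern itself.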
Googlebot Crawls Known URLs Faster Following Significant Site Changes
Googlebot will crawl known URLs faster if it detects that significant changes have been made across the site to things like structured data, rel=canonical annotations or redirects.
Images Crawled Less Frequently Than Pages
Images are crawled less frequently than content, so if images are migrated (to a CDN, for example), it will take Googlebot longer to recrawl them.
Crawl Rate Issues Can be Reported For Manual Review
If a site is being crawled too much or too little, you can report a problem with Googlebot via the Help Center, and the crawl rate can be adjusted by a Google engineer.
Google Temporarily Reduces Crawl Rate After A Platform Change
When a site changes its hosting platform, Google will be more conservative about how much it crawls, as it doesn't know how much load the new server setup can support.