4XX Errors

4xx errors (sometimes loosely referred to as 400 errors) are HTTP client-error status codes. They most often occur when a page that once existed on a website is no longer live and has not been redirected elsewhere, in which case the server returns a 404. These 4xx status codes can impact SEO, and there are a number of ways search engines view and deal with pages returning them.
 
Learn how Google Search views 400 errors on your website (and how to solve 4xx issues for better organic search results) in this collection of ‘SEO Office Hours’ notes.
 
For more on 4xx errors, see our article on how to find 404 errors on your site.
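
As a quick starting point, you can check the status codes your URLs return with a short script. The sketch below is one possible approach, not something this article prescribes: it uses Python's requests library and a purely illustrative URL list to flag any URL that responds with a 4xx code.

```python
# Minimal sketch: report URLs on your site that return 4xx status codes.
# Assumes the requests library is installed (pip install requests);
# the URL list here is illustrative only.
import requests

urls = [
    "https://example.com/",
    "https://example.com/old-product",
    "https://example.com/blog/retired-post",
]

for url in urls:
    # HEAD keeps the check lightweight; allow_redirects follows 301/302s
    # so a properly redirected page is not reported as an error.
    response = requests.head(url, allow_redirects=True, timeout=10)
    if 400 <= response.status_code < 500:
        print(f"{response.status_code}  {url}")
```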

4xx Errors Don’t Mean Your Crawl Budget is Being Wasted

February 20, 2018 Source

Seeing Googlebot crawl old 404/410 pages doesn’t mean your crawl budget is being wasted. Google revisits these URLs only when there is nothing else left on the site to crawl, so it is actually a sign that Googlebot has spare capacity to crawl more.


Search Console Reports 404s Based on Current Index

January 12, 2018 Source

Search Console reports 404 pages based on the last time they were crawled by Googlebot.


404 Errors Don’t Impact Ranking of Other Pages

January 12, 2018 Source

404 errors don’t affect the ranking of other pages on your site; a 404 is the normal response to return for pages that don’t exist.


Expired Content Can be Redirected, 404’d or Noindexed

November 3, 2017 Source

There is no one correct way to deal with expired content on a site. If there is a relevant page that replaces the expired one, you can implement a 301 redirect. If there is no replacement page, you can initially leave the page returning a 200 with a note saying it is no longer valid, then switch to a 404 or a noindex after a period of time.
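
To make the three options concrete, here is a minimal sketch using Flask (a hypothetical stack; the routes, slugs, and replacement mapping are illustrative, not part of the original advice): a 301 to a replacement page, a 404 where there is no replacement, and a 200 with a noindex meta tag for the interim period.

```python
# Minimal Flask sketch of the three options for expired content.
# All slugs and the replacement mapping below are illustrative only.
from flask import Flask, abort, redirect

app = Flask(__name__)

REPLACEMENTS = {"old-widget": "/products/new-widget"}  # expired -> successor
RETIRED = {"discontinued-widget"}   # expired, no replacement exists
EXPIRING = {"clearance-widget"}     # recently expired, interim state

@app.route("/products/<slug>")
def product(slug):
    if slug in REPLACEMENTS:
        # Option 1: a relevant replacement exists -> 301 permanent redirect.
        return redirect(REPLACEMENTS[slug], code=301)
    if slug in RETIRED:
        # Option 2: no replacement -> 404 (a 410 Gone works similarly).
        return "This product is no longer available.", 404
    if slug in EXPIRING:
        # Option 3: serve a 200 explaining the page is no longer valid,
        # with noindex, and move the slug to RETIRED after a while.
        return (
            '<meta name="robots" content="noindex">'
            "<p>This product is no longer valid.</p>"
        )
    abort(404)  # unknown slug: the normal not-found response
```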


Serve 503s When a Site Is Down For Maintenance

October 17, 2017 Source

Serve 503 (Service Unavailable) response codes instead of 404 errors when a site is down for maintenance.
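
A rough sketch of what this can look like, again assuming a Flask stack (the flag and retry window are illustrative): while the maintenance flag is on, every request gets a 503 plus a Retry-After header, signalling to crawlers that the outage is temporary.

```python
# Maintenance-mode sketch in Flask (an assumed stack). While MAINTENANCE
# is True, every request returns 503 Service Unavailable rather than a
# 404, so crawlers treat the outage as temporary instead of dropping pages.
from flask import Flask

app = Flask(__name__)
MAINTENANCE = True  # in practice, flip this via config or an env var

@app.before_request
def maintenance_gate():
    if MAINTENANCE:
        # Retry-After hints when crawlers should come back (in seconds).
        return "Down for maintenance.", 503, {"Retry-After": "3600"}
```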


Google Can Assume Server Error For Non-Loading Page Components

September 19, 2017 Source

If a page is built from different components generating HTML and one of them looks like it’s producing a server error (e.g. a comment section that sometimes doesn’t load), Google may assume the whole page is returning a server error and drop it from search. Usually these pages are flagged as soft 404s in Search Console.
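
One way to guard against this is to fail such components gracefully on the server, so an error in one fragment never surfaces as a server error for the whole page. A hypothetical sketch (the function names and failure mode are invented for illustration):

```python
# Hypothetical sketch: render a page whose comment section may fail,
# without letting that failure become a server error (or make the page
# look broken enough to be treated as a soft 404).

def load_comments(post_id):
    # Stand-in for a flaky backend call that sometimes raises.
    raise TimeoutError("comments backend unavailable")

def render_post(post_id):
    try:
        comments_html = load_comments(post_id)
    except Exception:
        # Degrade to a placeholder instead of propagating a 5xx.
        comments_html = "<p>Comments are temporarily unavailable.</p>"
    return f"<article>Post {post_id}</article>{comments_html}"

print(render_post(42))  # the page still renders with a valid 200 body
```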


Increase in GSC 403 Errors Could Be Due to Bot or Denial-of-Service Protection From a CMS or Host

September 5, 2017 Source

An influx of 403 Forbidden errors in Search Console could be due to bot protection or denial-of-service protection built in by a site’s host or CMS. This might be triggered if Google fetches a lot of pages in a short space of time. John recommends checking server logs to get a better understanding of what is happening. Fetch & Render can be used to check whether Googlebot is blocked from specific URLs.
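
Checking the logs can be as simple as filtering access-log lines for 403 responses served to Googlebot. A rough sketch, assuming the common Apache/nginx combined log format (adjust the field positions for your own log layout):

```python
# Rough sketch: count 403 responses served to Googlebot in an access log.
# Assumes the combined log format, where the status code is the 9th
# space-separated field and the requested path is the 7th.
from collections import Counter

hits = Counter()
with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        fields = line.split()
        if len(fields) > 8 and fields[8] == "403":
            hits[fields[6]] += 1  # tally by requested path

for path, count in hits.most_common(20):
    print(count, path)
```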


Soft 404 Pages May Be Indexed then Later Dropped Out of the Index

May 5, 2017 Source

Google may initially index pages that are later classified as soft 404s, dropping them from the index once it has processed the content.


Google Doesn’t Care About 404 Page Quality

February 10, 2017 Source

Google doesn’t care how good your 404 page is or what content you include.
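
Content aside, what does matter technically is that a custom 404 page actually returns a 404 status code; serving it with a 200 creates the kind of soft 404 discussed above. A minimal sketch, assuming the same hypothetical Flask stack as the earlier examples:

```python
# Minimal sketch: a custom error page that still returns a real 404
# status. Returning it with a 200 would produce a soft 404 instead.
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def not_found(error):
    # The second tuple element keeps the status code at 404.
    return "<h1>Page not found</h1><p>Try the homepage.</p>", 404
```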


Related Topics

5XX Errors

Redirects