Deepcrawl is now Lumar. Read more.

4XX Errors

4xx errors (often referred to collectively as 400 errors) are HTTP client-error responses. The most common case is a 404, returned when a page that once existed on a website is no longer live and has not been redirected elsewhere. These HTTP 4xx status codes can impact SEO, and search engines view and handle pages returning 4xx error codes in a number of different ways.

Learn how Google Search views 400 errors on your website (and how to solve 4xx issues for better organic search results) in this collection of ‘SEO Office Hours’ notes.

For more on 4xx errors, see our article on how to find 404 errors on your site.

Further reading: The ABCs of HTTP Status Codes

Search Console Reports 404s Based on Current Index

Search Console reports 404 pages based on the last time they were crawled by Googlebot.

12 Jan 2018

Expired Content Can be Redirected, 404’d or Noindexed

There is no one correct way to deal with expired content on a site. If there is a relevant page that replaces the expired one, you can implement a 301 redirect. If there is no replacement page, you can initially keep serving the page with a 200 status and a message saying it is no longer valid, then either return a 404 or add a noindex after a period of time.
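The decision above can be sketched as a small routing function. This is a minimal illustration, not Lumar's or Google's implementation: the URL mapping and paths are hypothetical examples.

```python
# Hypothetical mapping of expired URLs to their replacement pages.
EXPIRED_REDIRECTS = {
    "/sale/summer-2017": "/sale/current",
}

def handle_expired(path):
    """Return (status_code, redirect_target) for an expired URL."""
    target = EXPIRED_REDIRECTS.get(path)
    if target:
        return 301, target   # a relevant replacement exists: redirect permanently
    return 404, None         # no replacement: let the URL drop out of the index

print(handle_expired("/sale/summer-2017"))  # (301, '/sale/current')
print(handle_expired("/sale/winter-2015"))  # (404, None)
```

In a real site this lookup would live in the web server or CMS routing layer, and the "serve 200 for a while, then 404 or noindex" option would be a time-based variant of the same decision.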

3 Nov 2017

Serve 503s When Site Down For Maintenance

Serve 503 response codes (service temporarily unavailable) instead of 404 errors when the site is down for maintenance.
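A maintenance page that does this can be sketched with Python's standard library; the port and the Retry-After delay below are example values, not a recommendation from the source.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answers every request with a 503 so crawlers know the outage is temporary."""

    def do_GET(self):
        self.send_response(503)                  # Service Unavailable, not 404
        self.send_header("Retry-After", "3600")  # hint: try again in an hour
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1>")

# To run during a maintenance window (example port):
# HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

Because a 503 signals a temporary condition, search engines will typically retry later rather than dropping the URLs, which is the behaviour a 404 would risk.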

17 Oct 2017

Google Can Assume Server Error For Non-Loading Page Components

If a page has different components generating HTML and one of them looks like it is generating a server error (e.g. a comment section that sometimes fails to load), Google may assume the whole page is returning a server error and drop it from search. Usually these pages are flagged as soft 404s in Search Console.

19 Sep 2017

Increase in GSC 403 Errors Could be Due to CMS Bot or Denial of Service Protection

An influx of 403 Forbidden errors in Search Console could be due to bot protection or denial-of-service protection built in by a site's host or CMS. This might be triggered if Google fetches a lot of pages in a short space of time. John recommends checking server logs to get a better understanding of what is happening. Fetch & Render can be used to check whether Googlebot is blocked from specific URLs.
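The log check John suggests can be sketched as a quick script. This assumes the common combined log format; the log lines, regex, and function name below are illustrative, and real Googlebot verification should also include a reverse-DNS check rather than trusting the user-agent string alone.

```python
import re

# Combined-log-format request line: "METHOD /path HTTP/x" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_403s(log_lines):
    """Return the paths where a 403 was served to a Googlebot user-agent."""
    hits = []
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and m.group("status") == "403" and "Googlebot" in m.group("ua"):
            hits.append(m.group("path"))
    return hits

# Fabricated sample lines for illustration only.
sample = [
    '66.249.66.1 - - [05/Sep/2017:10:00:00 +0000] "GET /page-a HTTP/1.1" 403 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [05/Sep/2017:10:00:01 +0000] "GET /page-b HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(googlebot_403s(sample))  # ['/page-a']
```

A spike in this list around the dates Search Console reports would support the bot-protection explanation.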

5 Sep 2017

Soft 404 Pages May Be Indexed then Later Dropped Out of the Index

Google may initially index pages that are later classified as soft 404s, then drop them from the index once it has processed the content.

5 May 2017

Google Doesn’t Care About 404 Page Quality

Google doesn’t care how good your 404 page is or what content you include.

10 Feb 2017

Links to 404 Pages Don’t Influence Penguin

If you 404 a page, all links pointing to that page are dropped from the link graph and won't influence the Penguin algorithm.

1 Nov 2016

404 Pages Can be Reindexed as New Page Unlike Noindex

If a page has been permanently removed by responding with an error status and it later comes back, it will be treated like a new page, unlike a page that was removed with a noindex.

23 Sep 2016

Return 404 or Noindex for Empty Pages

Google prefers you to return a 404 status or add a noindex for empty pages, like category pages with no items listed, even if temporary.
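This preference can be sketched as a small decision helper. It is a minimal illustration under stated assumptions: the function name is hypothetical, and the item count would come from your CMS or database.

```python
def empty_category_response(item_count, use_noindex=False):
    """Return (status_code, extra_head_html) for a category listing page."""
    if item_count > 0:
        return 200, ""  # normal page: index as usual
    if use_noindex:
        # Keep the URL live but tell search engines not to index it.
        return 200, '<meta name="robots" content="noindex">'
    # Otherwise drop the empty page from the index entirely.
    return 404, ""

print(empty_category_response(12))                   # (200, '')
print(empty_category_response(0))                    # (404, '')
print(empty_category_response(0, use_noindex=True))  # (200, '<meta name="robots" content="noindex">')
```

Either branch keeps thin, itemless pages out of the index; the noindex variant is easier to reverse when items return, since the URL never stopped resolving.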

9 Sep 2016
