Sitemaps

A sitemap is a list of the live URLs that exist on a site. It is used to tell search engine crawlers which pages are the most important, and therefore which ones should be crawled and indexed.

There are several things to consider when creating sitemaps, as well as understanding how search engines view them. We cover a range of these topics within our SEO Office Hours Notes below, along with best practice recommendations and Google’s advice on sitemaps.

For more on sitemaps and SEO, check out our article: How to Improve Website Crawlability with Sitemaps.

RSS with Pubsubhubbub to get URLs Indexed

You can use an RSS feed with Pubsubhubbub to ping Google with new URLs as an alternative to Sitemaps.
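
As a rough illustration, the Python sketch below (with a hypothetical feed URL) sends a publish ping to Google's public PubSubHubbub hub, assuming the feed already declares that hub:

```python
# Minimal sketch: notify a PubSubHubbub (WebSub) hub that an RSS feed has new entries.
# Assumes the feed at FEED_URL already declares this hub in its <atom:link rel="hub"> element.
import requests

HUB_URL = "https://pubsubhubbub.appspot.com/"   # Google's public hub
FEED_URL = "https://www.example.com/feed.xml"   # hypothetical feed URL

response = requests.post(
    HUB_URL,
    data={"hub.mode": "publish", "hub.url": FEED_URL},
)

# Hubs typically respond with 204 No Content when the publish ping is accepted.
print(response.status_code)
```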

10 Feb 2017

Use Sitemap Index Count to Measure Indexing

The site: search operator is not an accurate measure of indexed pages, and could be off by ‘orders of magnitude’. The Sitemap Index count in Search Console is accurate for the specific URLs submitted.

2 Dec 2016

Submit Old URLs in Sitemaps when Moving Domains

When moving domains, you can submit old redirecting URLs in a Sitemap with a changed date, on either domain, to help get them re-crawled more quickly.
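
As a sketch of what such a Sitemap might look like, the Python snippet below (with hypothetical URLs and a hypothetical date) lists the old redirecting URLs with a lastmod set to the date the redirects went live:

```python
# Minimal sketch: build a temporary Sitemap of old, now-redirecting URLs with a fresh
# <lastmod>, to encourage Google to recrawl them after a domain move.
# The URLs and date are hypothetical.
from xml.etree import ElementTree as ET

old_urls = [
    "https://old-domain.example/page-a",
    "https://old-domain.example/page-b",
]
migration_date = "2016-11-29"  # the date the redirects were put in place

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in old_urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    ET.SubElement(entry, "lastmod").text = migration_date

ET.ElementTree(urlset).write("old-urls-sitemap.xml", encoding="utf-8", xml_declaration=True)
```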

29 Nov 2016

Sitemaps on Separate Domains Require Both Sites to be Verified in Search Console

You can submit Sitemaps containing URLs on a different domain if both sites are verified in the same Search Console account.

23 Sep 2016

Search Console Indexed URL Counts Report Exact URLs in Sitemaps

The Search Console indexed pages count for Sitemaps uses exact URLs, so variations such as www/non-www or trailing slash differences won't be reported as indexed.
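
One way to spot the mismatch is to compare each listed URL with the URL the site actually serves after redirects; the Python sketch below (with hypothetical URLs) flags entries that won't match exactly:

```python
# Minimal sketch: flag Sitemap URLs whose final, post-redirect form differs from the
# exact string listed, e.g. non-www redirecting to www, or a missing trailing slash.
# The URLs are hypothetical.
import requests

sitemap_urls = [
    "http://example.com/page",        # may redirect to https://www.example.com/page/
    "https://www.example.com/page/",
]

for listed in sitemap_urls:
    final = requests.head(listed, allow_redirects=True).url
    if final != listed:
        print(f"Sitemap lists {listed} but the site resolves it to {final}")
```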

20 Sep 2016

Sitemaps Hosted in Subdirectories are Only Valid for URLs in the Same Directory

Sitemaps located in a subdirectory are only valid for URLs in that subdirectory.
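
A quick way to audit this is to check that every listed URL sits under the directory the Sitemap is hosted in; the Python sketch below (with hypothetical URLs) prints any URL that falls outside the Sitemap's own path:

```python
# Minimal sketch: verify that each URL in a Sitemap is within the subdirectory where
# the Sitemap file itself is hosted. The URLs are hypothetical.
from urllib.parse import urlparse
import posixpath

sitemap_url = "https://www.example.com/blog/sitemap.xml"
listed_urls = [
    "https://www.example.com/blog/post-1",      # in scope
    "https://www.example.com/products/widget",  # out of scope for /blog/sitemap.xml
]

sitemap_dir = posixpath.dirname(urlparse(sitemap_url).path) + "/"
for url in listed_urls:
    if not urlparse(url).path.startswith(sitemap_dir):
        print(f"Not covered by this Sitemap: {url}")
```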

20 Sep 2016

Google Measures Sitemap Trust

Google has a trust rating per Sitemap, based on whether the last modified dates are used correctly. If Google learns that the last modified dates provide useful information, trust in the Sitemap increases; otherwise it will start to ignore the last modified dates.

9 Sep 2016

Crawl Errors Priority Metric Includes a Mixture of Signals

The priority metric for crawl errors in Search Console is based on a mixture of signals: whether the page is returned in search results, included in Sitemaps, and has internal links. The higher-priority errors are the ones Google thinks might have content it wants to index.

9 Sep 2016

Add Last Modified to Redirects in Sitemaps

When redirecting URLs, include them in a Sitemap with a last modified date set after the redirect was put in place; this will encourage them to be crawled more quickly.

9 Aug 2016

Last Modified In Sitemaps Aids Crawling

Google thinks the Last Modified date in an XML Sitemap can be very useful to help them recrawl URLs, and they also support RSS and Atom feeds.

8 Jul 2016
