Sitemaps

A sitemap is a list of the live URLs on a site, used to tell search engine crawlers which pages are most important and should therefore be crawled and indexed. There are several things to consider when creating sitemaps, as well as how search engines interpret them. We cover a range of these topics in our Hangout Notes, along with best practice recommendations and advice from Google.

Google Ignores Irrelevant Sitemap Content

July 8, 2016 Source

Google will ignore any information in Sitemaps which it doesn’t recognise, so you can include additional information for other purposes.
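As a minimal sketch of what this allows: elements from a custom namespace can sit alongside the standard tags, and Googlebot will simply skip them. The URLs and the `internal` namespace below are hypothetical.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:internal="https://www.example.com/schemas/internal">
  <url>
    <loc>https://www.example.com/page/</loc>
    <!-- Custom element Google doesn't recognise; it is ignored by Googlebot
         but can be used by your own tooling -->
    <internal:owner>content-team</internal:owner>
  </url>
</urlset>
```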


No Good Solution for Reactivating Pages

July 1, 2016 Source

If you have pages which expire but are reactivated after a period of time, there is no ideal solution. You can, however, use a Sitemap to tell Google about URLs which have become active again, and add the unavailable_after meta tag to pages that are due to expire.
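The unavailable_after directive goes in a robots meta tag on the expiring page itself; the date below is illustrative.

```html
<!-- Tells Google not to show this page in search results after the
     given date/time (illustrative date) -->
<meta name="robots" content="unavailable_after: 2016-08-01">
```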


Mobile Sites Don’t Need Sitemaps

May 17, 2016 Source

Separate mobile sites should canonicalise to the desktop pages, so you don’t need to submit them to Google via a Sitemap, but it’s still worth adding them to Search Console.
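A sketch of the standard separate-URLs annotation this note assumes, with hypothetical desktop and m. URLs:

```html
<!-- On the mobile page (e.g. https://m.example.com/page/),
     canonicalise to the desktop URL -->
<link rel="canonical" href="https://www.example.com/page/">

<!-- On the desktop page, reference the mobile equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/">
```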


HTML Sitemaps Help Indexing and Crawling

February 26, 2016 Source

If you have a complicated website, providing an HTML mapping of your category pages can help Google find pages and understand the structure of your website.
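An HTML sitemap can be as simple as a crawlable page of links to your main category pages; the page and URLs below are hypothetical.

```html
<!-- A simple HTML sitemap page, linked from the site footer -->
<h1>Site Map</h1>
<ul>
  <li><a href="/categories/shoes/">Shoes</a></li>
  <li><a href="/categories/bags/">Bags</a></li>
  <li><a href="/categories/accessories/">Accessories</a></li>
</ul>
```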


Cross Domain Sitemaps Will Be Crawled If Present in Robots.txt

February 23, 2016 Source

Google will use Sitemaps hosted on an external domain if they are referenced in the robots.txt.
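For example, a robots.txt on your own domain can point to a Sitemap hosted elsewhere; the CDN hostname here is hypothetical.

```
# robots.txt on https://www.example.com/robots.txt
User-agent: *
Allow: /

# Sitemap hosted on a different domain; Google will still crawl it
# because it is referenced here
Sitemap: https://cdn.example-assets.com/sitemaps/sitemap.xml
```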


RSS + PubSubHubbub Is Better Than XML Sitemaps

September 11, 2015 Source

John recommends using RSS with PubSubHubbub as the fastest way to get new content indexed.
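A minimal sketch of the setup this implies: an RSS feed that declares a PubSubHubbub hub, so the hub can be pinged (and subscribers notified) as soon as a new item is published. The feed URLs are hypothetical; the hub URL shown is Google's public hub as it existed at the time.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Example Blog</title>
    <link>https://www.example.com/</link>
    <description>New content feed</description>
    <!-- Declares the PubSubHubbub hub to notify on new items -->
    <atom:link rel="hub" href="https://pubsubhubbub.appspot.com/"/>
    <item>
      <title>New Article</title>
      <link>https://www.example.com/new-article/</link>
      <pubDate>Fri, 11 Sep 2015 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```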


Submit Canonical URLs in Sitemaps and Other References

April 24, 2015 Source

Use consistent canonical URLs in Sitemaps and other internal references.
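In practice this means the `<loc>` in your Sitemap should use the exact same form (protocol, hostname, trailing slash) as the page's rel="canonical" and your internal links; the URL below is hypothetical.

```xml
<!-- Sitemap entry: this <loc> should match the URL used in the page's
     rel="canonical" tag and in internal links, character for character -->
<url>
  <loc>https://www.example.com/widgets/</loc>
</url>
```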


Use Sitemaps for Redirect Discovery

April 24, 2015 Source

If you want Google to see your redirected URLs, such as after a URL change, it’s OK to submit the old URLs in a Sitemap to help Google recrawl them more quickly.
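A sketch of a temporary migration Sitemap, assuming a hypothetical old URL that now 301-redirects to its new location:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Old URL that now 301-redirects to /new-page/; listing it here
       prompts Google to recrawl it and pick up the redirect sooner -->
  <url>
    <loc>https://www.example.com/old-page/</loc>
    <lastmod>2015-04-24</lastmod>
  </url>
</urlset>
```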


Break XML Sitemaps into Small Chunks

December 23, 2014 Source

Breaking up XML Sitemaps into smaller groups can give you more feedback on indexing issues, which are reported separately for each Sitemap in Webmaster Tools.
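One way to organise the smaller chunks is a Sitemap index file referencing a separate Sitemap per section, so indexing stats are reported per section; the filenames are hypothetical.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child Sitemap gets its own indexed/submitted counts -->
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```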


Related Topics

Crawling Indexing Crawl Budget Crawl Errors Crawl Rate Disallow Last Modified Nofollow Noindex RSS Canonicalization Fetch and Render