Google Measures Sitemap Trust
Google maintains a trust rating for each Sitemap, based on whether the last modified dates are used correctly. If Google learns that the last modified dates provide useful information, that trust increases; otherwise it will start to ignore them.
Crawl Errors Priority Metric includes Mixture of Signals
The priority metric for crawl errors in Search Console is a mixture of signals: whether the page is returned in search results, included in Sitemaps, and has internal links. The highest-priority errors are the ones Google thinks might have content it wants to index.
Add Last Modified to Redirects in Sitemaps
When redirecting URLs, include them in a Sitemap with a last modified date set after the redirect was put in place; this encourages the redirects to be crawled more quickly.
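As a sketch, assuming the old URL now redirects and the redirect went live on 1 March 2024 (URL and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- This URL now 301-redirects to its replacement; lastmod is set to
       the date the redirect went live so Google recrawls it promptly -->
  <url>
    <loc>https://www.example.com/old-page/</loc>
    <lastmod>2024-03-01</lastmod>
  </url>
</urlset>
```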
Last Modified In Sitemaps Aids Crawling
Google considers the Last Modified date in an XML Sitemap very useful for deciding when to recrawl URLs, and it also supports RSS and Atom feeds for the same purpose.
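For example, a minimal Atom feed can advertise a recently updated URL via its `<updated>` element (all names, URLs, and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Site Updates</title>
  <link href="https://www.example.com/"/>
  <id>https://www.example.com/</id>
  <updated>2024-03-01T12:00:00Z</updated>
  <entry>
    <title>Updated Page</title>
    <link href="https://www.example.com/updated-page/"/>
    <id>https://www.example.com/updated-page/</id>
    <!-- The per-entry timestamp plays the same role as lastmod -->
    <updated>2024-03-01T12:00:00Z</updated>
  </entry>
</feed>
```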
Google Ignores Irrelevant Sitemap Content
Google will ignore any information in Sitemaps which it doesn’t recognise, so you can include additional information for other purposes.
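For instance, a hypothetical custom namespace can sit alongside the standard elements without affecting how Google reads the Sitemap (the `custom` namespace and element here are purely illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:custom="https://www.example.com/schemas/internal">
  <url>
    <loc>https://www.example.com/page/</loc>
    <lastmod>2024-03-01</lastmod>
    <!-- Google ignores elements in namespaces it doesn't recognise,
         so this can carry data for your own tooling -->
    <custom:owner>content-team</custom:owner>
  </url>
</urlset>
```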
No Good Solution for Reactivating Pages
If you have pages which expire but are reactivated after a period of time, there isn't really a good solution. However, you can use a Sitemap to tell Google about URLs which have become active again, and use the unavailable_after robots meta tag to signal when a page will expire.
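A sketch of the expiry side, assuming a page that should drop out of search results on 1 June 2024 (the date is illustrative; Google documents the directive with an underscore, as unavailable_after, and accepts widely adopted date formats):

```html
<!-- On the expiring page: ask Google to stop showing it after this date -->
<meta name="googlebot" content="unavailable_after: 2024-06-01T00:00:00+00:00">
```

When the page is reactivated, listing it in a Sitemap with a fresh last modified date prompts a recrawl.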
Mobile Sites Don’t Need Sitemaps
Separate mobile sites should canonicalise to the desktop pages, so you don't need to submit them to Google via a Sitemap, but it's still worth adding the mobile site to Search Console.
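A sketch of the usual annotation pair for a separate mobile site (URLs are illustrative):

```html
<!-- On the desktop page (www.example.com/page/): point to the mobile version -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/">

<!-- On the mobile page (m.example.com/page/): canonicalise to the desktop page -->
<link rel="canonical" href="https://www.example.com/page/">
```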
HTML Sitemaps Help Indexing and Crawling
If you have a complicated website, providing a mapping of your category pages can help Google find pages and understand the structure of your website.
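An HTML sitemap can be as simple as a page of plain links to your category pages (URLs and labels are illustrative):

```html
<!-- A simple HTML sitemap page exposing the category structure -->
<ul>
  <li><a href="/category/shoes/">Shoes</a>
    <ul>
      <li><a href="/category/shoes/running/">Running Shoes</a></li>
    </ul>
  </li>
  <li><a href="/category/shirts/">Shirts</a></li>
</ul>
```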
Cross Domain Sitemaps Will Be Crawled If Present in Robots.txt
Google will use Sitemaps hosted on an external domain if they are referenced in the robots.txt.
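For example, a robots.txt Sitemap directive may point at another domain (domains are illustrative):

```
# robots.txt on https://www.example.com/robots.txt
# The Sitemap directive can reference a file hosted elsewhere
Sitemap: https://cdn.example.net/sitemaps/sitemap.xml
```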