For web pages to appear in search results, they must be in Google’s index. Search engine indexing is a complex topic that depends on a number of factors. Our Hangout Notes on indexing cover a range of best-practice advice to help ensure your website’s important pages are indexed by search engines.

Index User-Generated Content if it Provides Value

December 11, 2018 Source

It is fine to index user-generated content, such as comments, but it is up to webmasters to decide whether it is valuable content that should be visible to search engines. In the case of comments, it might make sense to block them from being crawled and indexed.
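If you decide comments add no value, one common way to block crawling is a robots.txt rule covering the URLs that serve them. A minimal sketch, assuming a hypothetical /comments/ path (adjust to wherever your comment content actually lives):

```
# robots.txt — block all crawlers from the comment URLs
User-agent: *
Disallow: /comments/
```

Note that Disallow prevents crawling, not indexing; pages blocked this way can still be indexed from links alone, so a noindex directive on the pages themselves is the more reliable option if they must not appear in search at all.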

Crawling But Not Indexing Pages is Normal for Pages with Content Already on Other Indexed Pages

November 30, 2018 Source

It’s normal for Google to crawl URLs but not index them if they are not considered useful for search, such as index and archive pages whose content is already indexed on other pages. This has been the case for a long time, but these pages have become more visible recently due to the ‘Crawled – currently not indexed’ report in Search Console.

It’s Normal for Google to Index XML Sitemap Files

November 27, 2018 Source

If you see an XML sitemap file showing in the search results when you search for a specific URL on your website, this is normal and won’t cause any issues. If you don’t want XML sitemaps to be indexed, serve an X-Robots-Tag: noindex HTTP header for them.
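Because a sitemap is an XML file rather than an HTML page, you can’t use a robots meta tag; the directive has to go in the HTTP response header. A minimal sketch using Apache’s mod_headers, assuming the sitemap lives at sitemap.xml:

```
# Apache config: keep the sitemap crawlable but out of the index
<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>
```

The sitemap remains fully readable by Googlebot for URL discovery; the header only stops the file itself from appearing as a search result.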

Internally Link to Seasonal Content so Google Will Index It

November 16, 2018 Source

Publish seasonal content far enough in advance for Google to index it for the required period, and link to it internally so Google knows these pages are important and relevant to users, which will improve indexing.

‘Discovered – Currently Not Indexed’ GSC Report Pages Haven’t Been Prioritised for Crawling & Indexing

November 16, 2018 Source

Google knows about pages in the ‘Discovered – currently not indexed’ report in Google Search Console but hasn’t prioritised them for crawling and indexing. This is usually due to internal linking and content duplication issues.

It Will be Difficult For Google to Improve its Rendering Services at Scale

September 21, 2018 Source

There are plans for Google to improve its rendering services, but this will take a long time as it will be very difficult to make rendering improvements at scale, especially for websites with millions of pages that each require rendering.

Use Pre-rendering For New JavaScript Frameworks so Google Can Index Them

September 21, 2018 Source

Google uses Chrome 41 for rendering, which is an old version, so it may struggle with newer JavaScript features such as ES6 syntax and the frameworks that rely on them. To get around this issue, use pre-rendering.
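Pre-rendering (or transpiling as part of a build step) sidesteps the old renderer’s parser entirely. A minimal sketch, with illustrative function names, of ES6 syntax Chrome 41 cannot parse (arrow functions arrived in Chrome 45) alongside the ES5 form a transpiler such as Babel would produce:

```javascript
// ES6 arrow function — Chrome 41's parser rejects this syntax outright,
// so a page relying on it renders blank for Google's old renderer.
const greetES6 = (name) => 'Hello, ' + name;

// Equivalent ES5 output a transpiler or pre-rendering build step would
// emit, which Chrome 41 can execute without issue.
var greetES5 = function (name) {
  return 'Hello, ' + name;
};
```

With pre-rendering, the server goes one step further and ships the fully rendered HTML, so the crawler never has to execute either version.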

Ecommerce Websites Should Avoid Serving Product Page Content Via JavaScript

September 21, 2018 Source

Large ecommerce websites should steer clear of serving product page content via JavaScript as this can delay indexing of new products by weeks, as opposed to minutes when you serve content in the HTML.

For Mobile-first, Ranking Fluctuations Are Caused by Google Recrawling and Reprocessing a Site

September 21, 2018 Source

If a site experiences ranking fluctuations after being switched to mobile-first indexing, this is because Google will need to recrawl and reprocess the site to update the index.

Related Topics

Crawling Crawl Budget Crawl Errors Crawl Rate Disallow Sitemaps Last Modified Nofollow Noindex RSS Canonicalization Fetch and Render