JavaScript Rendering

Search engines treat JavaScript content on a website differently from standard HTML content and render it separately. As the use of JavaScript on the web increases due to the features it makes possible, it is important to understand how search engines process this content and how to optimize for it. Our Hangout Notes cover best practice recommendations from Google, along with the latest advancements to their rendering engine.

Do Not Rely on 3rd Party Cookies to Render Content

March 17, 2020 Source

Because Chrome is going to block third-party cookies, and Google uses Chrome to render pages, any content on your site that depends on third-party cookies to render will not be seen by Google.

Onclick Load More JavaScript Links Are Not Triggered During Rendering

February 21, 2020 Source

Google doesn’t trigger “load more” JavaScript links during rendering. Instead, it uses frame expansion, rendering pages with an extremely long viewport to see whether the page fills the viewport with additional content automatically.
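To illustrate the implication: a click-gated “load more” handler like the one sketched below is never fired during rendering, so content behind it stays invisible to Google unless it also loads automatically or is reachable through plain links. This is a hypothetical sketch (the element, URLs, and helper names are illustrative, not from the source):

```javascript
// Click-gated loading: works for users, but Googlebot never clicks,
// so content fetched this way is not seen during rendering.
function wireLoadMore(button, fetchNextPage) {
  button.addEventListener('click', fetchNextPage);
}

// Crawlable fallback: plain <a href> pagination links that Google can
// discover and follow without executing any click handlers.
function buildPaginationLinks(baseUrl, totalPages) {
  const links = [];
  for (let page = 2; page <= totalPages; page++) {
    links.push(`<a href="${baseUrl}?page=${page}">Page ${page}</a>`);
  }
  return links.join('\n');
}
```

Pairing the button with crawlable pagination links gives users the same experience while leaving a path Google can follow.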

Different Rendering Processes are Used When Rendering a Page For Indexing & for Users

February 7, 2020 Source

Googlebot doesn’t take the rendered DOM snapshot used for indexing at a fixed point in time. The main reason lies in the way Google renders pages: different processes are used when rendering for indexing than when users access a page. This can result in elements on the site being processed differently, and it may take longer to render the page for indexing purposes.

JavaScript Redirects Take Slightly Longer For Google to Process Than 301 Redirects

January 22, 2020 Source

JavaScript redirects take longer for Google to process than 301 redirects, because the JavaScript needs to be executed first.
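For illustration, a client-side redirect looks like the sketch below. Google has to download, render, and execute the script before it discovers the target, whereas a server-side 301 is visible in the HTTP response itself. This is a hypothetical helper (the `win` parameter stands in for `window` so it can be exercised outside a browser):

```javascript
// JavaScript redirect: only discoverable after the page is rendered
// and this code runs, unlike a server-side 301.
function jsRedirect(win, targetUrl) {
  // replace() keeps the old URL out of session history, matching the
  // "permanent move" intent of a 301.
  win.location.replace(targetUrl);
  return targetUrl;
}

// In a browser: jsRedirect(window, 'https://example.com/new-page');
```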

Avoid Providing Google with Conflicting Canonical Tags When Working on JavaScript Sites

January 10, 2020 Source

If you have a JavaScript site, John recommends making sure that the static HTML page you deliver doesn’t have a canonical tag on it. Instead, add it with JavaScript, in order to avoid providing Google with conflicting information. Google is able to pick the canonical up after rendering the page and then process and use it.
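A minimal sketch of adding the canonical via JavaScript, along the lines described above. The `doc` parameter stands in for `document` so the helper can be exercised outside a browser; the function name and URL are hypothetical:

```javascript
// Inject a canonical link via JavaScript so the static HTML carries
// no conflicting canonical of its own.
function setCanonical(doc, url) {
  // Drop any canonical the server-rendered HTML may have shipped, so
  // Google only sees one value after rendering.
  const existing = doc.querySelector('link[rel="canonical"]');
  if (existing) existing.remove();

  const link = doc.createElement('link');
  link.setAttribute('rel', 'canonical');
  link.setAttribute('href', url);
  doc.head.appendChild(link);
  return link;
}

// In a browser: setCanonical(document, 'https://example.com/page');
```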

Use Chrome DevTools and Google Testing Tools to Review a Page’s Shadow DOM

December 10, 2019 Source

There are two ways to inspect a page’s shadow DOM in order to compare it to what Googlebot sees. The easiest is Chrome DevTools: in the Elements inspector you will see a #shadow-root node which you can expand to display what the shadow DOM contains. You can also use any of Google’s testing tools and review the rendered DOM, which should contain what was originally in the shadow DOM.
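For context, the sketch below shows the kind of component that produces a #shadow-root node in DevTools. It is a hypothetical, browser-oriented example; the `Base` fallback only exists so the file also loads outside a browser:

```javascript
// Fall back to a plain class when HTMLElement is unavailable
// (i.e. outside a browser), purely so the sketch stays loadable.
const Base = typeof HTMLElement !== 'undefined' ? HTMLElement : class {};

class ProductCard extends Base {
  connectedCallback() {
    // In Chrome DevTools, this markup appears under "#shadow-root";
    // in Google's testing tools it shows up in the rendered DOM.
    const root = this.attachShadow({ mode: 'open' });
    root.innerHTML = '<p>Shadow DOM content</p>';
    return root;
  }
}

// Register the element only where custom elements exist (a browser).
if (typeof customElements !== 'undefined') {
  customElements.define('product-card', ProductCard);
}
```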

When Changing Frameworks on a Site Ensure You Incrementally Test to Reduce SEO Impact

November 29, 2019 Source

When moving a site from HTML to a JavaScript framework, John recommends setting up test pages and using Google’s testing tools to ensure that everything on these pages is indexable. Once you have tested these elements, he suggests taking certain high-traffic pages on your site, converting them to the new framework, and reviewing the effect of changing these pages. It’s best to do this over a period of around a month so there is time for fluctuations to settle down.

Google Will Not Render JavaScript Content if The Page Returns a Redirect or Error Code

November 12, 2019 Source

If a page contains JavaScript content but returns a redirect or an error code, Google will not spend time rendering that content. This applies, for example, when you use JavaScript on a 404 page to display an error message or links. With redirects, Google does not need to render the content in order to follow the redirect to the new page.

Google Needs to Access JS Files & Server Endpoints Used For AJAX Requests

October 4, 2019 Source

If the JS files or server endpoints used for AJAX requests on page load are blocked in robots.txt, then Googlebot won’t be able to see or index any of the content that these requests generate.
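A hypothetical robots.txt fragment illustrating the point — the paths are placeholders, not from the source; the key is that the script directory and the API endpoints the page fetches on load are not disallowed for Googlebot:

```text
User-agent: *
# Keep JS bundles crawlable so Google can execute the page.
Allow: /assets/js/
# Keep AJAX endpoints crawlable so rendered content can be fetched.
Allow: /api/
# Blocking a path like this would hide any content loaded from it.
Disallow: /admin/
```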
