JavaScript Rendering & SEO

Search engines treat JavaScript content differently from static HTML content and render it in a separate step. As JavaScript becomes more widespread on the web thanks to the features it makes possible, it is important to understand how search engines process this content and to optimize for it. Check out our “Hamburger Analogy” for an introduction to client-side vs. server-side JavaScript rendering.

Our SEO Office Hours notes and video clips compiled here cover JavaScript-related SEO best practices from Google, along with notes on the latest advancements in its rendering engine.

What is the difference between JavaScript and HTTP redirects?

June 22, 2022 Source

John explained that, in general, Google strongly prefers server-side redirects (301 or 302 redirects, for example) to JavaScript redirects. 

If you use JavaScript to generate the redirect, Google first has to render the JavaScript to see what it does, and only then can it discover and follow the redirect. If you can’t implement a server-side redirect, you can still use JavaScript; it just takes Google longer to process. A meta-refresh redirect is another option, but again this takes longer because Google needs to figure it out.
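As a rough sketch of the difference (the paths and handler name below are hypothetical), a server-side redirect is sent in the HTTP response itself, while the JavaScript and meta-refresh alternatives only become visible after rendering:

```javascript
// Sketch: a server-side 301 redirect in a plain Node.js request handler.
// Googlebot follows this from the response headers without rendering anything.
function redirectHandler(req, res) {
  if (req.url === "/old-page") {
    res.writeHead(301, { Location: "/new-page" });
    res.end();
    return true; // redirect was issued
  }
  return false; // no redirect for this URL
}

// Client-side alternatives that Google must render first (slower to process):
//   JavaScript:   window.location.replace("/new-page");
//   Meta refresh: <meta http-equiv="refresh" content="0; url=/new-page">
```

In a real server you would plug a handler like this into `http.createServer` or your framework's routing layer; the point is simply that the 301 status and `Location` header are available on the very first request.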


APIs & Crawl Budget: Don’t block API requests if they load important content

June 22, 2022 Source

An attendee asked whether a website should disallow subdomains that are sending API requests, as they seemed to be taking up a lot of crawl budget. They also asked how API endpoints are discovered or used by Google.

John first clarified that API endpoints are normally used by JavaScript on a website. When Google renders the page, it will try to load the content served by the API and use it for rendering the page. It might be hard for Google to cache the API results, depending on your API and JavaScript setup, which means Google may crawl many of the API requests to get a rendered version of your page for indexing.

You could help avoid crawl budget issues here by making sure the API results are cached well and don’t contain timestamps in the URL. If you don’t care about the content being returned to Google, you could block the API subdomains from being crawled, but you should test this out first to make sure it doesn’t stop critical content from being rendered. 

John suggested making a test page that doesn’t call the API (or calls a broken URL for it) and seeing how the page renders in the browser and for Google.
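A minimal sketch of the timestamp problem (the endpoint and helper name are hypothetical): a stable API URL can be cached and reused across renders, while a cache-busting parameter makes every request URL unique, so each render can trigger a fresh crawl of the API:

```javascript
// Hypothetical helper: builds the URL a page's JavaScript would fetch.
function buildApiUrl(path, { cacheBust = false } = {}) {
  const url = new URL(path, "https://api.example.com");
  if (cacheBust) {
    // A timestamp makes every URL unique, so responses are never reusable
    // from cache and Google may re-crawl the API on every render.
    url.searchParams.set("t", Date.now().toString());
  }
  return url.toString();
}
```

Pairing stable URLs like this with long-lived `Cache-Control` headers on the API responses gives the renderer the best chance of reusing cached results instead of re-crawling.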


Are web components and JavaScript-only content bad for SEO? (Testing is key!)

May 13, 2022 Source

One user asked whether web components are bad from an SEO perspective. Most web components are implemented with JavaScript frameworks, and Google can process most forms of JavaScript. John also mentioned later in the video that sites that aren’t usable with JavaScript switched off typically aren’t a problem for Googlebot, as long as the relevant links and content are also available in the source code. However, John would always recommend testing a sample of pages with the URL Inspection tool before assuming your chosen JS frameworks are supported.
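One hedged way to sanity-check this yourself is to confirm that critical links appear in the raw HTML source, not only in markup a web component renders later. The sample markup and regex below are illustrative only (a naive pattern, not a robust HTML parser):

```javascript
// Naive illustration: extract href values present in a raw HTML string.
function findHrefs(html) {
  return [...html.matchAll(/<a\s[^>]*href="([^"]+)"/g)].map((m) => m[1]);
}

// "/pricing" is discoverable from the source alone; any link rendered only
// inside <my-nav> by JavaScript would depend on Google's renderer instead.
const rawHtml = '<my-nav></my-nav><a href="/pricing">Pricing</a>';
```

If a link only shows up after rendering, it isn’t necessarily a problem, but it is exactly the kind of case worth verifying with the URL Inspection tool.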


Do Not Rely on 3rd Party Cookies to Render Content

March 17, 2020 Source

Because Chrome is going to block third-party cookies, and Google uses Chrome to render pages, any content on your site that depends on third-party cookies to render won’t be seen by Google.
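A minimal sketch of the safe pattern (all names and markup below are hypothetical): render the primary content unconditionally and treat anything gated on a third-party cookie as an optional enhancement:

```javascript
// Hypothetical render function: the main content never depends on a
// third-party cookie, so Googlebot (which won't carry one) still sees it.
function renderPage({ thirdPartyCookie = null } = {}) {
  const parts = ["<main>Primary article content</main>"]; // always rendered
  if (thirdPartyCookie) {
    // Personalization is an enhancement, never a render gate.
    parts.push("<aside>Personalized widget</aside>");
  }
  return parts.join("\n");
}
```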


Onclick Load More JavaScript Links Are Not Triggered During Rendering

February 21, 2020 Source

Google doesn’t trigger “load more” JavaScript links during rendering. Instead, it uses frame expansion, rendering pages with an extremely long viewport to see whether the page expands to fill it automatically.
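Since a click is never simulated, one common workaround (sketched below with hypothetical paths) is to render “load more” as a real paginated link that client-side JavaScript progressively enhances:

```javascript
// Hypothetical markup builder: the control is an ordinary <a href>, so the
// next page of content stays reachable without any click event being fired.
function nextPageLink(currentPage, basePath = "/blog") {
  return `<a href="${basePath}?page=${currentPage + 1}" class="load-more">Load more</a>`;
}

// In the browser, a click handler can intercept this link and append the
// next page's items in place; Googlebot simply follows the href.
```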


Different Rendering Processes are Used When Rendering a Page For Indexing & for Users

February 7, 2020 Source

Googlebot doesn’t take the rendered DOM snapshot used for indexing at a specific point in time. This is because Google uses different rendering processes when rendering for indexing than when users access a page. As a result, elements on the site can be processed differently, and rendering a page for indexing purposes may take longer.


JavaScript Redirects Take Slightly Longer For Google to Process Than 301 Redirects

January 22, 2020 Source

JavaScript redirects take longer than 301 redirects for Google to understand, as the JavaScript needs to be processed first.


Avoid Providing Google with Conflicting Canonical Tags When Working on JavaScript Sites

January 10, 2020 Source

If you have a JavaScript site, John recommends making sure that the static HTML page you deliver doesn’t have a canonical tag on it; instead, use JavaScript to add it, so that you avoid giving Google conflicting information. Google is able to pick up the canonical after rendering the page and then process and use it.
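A rough way to audit this (the regex is naive and illustrative only, not a real HTML parser) is to collect the canonical tags in the rendered HTML and flag any page where more than one distinct URL appears, since that means the static HTML and the JavaScript are sending conflicting signals:

```javascript
// Naive illustration: collect canonical hrefs from an HTML string.
function canonicalHrefs(html) {
  return [...html.matchAll(/<link\s[^>]*rel="canonical"[^>]*href="([^"]+)"/g)]
    .map((m) => m[1]);
}

// Conflicting signals: static HTML declared one URL, JavaScript injected another.
const renderedHtml =
  '<link rel="canonical" href="https://example.com/a">' +
  '<link rel="canonical" href="https://example.com/b">';
const hasConflict = new Set(canonicalHrefs(renderedHtml)).size > 1;
```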


Use Chrome DevTools and Google Testing Tools to Review a Page’s Shadow DOM

December 10, 2019 Source

There are two ways to inspect a page’s shadow DOM and compare it to what Googlebot sees. The easiest is with Chrome DevTools: in the Elements inspector you will see #shadow-root, which you can expand to display the contents of the shadow DOM. Alternatively, use any of Google’s testing tools and review the rendered DOM, which should contain what was originally in the shadow DOM.


Related Topics

AJAX Caching CSS PWA