JavaScript Rendering

Search engines treat JavaScript content on a website differently from standard HTML content and render it separately. As the use of JavaScript on the web increases due to the features it makes possible, it is important to understand how search engines view this content and how to optimize for it. Our Hangout Notes cover best practice recommendations from Google, along with the latest advancements to its rendering engine.

Google Needs to Access JS Files & Server End Points Used For AJAX Requests

October 4, 2019 Source

If the server endpoints that handle AJAX requests made by JavaScript on page load are blocked in robots.txt, then Googlebot won't be able to see or index any of the content that these requests generate.
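As an illustration of the point above, a robots.txt rule like the following (the paths here are hypothetical) would stop Googlebot from fetching the endpoint a page's JavaScript calls, so the content it returns could never be indexed:

```
# Hypothetical example: blocking the API path used by client-side
# JavaScript prevents Googlebot from seeing the content it returns.
User-agent: *
Disallow: /api/

# To make that content indexable, the endpoint would need to be
# crawlable instead, e.g.:
# Allow: /api/products/
```

Checking the URLs your JavaScript requests against your robots.txt rules (for example with Search Console's robots.txt tester) is a quick way to catch this.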

JavaScript SEO Will Evolve to be More About Debugging Issues When JS Works by Default

August 23, 2019 Source

Martin sees JavaScript SEO as evolving from working around the pitfalls with today’s technologies to knowing what can go wrong when JavaScript works out of the box and how to debug issues. Google can help by providing troubleshooting tools, but technical understanding will still be required.

JavaScript SEO Will Continue to be Necessary Due to Changing Frameworks, Complex Issues & Poor Implementations

August 23, 2019 Source

John and Martin believe that JavaScript SEO won't go away as Google's rendering capabilities improve, because of continual changes to frameworks, new features in Chrome, poor technical implementations, and the complexity of debugging issues.

Google is Rendering More Pages as it is Cheaper Resource-wise Than Running a Rendering Heuristic

August 23, 2019 Source

Googlebot is putting increasingly more pages through the render phase, even if they don't run JavaScript, because it is cheaper resource-wise for Google to render the page than to run a complex heuristic to decide whether it should be rendered.

Google Determines if Pages Need to be Rendered by Comparing Content Found in Initial HTML & Rendered DOM

August 23, 2019 Source

Google compares the raw HTML of a page from the initial crawl with the rendered DOM after rendering, to see whether rendering surfaces new content and to determine whether the page needs to be rendered going forward.
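Google's actual comparison is internal and undocumented, but the general idea described above can be sketched as diffing the text visible in the rendered DOM against the text in the raw HTML (the tokenisation here is deliberately crude and purely illustrative):

```python
# Illustrative sketch only: real systems parse the DOM properly rather
# than regex-stripping tags, and Google's heuristic is not public.
import re


def visible_text(html: str) -> set:
    """Crudely strip tags and split the remaining text into words."""
    text = re.sub(r"<[^>]+>", " ", html)
    return set(text.split())


def needs_rendering(raw_html: str, rendered_dom: str) -> bool:
    """True if rendering surfaced words that the raw HTML lacked."""
    new_words = visible_text(rendered_dom) - visible_text(raw_html)
    return bool(new_words)


raw = "<html><body><div id='app'></div></body></html>"
rendered = "<html><body><div id='app'><p>Product description</p></div></body></html>"
print(needs_rendering(raw, rendered))  # True: rendering added new content
```

A page whose rendered DOM matches its raw HTML gains nothing from rendering, which is why such a comparison can inform whether to keep rendering it.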

More or Less Every New Website is Rendered When Google Crawls it For the First Time

August 23, 2019 Source

Nearly every website goes through the two waves of indexing when Google sees it for the first time, meaning it isn’t indexed before it has been rendered.

Ensure Hidden Content is Set Up With CSS Rather Than JS if You Want the Content to be Indexed

June 14, 2019 Source

If you have hidden content that you want to be indexed, ensure it is implemented using CSS rather than server-side JavaScript, so that Google can see the content has been loaded when it crawls and renders the page.
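One way to follow this advice is to include the content in the served HTML and only toggle its visibility with CSS, as in this sketch (class names and content are illustrative):

```html
<!-- The content is present in the HTML Google receives; CSS only hides
     it visually, so it can still be crawled, rendered, and indexed. -->
<style>
  .tab-panel { display: none; }
  .tab-panel.is-active { display: block; }
</style>

<div class="tab-panel is-active">Shipping information for all regions.</div>
<div class="tab-panel">Returns policy: items can be returned within 30 days.</div>

<!-- By contrast, content fetched by JavaScript only after a user click
     may never be requested by the crawler, and so never indexed. -->
```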

An Updated User Agent is Expected to Reflect The New Modern Rendering Infrastructure

June 14, 2019 Source

Google has been experimenting with the current user agent settings and is rethinking the setup. John expects some changes to be announced in the future around an updated user agent, so that it reflects the new, modern rendering infrastructure.

Look Into Server-side Rendering For Improved UX as Dynamic Rendering is a Temporary Workaround for Crawlers

June 11, 2019 Source

Dynamic rendering is a temporary workaround that allows search engines and social media crawlers to access content even if they can't render JavaScript. John foresees dynamic rendering becoming less useful in a few years as all crawlers get better at processing JavaScript, so look into server-side rendering for an improved experience for users.
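The dynamic rendering workaround described above amounts to user-agent switching: serve a pre-rendered snapshot to known crawlers and the normal client-side app to everyone else. A minimal sketch, assuming a simple substring check (the crawler tokens below are common examples, not an exhaustive or authoritative list):

```python
# Hedged sketch of dynamic rendering: route crawlers that may not
# execute JavaScript to a pre-rendered snapshot. The token list and the
# return labels are illustrative, not a production implementation.
CRAWLER_TOKENS = ("googlebot", "bingbot", "twitterbot", "facebookexternalhit")


def wants_prerendered(user_agent: str) -> bool:
    """True if the request looks like it comes from a known crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)


def serve(user_agent: str) -> str:
    # In a real server these labels would be the two response variants.
    return "prerendered-html" if wants_prerendered(user_agent) else "client-side-app"


print(serve("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # prerendered-html
print(serve("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))  # client-side-app
```

With server-side rendering, by contrast, every visitor receives the rendered HTML, which is why it improves the experience for users as well as for crawlers.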
