Google, JavaScript and the AJAX Crawling Scheme

Tristan Pirouz

On 3rd November 2015 • 4 min read

In October 2015, Google announced that they no longer recommend the AJAX crawling scheme they first proposed in 2009, and that JavaScript sites should now be fine for SEO (provided your JavaScript and CSS files aren’t blocked).

The announcement came alongside significant changes to the Search Console Help Guide to AJAX crawling for webmasters and developers, and Google say the new approach could help other search engines crawl AJAX sites as well.


What does this mean for SEOs?

Essentially, it’s no longer necessary to use the escaped fragment solution to get hashbang URLs crawled by Google (and potentially other search engines), because Google can now render and understand JavaScript content. Google will crawl and index the #! URLs themselves instead of requesting the ?_escaped_fragment_= versions, although plain hash (#) fragments are still not indexed as separate URLs.
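As a reminder of what is being retired: under the old scheme, a crawler that saw a #! URL requested an ?_escaped_fragment_= version of it instead, and the server had to answer with an HTML snapshot. The helper below is only an illustrative sketch of that mapping (the function name and example URL are made up):

```typescript
// Illustrative sketch of the deprecated AJAX crawling scheme’s URL mapping:
// the crawler swapped the "#!" fragment for an "_escaped_fragment_" parameter
// and expected the server to respond with a pre-rendered HTML snapshot.
function toEscapedFragmentUrl(hashbangUrl: string): string {
  const [base, fragment = ""] = hashbangUrl.split("#!");
  const separator = base.includes("?") ? "&" : "?";
  return `${base}${separator}_escaped_fragment_=${encodeURIComponent(fragment)}`;
}

// Hypothetical example:
// toEscapedFragmentUrl("https://example.com/#!/products/shoes")
//   -> "https://example.com/?_escaped_fragment_=%2Fproducts%2Fshoes"
```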

Sites using the escaped fragment solution will still be indexed and ranked as long as JavaScript or CSS files aren’t being blocked, but Google now recommends using progressive enhancement when it comes to redeveloping the site.
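As a rough sketch of what progressive enhancement looks like in practice (the data-enhance attribute and page structure here are assumptions for illustration): the links are ordinary server-rendered HTML that crawlers and non-JS visitors can follow, and JavaScript upgrades them to in-page navigation where it is available.

```typescript
// Progressive enhancement sketch: plain <a href> links work without JavaScript;
// where JavaScript runs, the same links are upgraded to in-page AJAX navigation.
document.querySelectorAll<HTMLAnchorElement>("a[data-enhance]").forEach((link) => {
  link.addEventListener("click", async (event) => {
    event.preventDefault();
    // Fetch the real URL the link already points to (no #! fragments needed),
    // pull out the main content and keep the address bar in sync.
    const html = await (await fetch(link.href)).text();
    const doc = new DOMParser().parseFromString(html, "text/html");
    const main = document.querySelector("main");
    if (main) {
      main.innerHTML = doc.querySelector("main")?.innerHTML ?? html;
      history.pushState(null, "", link.href);
    }
  });
});
```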

 

Getting JavaScript content indexed by Google

Although Google’s ability to crawl AJAX websites is a significant development, it still comes with limitations, including:

 

Google might not render the JavaScript correctly

In a recent Google Webmaster Hangout, John Mueller mentioned that Google’s rendering of JavaScript pages is ‘incomplete’, but that they are constantly improving Googlebot’s ability to render and understand them.
Pages need to be tested carefully with the Fetch and Render tool in Search Console, and you might need to experiment to work out which JavaScript elements are getting stuck.

John did admit that it can be very difficult to pin down the cause of an issue at the moment, so if you really can’t find the problem he suggests pre-rendering as a temporary solution until Googlebot is able to pick the content up properly.
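If you do fall back on pre-rendering as that temporary measure, the usual approach at the time was to detect crawler user agents and serve an HTML snapshot generated ahead of time. The Node sketch below is only an outline of that idea (the snapshot directory and bot pattern are assumptions), and note the caveat discussed further down about keeping the pre-rendered content the same as what users see:

```typescript
import * as http from "http";
import * as path from "path";
import { promises as fs } from "fs";

// Assumption for this sketch: HTML snapshots have already been generated
// (e.g. with a headless browser) and saved to ./snapshots, one file per route.
const SNAPSHOT_DIR = path.join(__dirname, "snapshots");
const BOT_PATTERN = /googlebot|bingbot|yandex/i;

http.createServer(async (req, res) => {
  const userAgent = req.headers["user-agent"] ?? "";
  res.writeHead(200, { "Content-Type": "text/html" });

  if (BOT_PATTERN.test(userAgent)) {
    // Crawlers get the pre-rendered snapshot for the requested route.
    const name = !req.url || req.url === "/" ? "index" : req.url.replace(/[^\w-]+/g, "_");
    res.end(await fs.readFile(path.join(SNAPSHOT_DIR, `${name}.html`), "utf8"));
  } else {
    // Regular visitors still get the normal client-rendered page.
    res.end(await fs.readFile(path.join(__dirname, "index.html"), "utf8"));
  }
}).listen(3000);
```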

 

Make sure that Google can crawl all of your CSS and JS files

This now applies to both desktop and mobile pages. On mobile it’s required so that Google can recognise your page as mobile-friendly.
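In practice this usually comes down to robots.txt: if script or stylesheet paths are disallowed, Googlebot can’t fetch them when it renders the page. A minimal illustration (the directory names are made up for this example):

```
User-agent: Googlebot
# Rules like these cause the problem by stopping Googlebot from fetching
# the files it needs to render the page:
# Disallow: /assets/js/
# Disallow: /assets/css/

# Making the script and stylesheet files crawlable avoids it:
Allow: /*.js$
Allow: /*.css$
```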

 

Fetch and Render views might not be consistent if JS/CSS is disallowed

The Fetch and Render tool shows you two different renders: one for Googlebot, which uses the Googlebot user agent, and one for users, which uses a browser user agent. If JS/CSS is disallowed for Googlebot, the content may not render in the same way in both views.

 

JavaScript still not ideal where speed is important

Google is running JavaScript for most pages, but this sometimes happens at a later stage, so it might not be ideal for content where speed of publishing is critical, such as on a news website.

 

Google’s official advice seems to be against pre-rendering for Google only

This is seemingly due to concerns that the pre-rendered content might differ from the version seen by users. However, John Mueller stated in a recent Google Webmaster Hangout that serving pre-rendered content to Googlebot is OK and that Google can use it.

 
