JavaScript SEO Office Hours Notes: March 25th 2020

Ruth Everett

On 30th March 2020 • 8 min read

Notes from Martin Splitt's JavaScript SEO Office Hours on March 25th 2020

 
 

As a JS Developer, What Are the Top 5 Things You Should Consider for Web Security?

Martin recommends checking for cross-site scripting (XSS) and cross-site request forgery (CSRF), as well as implementing content security headers, as they prevent a lot of problems. Set the Content Security Policy to report-only mode first, so you get feedback on problematic settings before you start enforcing them. Update your dependencies, reduce them to begin with, and don't pull in a whole module for a single piece of functionality you could build in 5 lines.
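As a minimal sketch of the report-only approach, the standard `Content-Security-Policy-Report-Only` header reports violations without blocking anything (the directives are standard; the report endpoint below is a hypothetical example):

```http
Content-Security-Policy-Report-Only: default-src 'self'; script-src 'self'; report-uri /csp-reports
```

Once the reports show the policy would not break legitimate functionality, the same directives can be moved to the enforcing `Content-Security-Policy` header.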

 

What Is The Best Way to Prevent 3rd Party Fonts From Affecting Site Performance?

While Martin recommends not worrying too much about 3rd party fonts, you can cache them on your end to make sure you are not downloading them over and over again. You can also subset a font so that you only download the characters you need, or render your site with the system/fallback font first and then swap in the web font once it loads. He also recommends reducing the number of weights in the font file; in this case, it may be worthwhile looking into inlining or hosting your own version of the font. If you can't do this for licensing reasons, at least use font display swapping.
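The subsetting and swapping techniques above can be sketched in a single `@font-face` rule, assuming a self-hosted, subset WOFF2 file (the font name and path below are hypothetical):

```css
@font-face {
  font-family: "MyWebFont";                               /* hypothetical font name */
  src: url("/fonts/mywebfont-subset.woff2") format("woff2");
  font-display: swap;           /* render text with the fallback font immediately,
                                   then swap in the web font once it has loaded */
  unicode-range: U+0000-00FF;   /* subset: only download for basic Latin text */
}
```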

 

What Are The Best Practices to Follow When Optimising Speed on JavaScript Websites?

Defer as much JavaScript as you can and, if possible, do not rely entirely on client-side rendering for your most critical content. The quicker you deliver HTML to the browser the better, and the more you rely on JavaScript the slower it will be, especially for landing pages and static pages. In this case, you would want to consider server-side rendering, or server-side rendering with hydration.

If you are starting a new project, it would be worth looking into higher-level frameworks such as Next.js, Nuxt.js or Angular Universal. If you really have to stick with client-side rendering, then at least try to make your JavaScript as fast as possible by tree-shaking your dependencies, so that you only ship the code from your dependencies that is absolutely crucial and necessary.

Consider bundling up as much code as possible, but also splitting your code along package boundaries where it makes sense. For example, you might not need all of the bundles immediately, so you could split them and prefetch the remaining bundles so that they load faster, as the browser does that all in the background.
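As one way this splitting can be configured (assuming webpack as the bundler; other bundlers have equivalent options), a sketch of a `splitChunks` setup that separates third-party dependencies into their own cacheable bundle might look like this:

```javascript
// Hypothetical webpack configuration sketch: split third-party code into its
// own "vendors" bundle so the browser can cache and prefetch it independently
// of the application code, which changes far more often.
const config = {
  optimization: {
    splitChunks: {
      chunks: 'all', // consider both synchronous and async chunks for splitting
      cacheGroups: {
        vendors: {
          test: /[\\/]node_modules[\\/]/, // match anything under node_modules
          name: 'vendors',                // emit it as a shared vendors bundle
        },
      },
    },
  },
};

module.exports = config;
```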

As a general rule of thumb: the less JavaScript you ship the better.

 

For Caching Assets Such as Fonts Would You Recommend Using a Service Worker?

In short, yes and no. Yes, generally for users, for whom a service worker is fantastic for reducing network activity, and especially latency, because assets are always fetched from a local cache rather than over the network.

However, search engines are also caching heavily and aggressively themselves (and Googlebot does not run service workers), which is why Google would suggest you use long-lived caching together with fingerprinting, or some sort of versioning of your assets. Martin recommends reading the "cache: your first line of defense" article on web.dev for more details on this.
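For example, a fingerprinted asset (say, a hypothetical `app.3f2a9c1b.js`) can be served with a long-lived, immutable cache header, since its URL changes whenever its contents change:

```http
Cache-Control: public, max-age=31536000, immutable
```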

 

There is No Such Thing As a Second Wave of Crawling

"Second wave crawling" is an over-simplification that keeps coming back to Google, with interesting implications, every now and then.

Basically, a sitemap helps Google quickly discover things it would not be able to discover otherwise. If you have a well-linked structure between your pages, a sitemap is not going to increase crawling speed, but it definitely helps Google discover new content quicker.

Sitemaps are really just a different way for Google to consume content and discover links on your site.
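As a minimal sketch, a sitemap is just an XML file listing URLs you want Google to know about (the URL and date below are hypothetical examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/new-article</loc>
    <lastmod>2020-03-25</lastmod>
  </url>
</urlset>
```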

 

When Fingerprinting Resource Files, Is It Better to Change The Name Or Can Parameters Work?

Parameters will work fine here. However, it might be tricky depending on your setup; for example, if you have a cache or reverse proxy in between that strips query parameters, as some CDNs tend to do, this might not work and can be harder to debug. This is because you might not have the level of logging you need to troubleshoot issues across multiple hosts and pieces of infrastructure.

Martin said that generally it doesn't make a huge difference, but having the fingerprint in the file name is probably a little more robust across different environments; generally speaking, though, it is fine to use parameters.

 

Generally Speaking Rendering JavaScript Has Become a Lot Faster

Google runs analysis on the initial HTML, specifically to discover links (link discovery is still fastest if you are server-side rendered). But fundamentally, you can assume that every page gets rendered. Between initial crawling and rendering, the median delay in the queue is around 5 seconds, plus the time it actually takes to render your page.

 

Will The Rendering Process Stop If The Page Is Too Long?

Martin explained that generally Google would do this, but it is very hard to say specifically at which point, because it depends on a bunch of heuristics and signals. He recommends not worrying about it too much. If it is fast for a user, you should not have a problem. But if, for example, you keep a user waiting for 2 minutes before they get any content, then you may run into some problems. However, this is not an SEO-specific problem, so you shouldn't worry about it for SEO alone.

 

How Would You Recommend Displaying 404 Codes On An SPA That You Are Routing Locally?

Typically there are 2 options for signalling a 404 on an SPA: you can either noindex the page or redirect to a URL that actually serves a 404 status code. Both are equally good solutions. However, Martin noted a caveat: when Google finds a robots noindex in the initial HTML, it stops there and does not render the page, so you cannot rely on JavaScript to later remove or change that noindex. Martin recommends ensuring that whichever solution you choose is implemented correctly, as both can cause problems if they are not.
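As a sketch of the noindex option, the tag is most reliable when it is part of the HTML the server actually returns for the not-found route (the markup below is a generic example, not any specific framework's output):

```html
<!-- Served in the initial HTML for an SPA route that has no content -->
<meta name="robots" content="noindex">
```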

 


Author

Ruth Everett

Ruth Everett is a Technical SEO Analyst at DeepCrawl. You'll most often find her helping clients improve their technical SEO, writing about all things SEO, and watching videos of dogs.
