Robin Rozhon

SEO Stories of Two JavaScript-dependent Websites

I’ve heard on numerous occasions that JavaScript is evil, but I’m not sure if I agree with that.


Yes, JavaScript makes SEO more difficult, but that’s simply how the Internet has evolved. JavaScript provides great benefits to users and developers. It’s here to stay, and we need to learn how to deal with it.


This post is meant to show some examples of websites with JavaScript that I’ve encountered.


Introducing our JavaScript heroes

There are two heroes in this post.


  • White.com – All content is server-side rendered with some JavaScript features.

  • Grey.com – All content is client-side rendered via JavaScript (SPA).


You can use your Chrome browser to do quick testing because Google has upgraded its web rendering service (WRS) to Evergreen Googlebot, which is basically the latest version of Chromium (give or take a few weeks).


The other Google tools, such as the URL Inspection tool in Google Search Console and the Mobile-Friendly Test, still use the old web rendering service powered by Chrome 41 (as of June 6, 2019).


The upgrade significantly enhanced the capabilities of WRS because it brought:


  • ES6 and newer JavaScript features

  • IntersectionObserver API (great for lazy-loading)


This means Google is able to more accurately replicate what the users are seeing on JavaScript-heavy websites than in the past, but it doesn’t solve all the problems.
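For example, IntersectionObserver makes lazy-loading straightforward without the scroll-event workarounds the old Chrome 41 renderer often choked on. A minimal sketch (the data-src attribute and selector are illustrative conventions, not code from either site):

```typescript
// Lazy-load images marked with a data-src attribute once they get close to
// the viewport. This is the API the evergreen WRS can now execute.
const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src ?? ''; // swap in the real image URL
      obs.unobserve(img);              // each image only needs to load once
    }
  },
  { rootMargin: '200px' },             // start loading shortly before the image scrolls into view
);

document
  .querySelectorAll<HTMLImageElement>('img[data-src]')
  .forEach((img) => observer.observe(img));
```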


Don’t rely on Google with JavaScript rendering

We’ve all seen this diagram John Mueller and Tom Greenaway showed us during Google I/O in 2018.


Diagram of Google’s two wave indexing (Source: Google)

Google doesn’t execute JavaScript immediately. Instead, rendering happens with a delay. The delay usually ranges from a few days to a few weeks, but it can stretch to a couple of months.


If your website changes often, waiting for the second wave of indexing may not be the best idea. Think of news websites or car dealerships. This delay may hurt websites that rely on fresh content being indexed immediately.


Additionally, Google only takes canonical tags into consideration if they are available during the first wave of indexing, so canonicals set via JavaScript may not be respected.


Google has tremendously improved its JavaScript rendering capabilities, but other search engines and social media crawlers are still behind.


Do you want to risk your revenue while hoping Google can render your website properly and on time? I don’t! I want to make sure that search engines and other web crawlers don’t need to execute JavaScript to understand the website.


White.com

Chrome is fired up. JavaScript is disabled. Let’s go.


White.com in Chrome with disabled JS

This looks awful, but it was actually not that bad.


All the content is there because it’s a server-side rendered site. We can’t see the content because the opacity of the entire body tag is set to 0.


Opacity is a CSS property that adds transparency to elements.


Once all the critical resources (fonts, images, etc.) are loaded, the JavaScript changes the opacity to 1 and the entire content becomes visible.
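As a minimal sketch of that pattern (not White.com’s actual code), the page ships with `body { opacity: 0 }` in its CSS and a load handler flips it back:

```typescript
// The page ships with body { opacity: 0 } in its CSS, so nothing is visible
// until this handler runs. A crawler that doesn't execute JavaScript still
// gets the full server-rendered HTML – it just looks blank in a browser
// with JS disabled.
window.addEventListener('load', () => {
  document.body.style.opacity = '1';
});
```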


It’s not ideal, but it’s not mission critical. The website still ranks very well and drives tons of traffic from search engines.


Grey.com

A client-side rendered SPA (single-page application) always means fun.


Grey.com in Google Chrome with disabled JS

As you can see, I was extremely creative when coming up with the names of those two websites.


The solution this website uses to serve static HTML is called dynamic rendering.


Dynamic Rendering (Source: Google)

Dynamic rendering means switching between client-side rendered and prerendered content for specific user agents.


Prerender.io, Puppeteer and Rendertron are three popular rendering options. All of them use an instance of headless Chrome.


Popular services for rendering
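As a rough illustration of what these tools do under the hood, here is a minimal Puppeteer-based sketch: load the URL in headless Chrome, wait for client-side rendering to finish, and return the serialized DOM. Production setups add caching, timeouts, and error handling on top of this idea; this is not Grey.com’s actual configuration.

```typescript
import puppeteer from 'puppeteer';

// Minimal prerender sketch: render a URL in headless Chrome and return
// the resulting HTML so it can be served to crawlers as static markup.
export async function prerender(url: string): Promise<string> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    // Wait until the network goes quiet so the SPA has time to render.
    await page.goto(url, { waitUntil: 'networkidle0', timeout: 30_000 });
    return await page.content(); // serialized DOM after JavaScript has run
  } finally {
    await browser.close();
  }
}
```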

Serving static HTML only to web crawlers is done via user-agent detection.


A user agent is a string that identifies who is making the request for a page.


You want to detect all the search engines that may drive traffic to your website, such as Google, Bing, and DuckDuckGo. If you care about international markets, throw Yandex, Baidu, Seznam, and others into the mix.


Don’t forget about social media crawlers from Facebook, Twitter, LinkedIn and others because they struggle with JavaScript, too.
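To make that concrete, here is a sketch of an Express-style middleware that routes known crawlers to a prerenderer. The bot list is deliberately short and the `prerender()` helper is assumed (for instance, the Puppeteer sketch above); this is not the detection logic actually used on Grey.com.

```typescript
import type { NextFunction, Request, Response } from 'express';

// Illustrative (not exhaustive) list of crawler user-agent substrings.
const BOT_PATTERNS = [
  'googlebot', 'bingbot', 'duckduckbot',              // search engines
  'yandex', 'baiduspider', 'seznambot',               // international engines
  'facebookexternalhit', 'twitterbot', 'linkedinbot', // social media crawlers
];

function isCrawler(userAgent = ''): boolean {
  const ua = userAgent.toLowerCase();
  return BOT_PATTERNS.some((pattern) => ua.includes(pattern));
}

// Dynamic rendering: crawlers get prerendered HTML, everyone else gets the SPA.
export function dynamicRendering(prerender: (url: string) => Promise<string>) {
  return async (req: Request, res: Response, next: NextFunction) => {
    if (!isCrawler(req.headers['user-agent'])) return next();
    const html = await prerender(`https://grey.com${req.originalUrl}`);
    res.send(html);
  };
}
```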


You’re not done yet

The user agent detection is done. The prerendering solution is up and running. You’re all set to receive phenomenal visibility in search results. Aren’t you?


Dynamic rendering is tricky because it’s hard to notice rendering problems when the site is fully functional to users.


How do you notice when something is wrong with the rendering process? A decrease in organic traffic? A drop in rankings?


That’s already too late and the business is losing money.


Before you label the implementation of dynamic rendering as finished, set up automated monitoring of the rendering process.

Testing vs. monitoring

  • Testing is checking code updates before deploying them to the production site. We could talk about unit tests, integration tests, and end-to-end tests for quite some time, but that’s not the goal of this post (if you want to know more, here is a great deck about Software Testing for SEO from Mike King). I’ll just mention that testing needs to be baked into your development process, which may be challenging, especially if you’re an external agency.

  • Monitoring is checking the production site on a regular basis (i.e. daily). You can use third-party tools or write your own script and start monitoring your site today.


Automated monitoring tips

  • Request URLs for each page type/template.

  • Ensure that JavaScript is disabled.

  • Set the user agent to Googlebot Smartphone.

  • Verify the values, not just whether an element is present.


Verify SEO elements against predefined rules. Don’t just check whether an element is present in the code; check whether it contains the right value. It’s useless to know that the meta robots tag is in the code unless you know whether its value is “noindex” or “index.”
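A minimal monitoring sketch along those lines, assuming a Puppeteer-based check (the URLs and expected values are placeholders, and real checks would also cover titles, canonicals, and key content blocks):

```typescript
import puppeteer from 'puppeteer';

// Hypothetical expectations per page type/template.
const CHECKS = [
  { url: 'https://grey.com/', robots: 'index, follow' },
  { url: 'https://grey.com/category/example', robots: 'index, follow' },
];

// Simplified user-agent string; use the official Googlebot Smartphone string
// from Google's documentation in a real check.
const GOOGLEBOT = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

async function monitor(): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setJavaScriptEnabled(false); // simulate the first wave: no JS execution
  await page.setUserAgent(GOOGLEBOT);     // trigger the dynamic rendering path

  for (const check of CHECKS) {
    await page.goto(check.url, { waitUntil: 'domcontentloaded' });
    // Verify the value, not just the presence of the tag.
    const robots = await page
      .$eval('meta[name="robots"]', (el) => el.getAttribute('content'))
      .catch(() => null); // a missing tag counts as a failure
    const ok = robots === check.robots;
    console.log(`${ok ? 'OK  ' : 'FAIL'} ${check.url} robots="${robots}"`);
  }

  await browser.close();
}

monitor().catch(console.error);
```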

Rendering success rate

Monitoring of grey.com over time helped us discover problems with the rendering process we wouldn’t have discovered otherwise because the site worked perfectly fine for the users.


The rendering success rate shows the percentage of pages that were rendered correctly.

The higher the number, the happier I am.


Rendering success rate = Number of successfully rendered pages / Number of tested pages

Initially, the rendering success rate was only 62.7%. This means we returned static HTML to Googlebot for roughly 6 pages out of 10.


The other 4 times, we returned this…


Grey.com if rendering fails

There was absolutely no content. It was just this empty layout, and that wasn’t good at all.


We did some analysis and brainstorming with the developers and merged log files with the rendering data in Power BI.


Analyzing the rendering success rate in Power BI

Extending the JavaScript timeout provided a solid improvement from a 62.7% to 73.6% success rate, but that was not enough.


We got it right after making the site load properly even with cookies and storage turned off. The rendering success rate jumped to 99.6%. It’s not perfect, but it’s a number I’m very happy with.


The rendering success rate over time
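The failure mode behind that last fix is worth illustrating: if the renderer runs with cookies and Web Storage disabled, an unguarded `localStorage` call can throw and stop the whole bundle from rendering. A defensive wrapper along these lines (an assumed example, not Grey.com’s actual code) keeps the page rendering either way:

```typescript
// Headless renderers (and some privacy settings) can run with cookies and
// Web Storage disabled, in which case localStorage access throws.
// Falling back gracefully lets the rest of the app render anyway.
function safeGetItem(key: string): string | null {
  try {
    return window.localStorage.getItem(key);
  } catch {
    return null; // storage unavailable – behave as if nothing was stored
  }
}
```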

Key takeaways

  • Verify that content which is supposed to be visible is visible.

  • Detect not only search engines but also social media crawlers.

  • Set up daily monitoring. It’s very useful even for static websites.

  • Watch the rendering success rate over time.


I talked about dynamic rendering a lot in this post, but keep in mind it’s a workaround. If you’re building a new site, use universal JavaScript whenever possible.
