React and SEO in eCommerce

September 24, 2018

by Chris Carreck

As a digital agency, one of the pillars of our business is SEO. We're proud to partner with major eCommerce players on effective search engine optimization strategies and execution plans. It is our mission to produce rich, immersive experiences across web and mobile that ensure year-on-year inbound traffic growth.

React.js and React Native are generally our front-end libraries of choice, so a challenge we face is how to integrate the latest technologies with proven SEO practices. We aim for all of our platforms to perform beyond expectations - from search bots to our end users.

For those who may not know, React sites aren't structured like a normal web page. If you have ever viewed the source of a ReactJS site, you may have noticed that the page is a bundle of JavaScript - as opposed to a combination of HTML and CSS. This approach has huge consequences for SEO experts, and we're here to explain why.
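To make that concrete, here is a minimal, illustrative client-side-only entry point (a sketch, not code from one of our builds). Everything the shopper sees is generated in the browser, so the raw HTML a crawler downloads is little more than an empty div and a script tag:

```jsx
// index.js - a typical client-rendered React entry point (illustrative sketch).
// All page content is produced in the browser, so "view source" shows only an
// empty <div id="root"> plus the JavaScript bundle that fills it in.
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App';

ReactDOM.render(<App />, document.getElementById('root'));
```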

React + SEO Impact

Unfortunately, some bots cannot fully render JavaScript. They may not see your site the same way the end user does, and as such, JavaScript affects the crawlability of your site. Getting this right can be difficult, so here is CLD's recipe for getting the most value and SEO impact out of our products.

Luckily for us, Google announced several years ago that it can crawl JavaScript, which has increased the chances that it can see and render almost everything on a React site. As a digital agency, it's our job to ensure that crawlers, not just Google's, can read everything on our sites. Further to this, we ensure that testing tools such as Moz and Screaming Frog render our sites correctly, so we can improve our SEO through their analytics.

The first step is extensive testing. It's widely thought that Googlebot is based on Chrome 41, and this is where our testing begins. Download Chrome 41, load the site, and use the developer tools to check for any JavaScript rendering errors. Any errors caught here are likely to block Googlebot from effectively parsing the JS. We consider polyfills and graceful degradation reliable techniques at this phase of testing: we want to ensure top-of-the-line experiences across the latest and older browsers alike. However, this tactic is not foolproof. Not all bots can render JavaScript, including some of the SEO monitoring tools we rely on, primarily Moz.
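As a simple illustration of the polyfill side of this (a sketch, not code lifted from one of our projects), feature-detect newer APIs before relying on them so older engines still get a working page:

```js
// Illustrative polyfill: Chrome 41-era engines don't ship Object.entries,
// so provide a fallback before any code that depends on it runs.
if (typeof Object.entries !== 'function') {
  Object.entries = function (obj) {
    return Object.keys(obj).map(function (key) {
      return [key, obj[key]];
    });
  };
}
```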

To get around these limitations, we've used a number of techniques across different projects here at Creative Licence:

Using a Service such as Prerender.io

Prerender.io will scan the site, render the JS, and then supply this fully rendered version to incoming bots. We have used this on small to mid-range sites to provide excellent visibility to all major search engines and our primary SEO tools, and it has been highly effective for us.
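If the site is served by Express, the integration is typically a single middleware via the prerender-node package. The token below is a placeholder and the directory layout is hypothetical - treat this as a sketch of the setup rather than our production config:

```js
// server.js - sketch of wiring Prerender.io into an Express server.
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// The middleware detects crawler user agents and proxies those requests to the
// Prerender service, which returns fully rendered HTML; regular visitors still
// receive the normal React bundle.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

// Serve the built React app to everyone else.
app.use(express.static('build'));

app.listen(3000);
```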

Server-side Rendering

Another workaround is server-side rendering. Isomorphic or universal JavaScript, using a framework such as Next.js, allows us to render the HTML on the server. As with Prerender.io, server-side rendering allows us to serve fully rendered pages to incoming bots. This approach has the added advantage of serving fully rendered pages to users while the JS loads in the background and performs any DOM manipulation required - there is no waiting for the JS to load when a user hits the home page. This approach has worked well on mid-to-large-scale products. It is a more complex implementation but offers fast client-side performance.
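Here's a rough sketch of what that looks like in a Next.js page; the page and the data it returns are hypothetical, not one of our implementations:

```jsx
// pages/index.js - a hypothetical Next.js product listing page.
import React from 'react';

const Home = ({ products }) => (
  <ul>
    {products.map(product => (
      <li key={product.id}>{product.name}</li>
    ))}
  </ul>
);

// getInitialProps runs on the server for the initial request, so bots and users
// both receive fully rendered HTML; React then hydrates it in the browser.
Home.getInitialProps = async () => {
  // In a real project this would fetch from a product API or CMS.
  return { products: [{ id: 1, name: 'Example product' }] };
};

export default Home;
```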

Snapshotting and Static Site Rendering

There are also frameworks such as react-snap and GatsbyJS that use different approaches to compile the site and output a "snapshot" - a compiled output of HTML and CSS. This approach is similar to services like Prerender.io, but you gain greater control of the process. Gatsby also supports server-side rendering, GraphQL, and a host of other features. This approach is useful if you are looking to implement a serverless, JAMstack architecture.
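For react-snap on a Create React App project, the changes are small: add "postbuild": "react-snap" to the package.json scripts, then hydrate the pre-rendered markup in the entry file. The snippet below follows the pattern from the react-snap docs rather than one of our codebases:

```js
// src/index.js - hydrate the snapshot that react-snap generated at build time,
// falling back to a normal client render (e.g. in development) when no
// pre-rendered markup exists.
import React from 'react';
import { hydrate, render } from 'react-dom';
import App from './App';

const rootElement = document.getElementById('root');
if (rootElement.hasChildNodes()) {
  hydrate(<App />, rootElement);
} else {
  render(<App />, rootElement);
}
```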

So far, we've had success with each of the techniques described above. We've used universal JavaScript for larger projects, making sure we've decided on an implementation path right from the start. For our smaller sites, where we feel the overhead of planning SSR is not necessary, we've used a combination of snapshotting and services like Prerender.io.

Be sure to follow us on Twitter @Cre8iveLicence, and if this kind of stuff interests you, then you'll love working with us.