React SEO Best Practices

The difference between this article and other React SEO articles is that we have real experience ranking React JS applications and webpages, obtaining organic traffic for them, and we have been focused on JavaScript search engine optimization for the past few years.

Here is the deal: Google and other search engines have been crawling pages built with meta tags, CSS, and HTML, served by standard server-side rendered web apps or as static HTML, since the dawn of search engines.

Out of the box, React renders on the client. Googlebot and other search engine crawlers can handle client-side rendering, but they have to spend additional resources and make extra calls to figure out what your website or web application is all about.

Can you name one process in technology where adding extra steps and devoting more resources is a good thing? Generally, more steps are bad and a waste of resources.

Our SEO motto: make it easier for search engines, not harder.


What does this mean for your React Web App?

In short, all SPAs (single-page applications), including React apps, face a few extra challenges, and in our opinion you have to take extra steps to get your web app ranking like a 'normal website.'

It’s not a good look for a company if an open-source WordPress website with a free theme and stock images is kicking your app’s butt in the search engines.

Once your app architecture is, so to speak, on the same level as standard web apps (i.e., PHP, HTML, Ruby, etc.), more often than not your rankings will greatly improve.

What is our beef with other React articles?

Mostly, we believe they undersell the problems you will face when ranking an application in the search engines. In many cases, the authors most likely haven’t configured a React Helmet or even worked on a React website.

They point out very simple best practices, like caching images, or say things like install "sitemaps, metadata, title tags, etc." as if that alone will make Googlebot and other crawlers instantly love your React app.

Sure, these steps and practices are important, but we’ve had more than one developer come to us in shock that this alone didn’t solve the problem. 

In our experience at least, once you solve client-side rendering issues and challenges, a lot of problems that single-page applications have will be resolved.

Back to our motto, make it easier for search engines and web crawlers, not harder.

How Does Google Currently Index Javascript?

To be clear, Googlebot can index JavaScript; however, that doesn’t mean the process is without problems.

This photo from Google Developers breaks it down pretty well:

While it does work, in our experience, here is the downside:

  • Your content doesn’t get updated as quickly, meaning that if you make an important change (e.g., a holiday sale or an important SEO update), Google will need more time to process it. SEO takes long enough as it is; adding more time isn’t ideal.
  • It’s been reported that Google disables its rendering abilities during algorithm updates. Meaning, multiple times per year you may be left without the ability to update URL content.

JavaScript, and any library that uses it, can be a bit of a black box; sometimes you just don’t really know what’s going on. With standard HTML or SSR, it’s very clear how Googlebot and other search engines are treating your website or web app.

What should be your goal when making a React App Search Engine Friendly?

Googlebot and other search engines want to consume meta tags and HTML as quickly as possible, in effect allowing Google to utilize its resources efficiently. 

Yes, Google’s resources are finite, and its allotted crawl budget on any given day may not be sufficient to consume a poorly built React app with a bloated library or a misconfigured Helmet.

When defining crawl budget, Google breaks up the important factors into two main categories: Crawl rate limit and Crawl demand:

  • Crawl rate limit: In essence, this is the number of parallel connections Googlebot can use to crawl your site. The rate limit depends on crawl health (how easily your site can be crawled) and whether an optional limit is set in Google Search Console.
  • Crawl demand: This factor depends on popularity and staleness. If your site is popular, Googlebot will be more active in indexing your site. If your site becomes stale, Googlebot is more likely to lessen or eliminate frequency of indexing.

The primary goal when building a React app and creating a React helmet is to make Googlebot happy while preserving the UX enhancements achieved by developing websites with React.

It’s important to note that routine SEO concepts, such as accurate meta tags within the HTML <head>, still apply, as does every other HTML tag; these can be easily managed by utilizing the react-helmet library.
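To make the idea concrete, here is a minimal sketch in plain JavaScript of generating per-page <head> markup from a metadata object. This is not react-helmet's actual API (Helmet does this declaratively inside your components); the function name, field names, and example values are our own assumptions for illustration:

```javascript
// Sketch: build a per-page <head> fragment from a metadata object.
// react-helmet manages these tags declaratively; this plain function
// just illustrates the underlying idea of per-page metadata.
function buildHead({ title, description, canonical }) {
  return [
    `<title>${title}</title>`,
    `<meta name="description" content="${description}">`,
    `<link rel="canonical" href="${canonical}">`,
  ].join('\n');
}

const head = buildHead({
  title: 'About Us | Example Co',
  description: 'Learn more about Example Co.',
  canonical: 'https://example.com/aboutus',
});
console.log(head.includes('<title>About Us | Example Co</title>')); // true
```

Whatever tool produces them, the point is that these tags end up in the server-delivered HTML where crawlers can read them immediately.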

What Do These Challenges Mean for React?

In other words, your goal is to come as close as possible to a 1:1 ratio between your React app and a server-side web application.

It’s important to serve your content to Googlebot quickly. Crawlers tend to be impatient, and depending on Google’s current resource overhead, crawlers can overlook your page’s content if it doesn’t load fast enough. 

You need HTML rendering to happen quickly in order to better your overall SEO metrics and rank highly on Google and other search engines.

Approaches for Search Engine Optimization

In order to feed Google’s crawlers in a timely manner, React developers have a few options at their disposal:

  1. Leverage SSR (Server-Side Rendering) or SSG (Static Site Generation): Use React in creative ways by leveraging the isomorphic features baked into the framework to build hybrid implementations such as SSR and SSG. This means programmatically telling React which HTML to render on the server before shipping out the response.
  2. SEO-Friendly React Frameworks: Use frameworks like GatsbyJs or NextJs to streamline the process of implementing isomorphic React techniques.
  3. Prerendering: Feed Googlebot HTML snapshots by leveraging the react-snapshot library, or use prerendering services such as Prerender.io or Rendertron that handle the heavy lifting of implementing a snapshot-based architecture.
  4. App Optimization: Optimize the way your React app loads with native features like `lazy` and `Suspense`.

A more precise definition of our goal could be stated thus: 

Create a React app that has the ability to quickly achieve first contentful paint (FCP), allowing Googlebot crawlers to readily consume your site’s content while preserving the ability to utilize the robust tooling React brings to the table.

In layman’s terms, how quickly can your React app deliver standard HTML to crawlers?

SPAs and their SEO Effectiveness

Throughout the last decade, we’ve witnessed the popularity of JavaScript explode.

Utility libraries like jQuery were supplanted by full-blown front-end frameworks like React, often moving HTML rendering from the server to the client browser.

The emergence of NodeJs and npm has brought server-side Javascript to the mainstream, making the language a truly ubiquitous phenomenon.

Front-end JavaScript frameworks gave rise to the single-page application (SPA), allowing for highly responsive and interactive sites that perform more like native applications than clunky old websites. This evolution was significantly influenced by platforms like Facebook, which demonstrated the power of SPAs in handling vast amounts of data seamlessly. Looking at Stack Overflow’s developer surveys from 2019 onwards, it is clear that React is here to stay.

Unfortunately, the primary goal of most websites is to be seen, and Google’s crawlers didn’t respond to SPAs as well as the rest of us humans did.

Are SPAs good for SEO?

SPAs in their simplest form are not great for SEO, far from it. As an aside, we have an entire article about single-page apps and how to debug their issues.

Due to the nature of how crawlers currently work, the fact remains that the more content you can load at a fast rate, the better your site will rank for SEO purposes. 

A pure React SPA template loads an empty shell at first paint, only beginning to load content after the JavaScript bundles have fully loaded and begun populating the DOM.
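For reference, the "empty shell" served by a typical client-rendered React template looks roughly like this (simplified; the exact title and bundle path will vary by build):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>My React App</title>
  </head>
  <body>
    <!-- Crawlers see only this empty div until the JS bundle runs -->
    <div id="root"></div>
    <script src="/static/js/bundle.js"></script>
  </body>
</html>
```

Everything a crawler might want, headings, copy, links, only exists after the script executes.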

End users are typically happy with the results, but crawlers need content to crawl, and all that JavaScript holds things up.

Isomorphic React | What is it and How Does it Help with SEO?

Developers who are new to the React framework tend to minimize the full scope of what React is capable of. 

React is known for tooling that promotes feature rich SPAs, allowing for state management and many other cool features to take place on the client. However, React doesn’t limit this functionality to the client alone. Isomorphic React can enhance SEO performance by delivering crawlable HTML and easily indexed content.

Isomorphic React is just a fancy way to say that React can do similar things on both the client and the server. For instance, when a React site is hosted on a Node server, React offers libraries that can be imported into your server code to transpile your JSX into pure HTML, which can then be sent to the client.

This is what is meant by server-side rendering (SSR); the React application itself is converted from react components (JSX) to static content (HTML) before the response is ever sent back to the client. 

How SSR Helps Solve the React SEO Problem

One popular implementation of isomorphic React is server-side rendering (SSR). 

The concept of SSR has been around since the early days of web development. Early platforms such as Classic ASP used VBScript or other server-side languages to render HTML on the server. React offers this capability as well, adding much-needed flexibility when optimizing SEO for a React app.

A React module called `ReactDOMServer` is where most of the magic happens, offering a method called `renderToString()` that will accept any React element as a parameter, and return a static HTML string. Node’s native fs module can then be leveraged to create and store an HTML file based on this string, which can then be served out on a static file server built within the same Node instance.

Here’s a basic example of what this looks like:

const React = require('react');
const ReactDOMServer = require('react-dom/server');

module.exports = function render({ App }) {
  // Transpile the App component's JSX into a static HTML string
  const ssrContent = ReactDOMServer.renderToString(<App />);
  return ssrContent;
};
As you can see, we passed the App component to renderToString() in order to transpile our JSX into pure HTML, then returned the result from our render function. We now have static HTML to send from the server, eliminating some of our SEO woes by offering Googlebot quick content to crawl. This example is simplistic and somewhat contrived, but it helps illustrate how we can achieve performant SEO with the React framework.

This is the foundation upon which server-side rendering frameworks are built. Frameworks like NextJs and GatsbyJs each use their own flavor of isomorphism to maximize SEO performance to the best of their abilities.

Static Site Generation | Limitations and Strengths of SSG Apps for React

Static site generation (SSG) delivers incredible SEO performance, and no conversation about React SEO is complete without going over this tech. The concept of SSG is fairly straightforward: Convert each React page component into an HTML file at build time, which can then be hosted from a static file server. 

We’ve gone over how much Googlebot loves to crawl static and quickly loaded HTML, which is why SSG sites that are built with care typically outperform other React methodologies.

To utilize React SSG, a programmatic approach similar to our SSR example can be used. The main difference is that rather than sending the raw HTML string back to the client, an HTML file is actually created and stored on the server for hosting. The basic code usually looks something like this:

const { writeFile } = require('fs/promises');

// Render the component to static HTML, then persist it for hosting
const htmlOutput = ReactDOMServer.renderToString(someComponent());
await writeFile('/path/to/file', htmlOutput);

However, all that performance comes with some limitations. 

When using SSG, data injection is limited to either the build runtime or the client runtime. Meaning, if you have a highly complex data-driven site, you won’t be able to leverage SSR to help balance the I/O overhead.

Why not just fetch all that dynamic data on the client? Well, that can work for small to middle sized applications, but when things start scaling up, SEO performance and user experience will take a hit.

The entire point of SSG is to minimize, if not outright eliminate, HTML rendering on the client. More AJAX requests means more HTML that must be rendered via Javascript in the browser, which is what we’re trying to avoid in the first place.

 A few AJAX requests here and there to spruce things up isn’t going to hurt, but going crazy with it is no good for maintaining those performant SEO metrics.

Prerendering: What is it and how can it help your Web App?

Prerendering is a neat feature that has begun to mature as a reliable tech in recent years. 

Although there are some React libraries available (e.g., react-snapshot) that allow developers to implement their own version of prerendering, they haven’t gained much traction yet. However, services like Prerender.io have been gaining in popularity, and they offer solutions that can significantly improve a React app’s SEO metrics.

If you’re signed up with Prerender.io, it will regularly scrape your website using the latest release of Chrome and store the fully rendered pages in a database.

The frequency at which it scrapes your site depends on the plan you’re subscribed to, which affects how up to date its copy of your site is at any given time.

Ideal For Existing React Applications

The service is built as intermediary software that can be bootstrapped into your pre-existing server code. It inspects each request’s User-Agent header to determine whether the request is coming from a crawler.

If it’s a crawler, the software automatically redirects the request to the database that contains the rendered copy of your page. If it’s not a crawler, the server responds with your normal React app.
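A minimal sketch of that User-Agent check is below. The bot list is partial and purely illustrative, not Prerender.io's actual implementation:

```javascript
// Sketch: decide whether a request came from a known crawler by
// matching its User-Agent string against a (partial) bot list.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /yandex/i, /duckduckbot/i];

function isCrawler(userAgent = '') {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}

// A server would branch on this check: serve the prerendered snapshot
// to crawlers, and the normal client-rendered React app to everyone else.
console.log(isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1)')); // true
console.log(isCrawler('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0')); // false
```

Serving crawlers a faithful rendered copy of the same page users see is generally fine; serving them different content is cloaking, which is why these services snapshot the real page rather than a special version.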

Prerendering has many of the same limitations as SSG, and can actually have more in some cases. 

So, when is prerendering a suitable option? Many apps were built with React templates such as create-react-app, which loads an empty shell at first paint and isn’t an optimal solution for SEO purposes. Rather than recreating a page like this using SSR or SSG frameworks, or some other custom isomorphic implementation, it’s often convenient and effective to leave the page as is and integrate the intermediary software.

The Cost of Prerendering

In this respect, Prerender.io is a cost-effective solution that can help companies improve SEO without breaking their budget. At the time of writing, it offers a free plan that can handle caching up to 250 pages, with a cache refresh every 3 days.

The next plan up is $99/month, offering 100,000 pages and a cache refresh once a day if desired. Not bad when compared with the potential cost of redesigning the entire site!

Alternatives to Prerender.io

There are, however, open-source alternatives worth investigating; for example, Rendertron and Puppeteer are both open source. We have an entire article on Prerender vs Rendertron you can check out.

React & Javascript Frameworks that are SEO Friendly

Using React by itself to create sites with performant SEO can be a daunting task, and it’s usually a challenge undertaken only by teams with highly skilled developers. 

More commonly, a decision is made to use a React framework that fits the needs of a particular project. Here’s a list of the two most popular React frameworks that can help streamline the process of optimizing SEO performance:

  1. GatsbyJs – A static site generator with an active developer community and a rich library of plugins.
  2. NextJs – An isomorphic React framework with the ability to act as a static site generator, a server-side renderer, or both.

Not sure which one to use? Read our breakdown of Gatsby.Js vs Next.Js.

Other JavaScript frameworks offer their own solutions for similar reasons; VueJs, for example, has Nuxt.

If this strikes your fancy, here is our complete list of JavaScript frameworks for SEO.

Optimizing React Apps | The Way of Lazy and Suspense

Last and certainly not least, here are some optimization tips from our devs:

Faster first contentful paint equates to better overall user experience and SEO crawlability, and in many cases we can significantly decrease FCP time by using fairly simple optimization features built into React. 

Before moving on to more extreme measures, a wise developer will consider whether the native `lazy` function and `suspense` component can do the trick.

The `lazy` function accepts a callback that returns a dynamic import of the component you would like loaded after all other ‘non-lazy’ components have rendered. An easy way to leverage this piece of React tech in most apps is to make the main content area lazy, while leaving the navigation menu and footer alone.

A simple implementation may look something like this:

import React, { Suspense, lazy } from 'react';
import Navbar from './Navbar';
import Footer from './Footer';

const MainContent = lazy(() => import('./MainContent'));

const App = () => (
  <>
    <Navbar />
    <Suspense fallback={<div>Content is loading...</div>}>
      <MainContent />
    </Suspense>
    <Footer />
  </>
);

export default App;

In this example, the main content is loaded after the navbar and footer, feeding crawlers (and the client) some content in a timely manner. The text “Content is loading…” is displayed until the lazy-loaded MainContent bundle has been fetched and rendered.

When implemented properly, this feature can work wonders. It’s not uncommon to see FCP decrease by two seconds or more after tailoring the code for a specific use-case. However, in more complex scenarios, lazy and suspense can become unwieldy and cause more trouble than good. 

For apps whose complexity calls for a more robust solution, developers can opt for the more flexible and robust isomorphic React development architecture.

React URLs | How React Router Handles Site Navigation

When using React to build SPAs, there is only one page sent to the client. From that point on all routing, linking, and state management happen in the browser. This is one reason why SPAs can offer such a seamless user experience, allowing for very smooth page transitions. 

Page-flicker, or the quick “flicker” a user sees when they click an internal link to a new page within the site, can be eliminated when using a React SPA. 

That’s because when a user clicks a navigation link in a React SPA, a new request is not made to the server. Instead, React handles the click, and loads the DOM programmatically with whatever components are associated with the link route. 

For example, the route “/aboutus” wouldn’t initiate a new HTTP request to the server. Instead, React would locate the “aboutus” component, render the component’s JSX into HTML, and load it into the DOM. This is excellent for user-experience, but as we’ve seen it can lead to some serious issues with SEO performance.
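The route-to-component step can be sketched as a simple lookup table. This is a simplified stand-in for what React Router and ReactDOM actually do (the routes and markup are hypothetical, and real components render JSX rather than returning strings):

```javascript
// Sketch: how an SPA maps a clicked route to content without making a
// new HTTP request. Each entry stands in for a React component.
const routes = {
  '/': () => '<h1>Home</h1>',
  '/aboutus': () => '<h1>About Us</h1>',
};

function navigate(path) {
  // In a real SPA this result would be rendered into the DOM,
  // e.g. into the #root element, instead of returned as a string.
  const component = routes[path];
  return component ? component() : '<h1>404</h1>';
}

console.log(navigate('/aboutus')); // <h1>About Us</h1>
```

Because the lookup and render happen entirely in the browser, the server never sees the navigation, which is precisely why crawlers that don't execute JavaScript can miss these "pages."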

React SPAs and Performant SEO | Quick Guide

Effectively, if you have already built (or are planning to build) a search-engine-facing React app, you have the following options:

SSR & Static Site Generation Is the Way to Go

As previously pointed out, search engines like Google can crawl, render, and index JavaScript. However, it takes a lot of additional resources to make this happen.

In addition, other search engines are just starting to dip their toes into dynamically rendering JavaScript, meaning you are leaving traffic from search engines other than Google on the table.

For example, Bing visitors tend to convert well; although Bing sends significantly fewer visitors than Google, it’s worthwhile traffic.

Search engines have SSR and regular old HTML pretty well figured out, so there will be fewer questions about your SEO performance.

Use an existing framework that leverages SSR (server-side rendering) or SSG (static site generation); this will likely eliminate most of your SEO problems.

Frameworks like Gatsby and Next.js already leverage these two technologies. If you are building an app from scratch, they are a worthy option and will likely eliminate a lot of headaches.

You already built a React App? Test Prerendering

If you’ve already invested quite a lot of resources into creating an app and are now realizing that it’s not the most search engine friendly, pre-rendering is an option worth testing. 

In our opinion, it’s more of a patch than a solution you should use for a brand-new application. Our experience thus far is that the above two options are more ideal and produce a better result.

Meaning, if you’ve already built a Javascript application, it’s worth checking out.

Isomorphic React for More Advanced Development Teams

Although it might not be worth the trouble, as you can leverage existing solutions, Isomorphic React is a possible option. This is because you’d be leveraging Node.js to handle much of your rendering on the server side, as opposed to on the client.

In our opinion, the key point for the reader to take home is that frameworks such as GatsbyJs and Next are simply opinionated implementations of isomorphic React. There’s nothing stopping developers from creating their own custom isomorphic implementation (other than the learning curve, of course).

This option is probably best reserved for more experienced developers. 

React and SEO | Overview and Concluding Thoughts

React SPAs are an exciting development in the world of web design. As we’ve seen, SPAs offer flicker-free websites that perform similarly to native applications, allowing for rich user experiences and endless design possibilities for creative designers.

However, Google (and especially other search engines) haven’t quite caught up with this innovative approach, leaving many pages with content that rarely, if ever, gets crawled and indexed. For this reason, developers have come up with many creative solutions to give crawlers content fast, while preserving the comprehensive tooling included in the React framework.

Overall, React SPAs are a great technology that’s certainly worth considering for your next project. 

According to Statista, React is the most used web framework among developers as of 2021. With that kind of popularity, React isn’t leaving the scene anytime soon.


