Google Crawler Javascript: Boost your Search Engine SEO Efforts In & Out of the DOM

Oh jeez… You built (or are planning) a Single Page Application, React for SEO, or some other fancy-pants JavaScript framework, and now you need to boost your overall SEO efforts. That means understanding how the Google crawler and JavaScript work together.

Good news, friend: you are in the right place. In this article, we are going to teach you how to identify and diagnose the most common JavaScript SEO issues we come across.

Will A Search Engine Rank Your Javascript Website?

Quick update: after reading a few articles kicking around the web, my conclusion is that the majority of authors on this subject have never actually ranked or gained organic traffic with a JavaScript-based content delivery system or CMS.

Many of the other articles on this subject answer very general questions:

  • Will Google index my JavaScript website?
  • Will a search engine respect a JS-based title tag or meta data?
  • What about content? Will Google be able to index an h1, a <p>, or any other content-bearing HTML tag?

The answer is yes and there are plenty of case studies to prove it.

However, there is one question basically no one answers:

Will Google (or any search bot, for that matter) prioritize and rank your JavaScript website?

This is the question everyone should be asking. Who cares if your site is indexed if no one visits it?

Does Google Execute Javascript in the DOM?

Yes, it does. However, it’s not always the best approach to rely on rendering content in the DOM.

It’s a multi-step process. Here is a screenshot of Google’s dynamic rendering example, with some of our own editing:

Can you name a single programming example where it is ideal to add extra steps? When you rely on Googlebot to render JavaScript, that is exactly what you are doing.

Does the Google crawler pick up JavaScript updates?

Yup. However, according to our testing, it’s not as frequent as with server-side rendering. It’s also been reported that during major updates Google turns off its JavaScript rendering altogether.

Meaning your marketing team could be making important updates to content, and Google simply ignores them for a month or two until JavaScript rendering is back online.

Use Google Chrome for Troubleshooting Javascript

Believe it or not, Google Chrome is one of the best tools for troubleshooting and identifying JavaScript crawler issues.

Right-click the page and choose Inspect to open DevTools, open the Network conditions panel, then change the user-agent to Googlebot. Remember to reload the page once these settings are applied.
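If you prefer the command line, you can run the same comparison from Node (18+, for the built-in fetch API). This is an illustrative helper, not an official tool; the user-agent string below is Google’s documented Googlebot desktop UA:

```javascript
// Fetch a page's raw HTML the way Googlebot would request it, so you
// can diff it against what Chrome renders after JavaScript runs.
const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

function googlebotHeaders() {
  return { "User-Agent": GOOGLEBOT_UA };
}

// Requires Node 18+ for the built-in fetch API.
async function fetchAsGooglebot(url) {
  const res = await fetch(url, { headers: googlebotHeaders() });
  return res.text(); // raw HTML, before any client-side JS executes
}
```

Comparing this output against the DOM in DevTools shows you exactly which content only exists after JavaScript runs.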

Here are the settings and what you have to look for:

When you crawl your site as Googlebot, you can clearly see how the page is rendered after Google crawls it:

Without GoogleBot Enabled

With GoogleBot Enabled

Why is this a JavaScript crawling issue?

During our JavaScript SEO audits, we’ve noticed many developers and digital marketing teams look at this and say, “hey, if xxx huge website is doing it, why can’t we?”

Google (kinda) Cannot Crawl The Inner Pages Of the Menu

Given the size of the website in the photo above, we can’t say why they chose to make these technical choices for search engines. After all, they have millions, if not billions, of pages.

However, for the average website this is less than ideal. Google cannot crawl all the extra menu options, simply because they are not in the code being served when Googlebot crawls the website.

In the past, we’ve had developers tank organic search by changing the menu in this manner.

This is just one example; we strongly advise reviewing the key pages of your JavaScript app to see if the content fully renders as expected.
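One way to run that review is to pull the links out of the raw served HTML and compare them against the menu you expect Googlebot to see. A minimal sketch (a regex, not a real HTML parser, so treat it as a quick check only):

```javascript
// Quick sanity check: extract href values from raw server-served HTML
// so you can compare them against the menu links you expect crawlers
// to see. A regex is fragile on real-world HTML; this is a sketch.
function extractLinks(html) {
  const links = [];
  const re = /<a\s[^>]*href="([^"]+)"/gi;
  let match;
  while ((match = re.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

// If the menu is injected client-side, the raw HTML will be missing it:
const serverHtml = '<nav><a href="/home">Home</a></nav>';
const links = extractLinks(serverHtml);
// links → ["/home"]; a "/categories" link injected by JS would not appear
```

If links that exist in your rendered menu are missing from this list, Googlebot is being served a page without them.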

Google Has a Hard Time Transferring Search Engine Authority with Javascript

According to our SEO testing, if Page 1 links to Page 2, Page 2 will not benefit as much. With server-side rendering, Page 2 benefits dramatically more.

SEOs refer to this as “link juice.” It’s a rather tacky term for authority transferring from one page to another.

Like all technology, this can change in the future. For now, you are better off rendering your important content zones server-side, even with JavaScript (i.e., Node.js-type frameworks).

You do not have an Authoritative Website

As a disclaimer, we do not know the overall search engine strategy large scale websites such as eBay are using. For example, it might be beneficial for them to limit their menu item links for reasons we didn’t take into account.

That said, you are not comparing apples to apples; eBay is a super-authoritative website. At the time we wrote this article, they had an Ahrefs Domain Rating of 93, whereas our average client ranges anywhere from DR10 to DR70 (we work on a few above that, but you get the idea).

What we are saying is that, due to years and years of backlinking, eBay can get away with things the average JavaScript-based website simply cannot. In many situations, they may be relying on authority, not highly optimized pages, to rank in search engines.

What we often see is startups copying large-scale websites. Just because the big boys (and gals) are doing it doesn’t mean you can too. Simply put, just because an enterprise-level website is pursuing a certain JavaScript search engine strategy doesn’t mean it is ideal for you.

Are There any other ways to render as Google?

Yes. You can use Search Console; generally, we’ll use both the Google Chrome method above and Search Console.

Follow the steps below in Search Console: enter the URL, run a live test, and click ‘view tested page.’


Search Console will also give you a number of insights; for example, if there are any other errors, they will likely show up here.

What Javascript frameworks does this apply to?

Frankly, any JavaScript framework that renders in the DOM.

Including, but not limited to:

  • jQuery
  • React JS
  • Vue JS
  • Angularjs
  • Ember
  • Any framework that renders in the DOM

Should you avoid these frameworks altogether?

No, that would be ridiculous and not realistic in modern web development.

Rather, they should be used strategically. Anything search-engine-facing should ideally render server-side, for example with Node or frameworks such as Next.js or Gatsby.

In the case of React, Vue, etc., if there is no server-side rendering alternative, use them for marketing elements that do not directly impact search engine traffic. For example, we have a client who uses a ‘follow-you-around’ style bar at the top of their website that feeds directly into the CRM.

In this scenario, the questionable JavaScript isn’t rendering content of high ranking importance; rather, it takes a backseat and focuses only on marketing.

What should be your goal with front end focused Javascript for Crawling?

Generally, we recommend trying to create a 1:1 ratio with HTML. Meaning, any code you want search engines to crawl should ideally perform the same as HTML would.

To date, in every single one of our tests, without exception, standard HTML websites have outperformed JavaScript websites.

Common Questions we couldn’t quite fit into this article:

Some common questions we get that didn’t quite fit anywhere else…

What about Prerendering?

This is certainly an alternative. Generally, we recommend prerendering middleware such as Puppeteer, Prerender.io, or Rendertron.

Our opinion is basically that it’s a good option if you’ve already created a JS-based website. However, from our tests to date, it doesn’t perform as well as standard HTML. It’s worth testing, but we often prefer other strategies.
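To illustrate the idea (not any particular product’s API), prerendering middleware boils down to: detect bot user-agents, serve them a cached HTML snapshot, and let everyone else load the normal JS app. A hedged sketch, with an Express-style middleware signature and an illustrative bot list:

```javascript
// Sketch of the core idea behind prerendering middleware. The bot
// list and getSnapshot callback are illustrative, not a real product.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /duckduckbot/i, /baiduspider/i];

function isSearchBot(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ""));
}

// Express-style middleware shape: (req, res, next)
function prerenderMiddleware(getSnapshot) {
  return (req, res, next) => {
    if (isSearchBot(req.headers["user-agent"])) {
      res.end(getSnapshot(req.url)); // bots get the pre-rendered HTML snapshot
    } else {
      next(); // regular visitors fall through to the normal client-side app
    }
  };
}
```

Services like Prerender.io typically generate that snapshot with a headless browser such as Puppeteer and cache it.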

Is structured data important for your web page and Google?

Yes. However, it’s an entirely different conversation: structured data is important for search engine organic traffic, and JSON (JavaScript Object Notation) is JavaScript. Meaning, you should view your structured data efforts and having Google crawl your JavaScript as two different issues.

As far as our testing goes, search engines such as Google have no issues crawling structured data.
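A quick illustration of why JSON-LD sidesteps the rendering question: it is a data blob inside a `<script type="application/ld+json">` tag, not code Google has to execute. A minimal sketch with hypothetical article values:

```javascript
// Minimal JSON-LD sketch for a hypothetical article. The object is
// serialized into a <script type="application/ld+json"> tag; it is
// data, not executable JavaScript, so no rendering step is involved.
const articleSchema = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "Google Crawler Javascript",                      // hypothetical
  author: { "@type": "Organization", name: "Example Agency" }, // hypothetical
  datePublished: "2021-01-01",
};

function jsonLdTag(schema) {
  return `<script type="application/ld+json">${JSON.stringify(schema)}</script>`;
}
```

The resulting tag can be emitted in the server-rendered HTML head, where crawlers read it as plain markup.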

Can Google Crawlers index JavaScript?

Yes. However, as we discussed previously, crawlers indexing JavaScript and outperforming a competing web page are two different topics.

You can easily get JavaScript indexed; however, if it is not properly configured, you will likely not outrank competing websites that enable server-side rendering.

I’m not a techie. What is the DOM and why does it matter for JavaScript?

In short, server-side rendering and client-side rendering are two different things.

How do Search Engines view Client-side Rendering? (In the DOM)

Client-side rendering happens when you open a page in Chrome, Safari, or Opera; in other words, anything rendered right on your computer. Generally speaking, this is the environment where JavaScript lives and breathes.

In our opinion, this is where many search engine roadblocks come up. After all, you are asking search engines to take an extra step: first they must visit your website, then their crawlers have to render the JavaScript on the page. Only once this process is done will they understand what your web page is about.

How do Search Engines view Server-side Rendering? (Not In the DOM)

To give an easy example, platforms like WordPress render server-side.

In this use case, there are no extra steps: Googlebot comes to your website, crawls it, and reads everything as plain HTML.

No extra steps, no guessing at how Google is indexing your Javascript.

This is the advantage. You have a very clear picture of how Search Engines are crawling your web page.

Javascript Crawling: Takeaways & Final Thoughts

Crawling JavaScript has its issues; it is, however, possible.

Here are a few thoughts to take into account whenever you are working on a JavaScript-based application:

  • Can you create a 1:1 ratio with HTML? Meaning, can you configure your JavaScript web page so crawlers treat it the same way Googlebot would crawl HTML?
  • Use Google Chrome & Search Console to troubleshoot any crawling issues or search engine roadblocks.
  • Are competing websites that are built in JavaScript ranking because they use JavaScript, or because they are hyper-authoritative? If your Ahrefs DR is low and a competing website has a DR of 90, they likely have a huge advantage. Many web developers believe they have a configuration issue, whereas the competing website is simply more authoritative in the eyes of Google, Bing, etc.

Still stuck? Reach out to us for help.
