In this article we will discuss how to simulate Googlebot. We’ll review two similar methods to analyze any site’s performance and search engine optimization from the perspective of how Google interprets your site specifically. These methods can also render pages as other devices and crawlers, but we recommend Googlebot since Google is widely used and has extremely advanced crawlers.
Along the way, we also check how server-side rendering affects what the crawler sees.
What are the advantages of simulating and viewing as Googlebot?
Simulating Googlebot gives anybody with the right experience the opportunity to solve SEO problems much more easily, and lets newcomers see how Googlebot interacts with the sites it crawls.
Another benefit is having concrete evidence or data to support important decisions about changing crucial parts of an application. This makes it much easier to convince a team of developers, clients, or other stakeholders of the root cause of a search engine optimization problem and get it fixed.
It is often extremely hard to explain or find evidence for common SEO issues because crawlers like Google and Bing are a bit of a black box when they crawl JavaScript-heavy sites. Compared to standard HTML websites (i.e., websites that render on the server), JavaScript sites are harder to interpret and diagnose for SEO roadblocks.
However, using the methods we will show in this article, you’ll learn how to simulate Googlebot and diagnose these types of websites for SEO issues.
Googlebot Simulation & Rendering Tutorial
Method 1: Fetch With Chrome
This method lets you change the device or user agent loading any website directly within Google Chrome, to understand how smartphones, different browsers, and crawlers process and load content on your site.
We will specifically use the Googlebot options, which include both a desktop variant and a smartphone variant.
How to Fetch as Googlebot
- Navigate to your desired website, and right-click anywhere within the page content
- Proceed by selecting “Inspect”, sometimes denoted as “Inspect Element”
- This will reveal a panel with multiple tabs that let you explore the components and behind-the-scenes interactions of your site
- For this exercise, we are focusing on the “Network” tab, so you can ignore the rest and select it from the options at the top of the panel
- Within the “Network” tab, a second row of options will appear directly below it
- Select the button that looks like a Wi-Fi signal icon
- This will change the options available at the bottom of the window
- Find the “User agent” section within the bottom window and uncheck the “Use browser default” box
- This gives you the option to choose which user agent (i.e., which browser or crawler) loads the site
- Switch between the different Googlebot options, both Smartphone and Desktop
- Once you have played around with the devices, reload the page to troubleshoot any issues and get a better understanding of what is being sent to the browser
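The user-agent override above can also be reproduced outside the browser. Below is a minimal Python sketch, assuming a placeholder URL; the user-agent string follows the format Google documents for its smartphone crawler, with `W.X.Y.Z` standing in for the Chrome version exactly as Google’s documentation writes it:

```python
# Sketch: fetch a page while identifying as Googlebot's smartphone crawler.
# "W.X.Y.Z" is a version placeholder, matching how Google documents the string.
import urllib.request

GOOGLEBOT_SMARTPHONE = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/W.X.Y.Z Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def fetch_as_googlebot(url, user_agent=GOOGLEBOT_SMARTPHONE):
    """Return the raw HTML the server sends when the request
    carries a Googlebot user-agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

Comparing the HTML this returns against what you see in a normal browser is a quick way to spot servers that serve different content to crawlers.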
What are the advantages of this method?
Every time you click a link and make another request to the site, you can see exactly how Googlebot interacts with and loads the website. This should give you an idea of which components are rendered on the client and which on the server, so that further decision-making can follow.
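One way to turn that client-vs-server observation into something checkable: if text you can see in the rendered page is missing from the raw HTML response, it was almost certainly rendered client-side by JavaScript. A rough Python sketch, with invented sample markup for illustration:

```python
def is_server_rendered(raw_html: str, visible_text: str) -> bool:
    """Heuristic: True if text visible in the browser already exists in
    the raw HTML the server sent (i.e., it did not need JavaScript)."""
    return visible_text.lower() in raw_html.lower()

# Invented examples: a server-rendered page vs. an empty JS app shell.
server_page = "<html><body><h1>Product catalog</h1></body></html>"
spa_shell = "<html><body><div id='root'></div></body></html>"

print(is_server_rendered(server_page, "Product catalog"))  # True
print(is_server_rendered(spa_shell, "Product catalog"))    # False
```

This is only a heuristic (it ignores markup inside the text and dynamically injected metadata), but it is often enough to flag pages worth a closer look.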
Method 2: Use Screaming Frog to Crawl as Googlebot
This method is essentially the same as the first, but Screaming Frog gives us an interface that shows the actual data Googlebot processes, in a table of page results from the site.
Screaming Frog is a website crawler that extracts data and audits for common SEO issues. It is rather simple to use, though the results can be a bit harder to interpret.
Start by entering your website of choice in the “URL” box at the top; as the web crawler does its thing, you will be presented with a list of page results from your website along with other attributes.
You even have the ability to change the User-Agent by:
- Selecting “Configuration” in the toolbar at the top
- Then selecting “User-Agent”
- And proceeding by changing the user agent in the popup window
You can play around with this a bit more to get an understanding of the app.
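To get a feel for what a crawler like Screaming Frog builds its results table from, here is a minimal Python sketch of the crawl-and-tabulate loop. The `fetch` callable is injected so the sketch stays self-contained; the stub fetcher and example URLs are invented, and a real run would plug in an HTTP client that sends the Googlebot user agent:

```python
def crawl(urls, fetch):
    """Fetch each URL and tabulate its status code -- roughly what a
    crawler's results table is built from. `fetch(url)` should return
    an HTTP status code, or raise on network failure."""
    results = {}
    for url in urls:
        try:
            results[url] = fetch(url)
        except Exception:
            results[url] = None  # unreachable page
    return results

# Stub fetcher standing in for a real HTTP client (invented behavior).
def fake_fetch(url):
    return 404 if url.endswith("/missing") else 200

print(crawl(["https://example.com/", "https://example.com/missing"], fake_fetch))
# {'https://example.com/': 200, 'https://example.com/missing': 404}
```

A real crawler also discovers new URLs from each page’s links and records far more attributes per page, but the tabulation idea is the same.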
What are the advantages of this method?
This method lets you examine certain attributes of the pages within your site to further inform decisions about changes to an application. For example, you can look at the order in which the crawler loaded your site’s pages, infer that the ones at the top were more accessible for some reason than the ones at the bottom, and do some more digging from there.
Final Thoughts
The main advantage of crawling as Googlebot is that you get to visually experience exactly how Google requests and loads any website on the internet, and gain a deeper understanding of what search engine optimization really involves.