
JavaScript SEO: Make Sure Your Site Is Indexable

JavaScript can enhance your user experience — but you need to implement it in a way that won’t affect Google’s ability to index your site. Otherwise, the JS elements that make your site attractive and easy to use could lead to poor rankings. Pay attention to the following to safeguard your SEO.

Sep 8, 2023

9 min read

JavaScript is a crucial part of the modern web, bringing webpages to life with interactive features that improve the user experience. But if you’re not careful, JavaScript can slow your website down, cause indexing problems, and hurt your site’s ability to rank. So can you still use JavaScript and rank well? Of course! Here’s what you need to know about SEO for JavaScript so you can better optimize your site.

What Is JavaScript SEO?

JavaScript SEO is the practice of optimizing the JavaScript on a website to maximize the website’s ability to rank in search engines like Google. Since it involves optimizing on-page elements and directly affects technical SEO metrics, JavaScript SEO generally falls into the category of technical SEO.

Is JavaScript Bad for SEO? 

There are plenty of JavaScript sites out there. And while JavaScript offers many benefits, it can also negatively impact SEO. JavaScript isn’t inherently bad for SEO, but when implemented incorrectly, it can make it harder for Googlebot to crawl and index pages. Plus, excessive JavaScript use can increase loading times, directly harming ranking ability and the user browsing experience. 

Not all sites use JavaScript in the same way. While some sites utilize JavaScript in their code here and there, others use JavaScript to power core frameworks and features.

For example, JavaScript frameworks like Angular and React can help developers build web applications more efficiently. These frameworks also require much more extensive and complex JavaScript code than the average website. 

Sites built using an app shell model, where the UI shell and the data it displays are kept separate, require the execution of JavaScript code to display content that’s critical for both visitors and search crawlers. Thus, some sites are more at risk for JavaScript-related SEO problems than others. Sites that depend on JavaScript to load on-page content can experience SEO issues if that content loads properly for visitors but not for search crawlers. Consider the minimal sketch below.
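To make this concrete, here’s a minimal sketch of the pattern (the endpoint and markup are hypothetical): the HTML the server sends contains no real content, and the article text only exists after the browser runs the script.

```html
<!-- What the server sends: an app shell with no real content -->
<div id="app">Loading…</div>
<script>
  // The content only exists after this runs. A crawler that doesn't
  // execute JavaScript sees nothing but "Loading…".
  // (The /api/article/42 endpoint is hypothetical.)
  fetch('/api/article/42')
    .then((res) => res.json())
    .then((article) => {
      document.getElementById('app').innerHTML =
        `<h1>${article.title}</h1><p>${article.body}</p>`;
    });
</script>
```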

How Does Google Handle JavaScript?

Before I dive into optimization, let’s look a little closer at how Google actually handles JavaScript. 

[Image: flowchart showing how Googlebot handles JavaScript]

Google processes JavaScript in three phases: crawling, rendering, and indexing. Googlebot begins by crawling the URLs in its queue, sending a request to the server with a mobile user agent and pulling the HTML from the site. Google has only finite computing resources and can allot only so many to crawling any one site (its crawl budget). To conserve that budget, it processes the HTML first and defers the page’s JavaScript for later by placing it in a render queue.

Rendering allows Googlebot to execute JavaScript code and see what a user would see if they were browsing the site, making it possible for Googlebot to index it properly. When dealing with sites that are heavy on JavaScript — and especially sites that use the app shell model to display critical information in JavaScript — Googlebot must first execute and render the JavaScript code to learn more about the contents of the page. 

This rendering process creates a delay as the JavaScript code gets kicked into the Web Rendering Service queue, where it awaits processing. While this process used to take a long time, Google recently stated that the rendering delay is actually just five seconds on average, with 90% of sites being processed within minutes. Unfortunately, that’s not what SEOs have experienced. One study showed that it took Google nine times longer to crawl JavaScript than HTML. Plus, errors, timeouts, or robots.txt settings can still prevent Googlebot from rendering and indexing a page.

The need to render JavaScript leads Googlebot to index the page in two waves. After using headless Chromium to render the JavaScript, Googlebot crawls the rendered HTML again and adds any newly discovered URLs to its list for further crawling. It then uses the rendered HTML to index the page.

Rendering JavaScript

It’s not just Googlebot that needs to render your pages. Rendering takes the code on your site and turns it into the visual page visitors see in their browsers. Many JavaScript-related indexing issues stem from the type of rendering a site uses to display its content. There are several options for rendering JavaScript pages, and some are friendlier to search bots than others.

Server-Side Rendering

As its name implies, server-side rendering (SSR) happens when the rendering process occurs directly on the server. After rendering, the final HTML web page is then delivered to the browser, where visitors can view it and bots can crawl it. 

Server-side rendering is considered a good choice for SEO because it can reduce content loading times and prevent layout shifts. The server-side approach also helps ensure all of your elements actually render instead of being skipped by client-side technology.

However, server-side rendering can also increase the time it takes for a page to become interactive and accept user input. This is why some sites that rely heavily on JavaScript prefer to use SSR on the pages that matter most for SEO but not on pages where interactive functionality is critical.
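As a rough sketch of the idea (assuming Node.js with Express; the getArticle data helper is hypothetical), the server builds the finished HTML before responding, so the content is present in the very first response:

```js
// Minimal server-side rendering sketch with Express
// (getArticle is a hypothetical data-fetching helper)
const express = require('express');
const app = express();

app.get('/article/:id', async (req, res) => {
  const article = await getArticle(req.params.id); // data fetched on the server
  // The response already contains the content — no client-side JS required
  res.send(`<!DOCTYPE html>
<html><head><title>${article.title}</title></head>
<body><h1>${article.title}</h1><p>${article.body}</p></body></html>`);
});

app.listen(3000);
```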

Client-Side Rendering

Client-side rendering (CSR) shifts the rendering workload off of the server and onto the client (the browser). Instead of receiving the fully rendered HTML directly from the server, the user instead receives some barebones HTML along with a JavaScript file for their own browser to render.

Because the browser itself needs to handle the rendering load, client-side rendering is generally slower than server-side rendering. This can cause obvious SEO issues since page speed is one of many technical SEO signals that Google uses to rank pages. Furthermore, slower load speeds can also increase bounce rate, and while bounce rate may not be a signal itself, a high one could be indicative of a poor browsing experience and frustrated site visitors. If you’re looking to increase site speed, moving away from client-side rendering might not be a bad idea. 

Dynamic Rendering

Dynamic rendering uses both client-side and server-side rendering at different times. Requests coming from browsers will receive the client-side version of the page, while requests coming from bots that may have trouble with JavaScript will get the server-side version. This protects functionality on the most important pages while making it easier for search crawlers to access those that require indexing. 

[Image: warning and error message about JavaScript]

A site with a lot of dynamic content that needs frequent updating and re-indexing may benefit from this more flexible rendering style. However, while dynamic rendering may sound like a solid solution to your rendering problems, it’s not one Google suggests. In fact, the Google Search Central page for JavaScript specifically warns that dynamic rendering is a “workaround” and “not a long-term solution” because of the extra complexity and resource requirements it introduces. That said, it can still serve as a short-term fix when needed.
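If you do use it as a stopgap, the mechanics look roughly like this (a sketch continuing the Express example above; the bot list and the renderPage helper are hypothetical):

```js
// Dynamic rendering sketch: bots get pre-rendered HTML, browsers get the SPA
const BOT_PATTERN = /googlebot|bingbot|baiduspider|duckduckbot/i;

app.use(async (req, res, next) => {
  if (BOT_PATTERN.test(req.get('user-agent') || '')) {
    // Hypothetical helper that returns server-rendered HTML for this URL
    res.send(await renderPage(req.originalUrl));
  } else {
    next(); // regular visitors fall through to the client-side app
  }
});
```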

Static Rendering

Static rendering, also known as pre-rendering, involves generating the HTML content for a page during the build or deployment process rather than at runtime. The pre-rendered HTML files are then served directly to the browser or client upon request.

In static rendering, the server generates the HTML files with all the content and data needed for the page, including dynamic elements. This means that the browser or client receives a fully rendered HTML page without the need for additional processing or JavaScript execution.

The pre-rendered HTML files are easily crawlable by search engine bots, enabling better indexing of the website’s content. Additionally, static rendering can significantly improve page loading times since the content is already present in the HTML file and doesn’t require additional rendering on the client side.
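A pre-rendering step can be as simple as a build script that writes finished HTML files ahead of time (a sketch; loadAllArticles and the output paths are hypothetical):

```js
// Build-time pre-rendering sketch: generate one static HTML file per page
const fs = require('fs');

async function build() {
  const articles = await loadAllArticles(); // hypothetical data source
  fs.mkdirSync('dist', { recursive: true });
  for (const article of articles) {
    const html = `<!DOCTYPE html><html><head><title>${article.title}</title></head>
<body><h1>${article.title}</h1><p>${article.body}</p></body></html>`;
    // Written once at build time, then served as-is at runtime
    fs.writeFileSync(`dist/${article.slug}.html`, html);
  }
}

build();
```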

Which Type of Rendering Is Best for SEO?

Google recommends using server-side rendering, static rendering, or combining client-side and server-side rendering via rehydration (somewhat similar to dynamic rendering). Google doesn’t prohibit client-side rendering, but since it can be more problematic, it’s not exactly preferred. As the amount of JavaScript in the app or on the page grows, it can hurt the page’s Interaction to Next Paint (INP), which will become part of Core Web Vitals in March 2024, when rendered client-side. When it comes to client-side JavaScript, Google advises taking the approach of “serve only what you need, when you need it.”
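In React terms, rehydration means the client script reuses the server-rendered HTML and simply attaches event handlers to it rather than re-creating the markup. A minimal sketch with React 18 (assuming standard JSX build tooling and a hypothetical App component):

```js
// Rehydration sketch (React 18): attach interactivity to server-rendered HTML
import { hydrateRoot } from 'react-dom/client';
import App from './App'; // hypothetical root component

// Reuses the existing server-rendered markup instead of rebuilding it,
// so content is visible before the JavaScript finishes loading
hydrateRoot(document.getElementById('root'), <App />);
```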

Tips for Reducing JavaScript SEO Issues

Making your site’s JavaScript SEO-friendly doesn’t have to be super complicated, but there are several best practices you should follow for great results. Here are a few SEO JavaScript tips to help you and your development team craft a JavaScript strategy that won’t harm your rankings. 

1. Make Sure Google Is Indexing JavaScript Content

Don’t trust that Google will automatically render and index your JavaScript content. Take some time to check for yourself by performing a site search for a specific text string from your page set in quotation marks (site:yourdomain.com “specific text”). If the page appears, you can rest assured that it’s indexed.

You can also use several different Google tools (URL Inspection Tool, Mobile-Friendly Test) and third-party tools (Screaming Frog, JetOctopus) to dig a little deeper and test your JavaScript implementation. Check out the “Testing and Troubleshooting” section at the bottom of this guide to learn more about using these tools to check for JavaScript-related indexation errors. 

Lastly, don’t forget that robots.txt can prevent search crawlers from accessing specific pages. If Google just won’t index a page, make sure the robots.txt file isn’t disallowing it. Google also recommends against using robots.txt to block JavaScript files, as this can affect Googlebot’s ability to properly render on-page content and index the page.
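For example, rules like the following (paths illustrative) would keep Googlebot from fetching the scripts it needs to render your pages, so they should generally be removed:

```
# Problematic pattern: blocks the JavaScript Googlebot needs for rendering
User-agent: *
Disallow: /js/
Disallow: /*.js$
```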

2. Follow On-Page SEO Best Practices

Just because you’re working with JavaScript instead of plain HTML doesn’t mean the on-page SEO process changes. All the usual technical and on-page optimizations (tags, titles, attributes, etc.) are still essential. Google has even suggested that developers avoid using JavaScript to create or manage canonical tags.
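In practice, that means keeping tags like the canonical in the static HTML the server sends rather than injecting them with a script (URL illustrative):

```html
<!-- Preferred: canonical present in the initial HTML -->
<link rel="canonical" href="https://example.com/page/" />

<!-- Risky: the tag only exists if and when rendering succeeds -->
<script>
  const link = document.createElement('link');
  link.rel = 'canonical';
  link.href = 'https://example.com/page/';
  document.head.appendChild(link);
</script>
```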

3. Use Effective Internal Links

Without internal links, search bots can’t find all the pages in your site architecture and will have trouble crawling or ranking them. For JavaScript SEO purposes, it’s best to have links in HTML rather than JavaScript so they can be crawled immediately instead of after rendering. 

If you do use JavaScript to insert links dynamically into your code, make sure you still set them up with proper HTML markup, as in the example below. I also recommend using Google’s URL Inspection Tool to check whether the anchor text is present in the final rendered HTML. Additionally, Google recommends against creating links with JavaScript event handlers or HTML elements like <div> or <span>, as these can cause problems for Googlebot and may prevent it from crawling the link.
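The contrast looks like this (URLs illustrative): even a dynamically inserted link should end up as a real anchor with an href, not a click handler on a generic element.

```html
<!-- Crawlable: a real anchor with an href -->
<a href="/pricing/">Pricing</a>

<!-- Not reliably crawlable: no href for Googlebot to follow -->
<span onclick="window.location = '/pricing/'">Pricing</span>
```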

4. Stay Away From Hashes in URLs

Single-page applications (SPAs) can use fragment URLs to load different views. However, Google advises against relying on hashes in URLs, warning that you shouldn’t count on them to work with Googlebot. Instead, it recommends using the History API to load different content based on the URL, as sketched below.
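With the History API, an SPA can change the path rather than the hash when it swaps views, so each view gets a real, crawlable URL (a sketch; renderView is a hypothetical function that swaps the visible content):

```js
// History API sketch: give each SPA view a real path instead of a #fragment
function showView(path) {
  history.pushState({}, '', path); // e.g. /products/ instead of /#products
  renderView(path);                // hypothetical content-swapping function
}

// Handle back/forward navigation the same way
window.addEventListener('popstate', () => renderView(location.pathname));
```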

5. Use Lazy-Loading for Images

Lazy-loading is the practice of delaying the loading of less-important or non-visible page assets. It’s common for optimizing performance and UX. But if you’re not careful about what you delay and how you do it, you may end up with indexing issues. 

Googlebot doesn’t scroll when looking at content; it just resizes its viewport. This means scripted scroll events may not trigger, and content may not get rendered. Google suggests several different ways to make sure all content on your page is loaded when lazy-loading. 

[Image: diagram detailing content loading processes]

It’s probably best to reserve lazy-loading for your images. Lazy-loading text content is risky since rendering may time out, leaving the content unindexed.
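For images, native lazy-loading is the simplest safe option, since it needs no scroll-event JavaScript at all. If you do need script-driven loading, IntersectionObserver fires without scroll events, so it also works when Googlebot resizes its viewport instead of scrolling (a sketch; file names are illustrative):

```html
<!-- Native lazy-loading: no JavaScript required -->
<img src="photo.jpg" loading="lazy" alt="Product photo" width="800" height="600" />

<script>
  // Script-driven alternative: swap in the real image when it nears the viewport
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src; // load the real image
        observer.unobserve(entry.target);
      }
    });
  });
  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
</script>
```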

6. Fix Duplicate Content

Google states that duplicate content is not grounds for a manual action unless it’s malicious or deceptive in nature. But it can still eat up your crawl budget, delay indexing, and cause your pages to compete with each other for rankings. JavaScript routing tends to create several URLs for the same content, so decide which version you want indexed and apply canonical or noindex tags to the rest.
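Building on the canonical example earlier, each duplicate version can either point at the preferred URL or opt out of indexing entirely in its static head (URL illustrative):

```html
<!-- On duplicate versions: point to the preferred URL -->
<link rel="canonical" href="https://example.com/products/widget/" />

<!-- Or keep a version out of the index entirely -->
<meta name="robots" content="noindex" />
```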

7. Run Regular Site Audits

As the volume and complexity of a page’s JavaScript code grow, it’s important to check that it’s being rendered and indexed properly. Regularly scheduled site audits can help you spot anything you may have missed during your initial round of implementation testing, so don’t forget to make JavaScript SEO part of your regular SEO checklist.

Testing and Troubleshooting

There are several different tools you can use to test whether Google is struggling to index the JavaScript on your website or if your recent Google JavaScript SEO fixes are working. 

Your first stop should be Google’s web tools — specifically the URL Inspection Tool and the Mobile-Friendly Test tool. These tools aren’t perfect, as they generate a version of your page from available resources in real time rather than the cached version the renderer uses. But they can still give you a fairly accurate snapshot of how Google is handling your JavaScript.

The Mobile-Friendly Test Tool allows you to tab between the code on your page and a screenshot of what Google sees, so you can compare the two for JavaScript that may not be executing properly. You can access this feature by clicking “View Tested Page” after the test finishes. Clicking on the “More Info” tab also shows you any potential error messages originating from the JavaScript console and more info about which page resources have failed to load and why. 

Similarly, Google’s URL Inspection Tool provides you with a screenshot of how Googlebot sees your pages so you can visually inspect its elements. It also displays the index status of your pages so you can quickly spot if one of your script-heavy pages hasn’t been indexed and may require attention. 

In addition to these web tools, there are several third-party tools you can use for testing and troubleshooting. Crawler tools like Screaming Frog and JetOctopus can both render JavaScript screenshots of your pages. However, keep in mind that these renderings are not necessarily the same as the ones Googlebot would produce since other crawlers are generating them.
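A quick check you can run alongside these tools: fetch the raw, pre-render HTML yourself and see whether your critical content is present in it (URL and phrase illustrative):

```sh
# Fetch the unrendered HTML — if key content is missing here,
# it only exists after JavaScript runs
curl -s https://example.com/some-page/ | grep -i "critical phrase"
```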

Get Help From Technical SEO Experts

JavaScript SEO has a lot of moving parts. With a dedicated partner, you don’t have to tackle these technical SEO issues alone. Victorious can help you and your dev team ensure your site is properly optimized and your SEO efforts contribute to your business goals. Reach out for a free consultation to learn more.
