Is Your JavaScript Framework Killing Your SEO?

Written by Gabriel Bertolo
May 12, 2026

I’ve seen this happen more times than I can count.

A company spends $50K, $75K, sometimes $100K on a brand new website built on React or Angular. It looks incredible. The animations are smooth. The user experience is polished. Everything works perfectly in the browser.

And six weeks later, they realize Google hasn’t indexed half their pages.

Organic traffic is cratering. Revenue pages that used to rank on page one have vanished. The phone stops ringing. And nobody can figure out why because the site looks perfect when you visit it.

The problem is JavaScript. Specifically, the assumption that search engines process JavaScript the same way your browser does. They don’t. And AI crawlers? They don’t process JavaScript at all.

JavaScript frameworks like React, Angular, and Vue create stunning user experiences. They also create some of the most damaging technical SEO problems I encounter in audits. The technology isn’t the problem. The implementation is.


How Google Actually Processes JavaScript (The Two-Wave Problem)

When you open a website in Chrome, your browser downloads the HTML, immediately executes the JavaScript, and renders the full page in a fraction of a second. Everything happens seamlessly. You see a complete, interactive page.

Googlebot doesn’t work that way. It operates in two completely separate phases.

Wave 1: Googlebot fetches the raw HTML of your page. For server-rendered sites, this HTML contains all your content. Googlebot reads it, follows the links, and processes everything. Done.

For client-side rendered sites? Wave 1 sees an empty shell. A div tag. A reference to a JavaScript bundle. No content. No links. No metadata. Nothing.
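To make the contrast concrete, here is a sketch of what a non-rendering crawler can extract from a CSR shell versus a server-rendered page. The page content is hypothetical, and the text extractor is a deliberately naive approximation of a crawler that never executes JavaScript:

```javascript
// Hypothetical raw HTML as served by a client-side rendered (CSR) app:
// everything of value is locked inside the JS bundle.
const csrShell = `
<!doctype html>
<html>
  <head><title>Loading...</title></head>
  <body>
    <div id="root"></div>
    <script src="/static/bundle.js"></script>
  </body>
</html>`;

// The same page as a server-rendered response: content is in the HTML itself.
const ssrPage = `
<!doctype html>
<html>
  <head><title>Emergency Plumbing in Austin | Acme Plumbing</title></head>
  <body>
    <div id="root">
      <h1>Emergency Plumbing in Austin</h1>
      <p>24/7 service, licensed and insured.</p>
      <a href="/pricing">See our pricing</a>
    </div>
  </body>
</html>`;

// Naive approximation of what a non-rendering crawler "sees":
// drop the scripts (they never run) and the markup, keep the visible text.
function wave1Text(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
}

console.log(wave1Text(csrShell)); // → "Loading..."
console.log(wave1Text(ssrPage)); // includes the h1, the body copy, and the link text
```

The CSR shell yields a single word of placeholder text; the server-rendered version yields everything a crawler needs on the first fetch.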

Wave 2: Googlebot queues the page for rendering with a headless version of Chromium. This is when JavaScript actually gets executed and the full page content appears.

The problem is the gap between Wave 1 and Wave 2.


Google’s own documentation states that pages “may stay on this queue for a few seconds, but it can take longer than that” before rendering resources are available. Source: Google Search Central


“Can take longer than that” is doing a lot of heavy lifting. In practice, that delay can range from minutes to days for large or JavaScript-heavy sites. And during that entire delay, your content doesn’t exist as far as Google is concerned.

Here’s the scale of the problem.


Rendering JavaScript is approximately 100x more resource-intensive than processing static HTML. Source: Conductor Academy, cited by ighenatt.es


Google has limited rendering resources. They prioritize accordingly. If your site is competing for rendering queue time against millions of other JavaScript-heavy sites, your content may sit in that queue for a long time. And for time-sensitive content or newly launched pages, that delay has a direct cost in visibility.

As of December 2025, Google also clarified that pages returning non-200 HTTP status codes may be excluded from the rendering queue entirely. If your single-page application serves a 200 status for everything and then renders “page not found” via JavaScript, Google might index that error state as valid content. If it serves a proper 404 header but relies on client-side JavaScript to show helpful content, Google may never render that content at all.
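Here is a minimal sketch of that difference, using a hypothetical route table. The SPA catch-all pattern produces soft 404s; the crawl-safe version puts the real status code in the HTTP header with helpful content already in the HTML:

```javascript
// Hypothetical route table for a small site; a real app would match patterns.
const knownRoutes = new Set(['/', '/services', '/pricing', '/blog']);

// The SPA catch-all anti-pattern: every path gets a 200, and the
// "not found" screen is painted later by JavaScript. Google may index
// that error state as valid content.
function softSpaResponse(path) {
  return { status: 200, body: '<div id="root"></div>' }; // JS decides later
}

// The crawl-safe version: unknown paths get a real 404 in the HTTP
// header, with the helpful content already in the server HTML.
function hardStatusResponse(path) {
  if (knownRoutes.has(path)) {
    return { status: 200, body: '<h1>Page content</h1>' };
  }
  return { status: 404, body: '<h1>Page not found</h1><a href="/">Home</a>' };
}

console.log(softSpaResponse('/no-such-page').status); // 200 — a soft 404
console.log(hardStatusResponse('/no-such-page').status); // 404 — what crawlers need
```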


AI Crawlers Can’t Execute JavaScript. At All.

This is the part that should worry you the most.


Testing of major AI crawlers including GPTBot (ChatGPT), ClaudeBot, and PerplexityBot confirms that none of them render client-side JavaScript content. Source: Discovered Labs


None of them. Zero. If your site depends on JavaScript to load its main content, it is completely invisible to AI search platforms.

Think about what that means. You could have the best content in your industry. You could be the most authoritative source on your topic. But if that content is locked behind client-side rendering, ChatGPT will never see it. Google AI Overviews will never cite it. Perplexity will never reference it.

This is a massive issue for Generative Engine Optimization. AI visibility is becoming a critical channel for buyer research and discovery. Businesses that are invisible to AI crawlers are leaving an increasingly large portion of potential customers on the table.

We’ve documented how we prepare websites for AI search visibility, and JavaScript rendering is one of the first things we check. If the foundation is broken, nothing else we do on the AI visibility side will matter.


The Three Rendering Approaches (And Which One to Use)

Not all JavaScript implementations are created equal. The rendering approach you choose determines whether search engines and AI platforms can access your content.


Client-Side Rendering (CSR)

The browser does all the work. The server sends an empty HTML shell with a reference to your JavaScript bundle. The browser downloads the bundle, executes it, and builds the entire page.

This is fine for authenticated dashboards, internal tools, and applications where SEO doesn’t matter. If nobody needs to find the page through search, CSR is a perfectly valid choice.

For any public-facing content you want indexed and visible to AI search? CSR is a problem. Googlebot Wave 1 sees nothing. AI crawlers see nothing. You’re entirely dependent on Google’s rendering queue to eventually process your content.


Server-Side Rendering (SSR)

JavaScript executes on the server before the page is sent to the browser. The result is fully rendered HTML that arrives complete.

Googlebot Wave 1 sees everything immediately. All your content, all your links, all your metadata. AI crawlers see everything immediately. No rendering queue. No delays. No gaps.

SSR is the gold standard for JavaScript SEO. If your site uses React, Next.js handles SSR. If you use Vue, Nuxt.js is your answer. For Angular, there’s Angular Universal.
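Frameworks like Next.js handle this for you, but the underlying idea is simple enough to sketch by hand. This is not Next.js code, just an illustration of the principle; the page data and URLs are hypothetical:

```javascript
// SSR in miniature: the server builds the complete HTML document,
// including the metadata, before anything reaches the browser or crawler.
function renderServicePage(service) {
  return `<!doctype html>
<html>
  <head>
    <title>${service.title}</title>
    <meta name="description" content="${service.description}">
    <link rel="canonical" href="https://example.com${service.path}">
  </head>
  <body>
    <h1>${service.heading}</h1>
    <p>${service.intro}</p>
  </body>
</html>`;
}

const html = renderServicePage({
  path: '/services/drain-cleaning',
  title: 'Drain Cleaning | Acme Plumbing',
  description: 'Fast, guaranteed drain cleaning.',
  heading: 'Drain Cleaning',
  intro: 'Same-day service across the metro area.',
});

// Wave 1 — and every AI crawler — gets the finished document.
console.log(html.includes('<h1>Drain Cleaning</h1>')); // true
```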


Static Site Generation (SSG)

Pages are pre-built at deploy time into static HTML files. No server-side processing needed. No JavaScript rendering needed. The HTML is ready to serve instantly.

SSG delivers the fastest crawl speed, zero rendering queue dependency, and the best Core Web Vitals performance. It’s ideal for blogs, marketing pages, documentation, and any content that doesn’t change per user.

For most small business websites, SSG is actually the best approach. Your service pages don’t change dynamically. Your blog posts don’t need real-time data. Static HTML serves them faster and more reliably than any JavaScript framework.


Common JavaScript SEO Problems I Find in Audits

These are the issues that show up over and over again. If your site runs on a JavaScript framework, check for every one of these.


Content Invisible to Crawlers

The most common and most damaging problem. Your product descriptions, service information, blog posts, pricing, and other critical content loads via API calls after the initial page load. Googlebot Wave 1 sees an empty page. AI crawlers see an empty page.

The content exists. It works perfectly for users. But search engines never see it.


Metadata Set by JavaScript

Your title tags, meta descriptions, and canonical tags must be in the server-rendered HTML. If JavaScript sets them after page load, Google Wave 1 doesn’t see them. This means Google is indexing your pages with missing or default metadata.

I’ve audited sites where every single page had the same title tag because the default title was hardcoded in the HTML and the JavaScript that was supposed to set unique titles ran too late for Googlebot to catch.
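You can catch this class of problem with a small audit script that groups URLs by the title found in the raw server HTML. The page snapshots below are hypothetical; in practice you would fetch each URL without executing JavaScript:

```javascript
// Hypothetical raw-HTML snapshots of three pages, as fetched without
// executing JavaScript (roughly what Googlebot's first wave works from).
const rawPages = {
  '/': '<head><title>My Company</title></head>',
  '/services': '<head><title>My Company</title></head>', // JS was supposed to fix this
  '/about': '<head><title>About Us | My Company</title></head>',
};

// Group URLs by the title present in the *server* HTML; any group with
// more than one URL is a duplicate-title problem Google can actually index.
function findDuplicateTitles(pages) {
  const byTitle = {};
  for (const [url, html] of Object.entries(pages)) {
    const match = html.match(/<title>(.*?)<\/title>/i);
    const title = match ? match[1] : '(missing)';
    (byTitle[title] ??= []).push(url);
  }
  return Object.entries(byTitle).filter(([, urls]) => urls.length > 1);
}

console.log(findDuplicateTitles(rawPages));
// → [ [ 'My Company', [ '/', '/services' ] ] ]
```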


Internal Links Generated by JavaScript

Navigation menus, footer links, and in-content links generated by JavaScript may not be visible to crawlers during Wave 1. This doesn’t just hurt individual pages. It breaks your entire internal linking strategy, which disrupts how authority flows through your site.


Infinite Scroll and “Load More” Buttons

Content behind “load more” buttons or infinite scroll patterns never appears in the initial HTML. Google won’t click your buttons. Content that requires user interaction to appear will not be indexed.

Use standard pagination with crawlable URLs instead. Or ensure the content is server-rendered on initial load.
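Crawlable pagination just means every page of results lives at its own URL inside a plain anchor tag, so crawlers can follow it without clicking anything. A sketch, with illustrative paths and page sizes:

```javascript
// Generate plain <a href> pagination links for a listing page.
// basePath and pageSize are illustrative values.
function paginationLinks(basePath, totalItems, pageSize) {
  const pageCount = Math.max(1, Math.ceil(totalItems / pageSize));
  const links = [];
  for (let page = 1; page <= pageCount; page++) {
    // Page 1 stays at the canonical base URL; later pages get crawlable URLs.
    const href = page === 1 ? basePath : `${basePath}?page=${page}`;
    links.push(`<a href="${href}">Page ${page}</a>`);
  }
  return links;
}

console.log(paginationLinks('/blog', 45, 10));
// 5 links: /blog, /blog?page=2, ... /blog?page=5 — no buttons, no scripts
```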


Soft 404 Errors

This happens when a single-page application serves a 200 HTTP status code for every URL and then renders “page not found” via JavaScript. Google may index these error pages as valid content, and your Search Console gets cluttered with indexed pages that show empty or error states to users.


How to Diagnose JavaScript SEO Problems

You don’t need expensive tools to find out if JavaScript is hurting your SEO. Start with these.

View Source vs Inspect Element. Right-click on your page and select “View Page Source.” This shows you the raw HTML that Googlebot sees during Wave 1. Then right-click and select “Inspect.” This shows you the fully rendered page after JavaScript executes. If critical content appears in Inspect but not in View Source, you have a rendering problem.
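If you want to automate that comparison, the check reduces to a few lines: given the raw server HTML and the rendered DOM markup (for example, copied out of DevTools), list the key phrases that only exist after JavaScript runs. The sample markup here is hypothetical:

```javascript
// Return the phrases that appear in the rendered page but are missing
// from the raw server HTML — i.e., content that depends on JS rendering.
function missingFromRaw(rawHtml, renderedHtml, phrases) {
  return phrases.filter(
    (p) => renderedHtml.includes(p) && !rawHtml.includes(p)
  );
}

const raw = '<div id="root"></div>'; // what "View Page Source" shows
const rendered = '<div id="root"><h1>Our Services</h1><p>Call today.</p></div>'; // what "Inspect" shows

console.log(missingFromRaw(raw, rendered, ['Our Services', 'Call today.']));
// → [ 'Our Services', 'Call today.' ] — both depend on JavaScript rendering
```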

Google Search Console URL Inspection. Enter any URL and click “Test Live URL.” Search Console will show you both the raw HTML and the rendered HTML. Compare them. If important content, links, or metadata are missing from the raw HTML, that content depends on JavaScript rendering.

Screaming Frog with JavaScript Rendering. Crawl your site with JavaScript rendering enabled and compare it to a crawl without rendering. The differences reveal exactly what content is JavaScript-dependent.

Search Console Coverage Report. Look for indexing errors and for pages flagged as “Discovered – currently not indexed” or “Crawled – currently not indexed.” JavaScript rendering issues often show up here as pages that Google found but couldn’t process.

We walk through this entire diagnostic process in our technical SEO audit checklist.


How to Fix It

The fix depends on where you are in the lifecycle of your site.

If you’re building a new site: Choose SSR or SSG from the start. This is the single most important architectural decision you’ll make. Retrofitting server-side rendering into a pure client-side rendered application is significantly harder and more expensive than building it right from day one.

This is something we discuss extensively in our web design process. SEO architecture is part of discovery, not an afterthought bolted on after launch.

If you have an existing CSR site: The most reliable fix for a React application is migrating to Next.js with server-side rendering. For Vue, migrate to Nuxt.js. This gives you SSR and SSG capabilities without rebuilding from scratch.

Interim solution: Dynamic rendering detects the requesting user-agent and serves pre-rendered HTML to crawlers while serving the normal JavaScript app to users. Google calls this a “bridge solution,” not a best practice. Use it as a stopgap while you migrate to SSR or SSG.
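A dynamic rendering setup hinges on one decision per request: is this a known crawler? Here is a sketch of that branch. The bot list is illustrative, not exhaustive, and the response labels are placeholders for a prerendered snapshot and the normal app shell:

```javascript
// Illustrative (not exhaustive) list of crawler user-agent signatures.
const BOT_SIGNATURES = [
  'googlebot', 'bingbot', 'gptbot', 'claudebot', 'perplexitybot',
];

function isKnownCrawler(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return BOT_SIGNATURES.some((sig) => ua.includes(sig));
}

// Crawlers get a static snapshot of the fully rendered page;
// real browsers get the normal client-side app.
function chooseResponse(userAgent) {
  return isKnownCrawler(userAgent) ? 'prerendered-html' : 'spa-shell';
}

console.log(chooseResponse('Mozilla/5.0 (compatible; Googlebot/2.1)')); // prerendered-html
console.log(chooseResponse('Mozilla/5.0 (Windows NT 10.0) Chrome/120')); // spa-shell
```

User-agent sniffing is fragile, which is part of why this is a stopgap: new crawlers appear, signatures change, and the serving logic becomes one more thing to maintain until SSR or SSG makes it unnecessary.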

Regardless of approach: Get your metadata into server-rendered HTML. Audit every page for content that’s invisible in the raw source. Fix internal links that depend on JavaScript. Test with Google Search Console’s URL Inspection tool.

And whatever you do, don’t let a website redesign destroy your SEO. I’ve written about this specifically because it happens so often. A redesign that moves from a properly rendered site to a JavaScript-heavy framework without SEO considerations can wipe out years of organic equity overnight.

The cost of a cheap website isn’t just bad design. It’s bad architecture that quietly kills your search visibility while looking beautiful on the surface.


Frequently Asked Questions

Can Google render JavaScript?

Yes, Google can render JavaScript using its Web Rendering Service, which runs a headless version of Chromium. But rendering happens in a separate queue after the initial crawl, and the delay can range from seconds to days. Content that depends on JavaScript rendering is not guaranteed to be indexed quickly or completely.

Is React bad for SEO?

React itself isn’t bad for SEO. Client-side rendered React applications are bad for SEO. If you use React with Next.js and implement server-side rendering or static site generation, React works perfectly well for SEO. The framework isn’t the problem. The rendering strategy is.

What is the best rendering approach for SEO?

Server-side rendering (SSR) or static site generation (SSG). Both deliver complete HTML to crawlers on the first request, eliminating the two-wave indexing problem entirely. SSG is ideal for content that doesn’t change per user. SSR is better for dynamic, personalized content.

Do AI search engines execute JavaScript?

No. Testing confirms that GPTBot (ChatGPT), ClaudeBot, and PerplexityBot do not execute JavaScript. If your content loads via client-side rendering, it’s invisible to AI search platforms.

How do I check if JavaScript is causing SEO problems?

Compare “View Page Source” (raw HTML) to “Inspect Element” (rendered page) in your browser. If important content appears in Inspect but not in View Source, you have a JavaScript rendering problem. Google Search Console’s URL Inspection tool also shows you exactly what Google sees.


Not Sure If JavaScript Is Hurting Your Site?

If you’re running a JavaScript framework and you’re not confident that your rendering strategy is search-engine friendly, that uncertainty alone is a reason to get an audit.

Schedule a technical SEO review, and I’ll show you exactly what Google and AI crawlers see when they visit your site. If everything is fine, you’ll have peace of mind. If it’s not, you’ll have a clear plan to fix it before the damage compounds.

For more on how technical SEO fits into a broader search strategy, read our guide to SEO for small and mid-sized businesses.


Gabriel Bertolo

Gabriel Bertolo is a third-generation entrepreneur who founded Radiant Elephant over 13 years ago after working for various advertising and marketing agencies.

He is also an award-winning Jazz/Funk drummer and composer, as well as a visual artist.

His Web Design, SEO, and Marketing insights have been quoted in Forbes, Business Insider, HubSpot, Entrepreneur, Shopify, MECLABS, and more.

Check out some publications he's been quoted in:

Quoted in HubSpot's AI Search Visibility Article and HubSpot's Article on 6 Best Wix Alternatives

Quoted in DesignRush Dental Marketing Guide 

Quoted in MECLABS 

Quoted in DataBox Website Optimization Article and DataBox Best SEO Blogs

Quoted in Seoptimer

Quoted in Shopify Blog 
