Does Google really render JavaScript anymore?

Yes, but not instantly. I ran tests in October 2024 using Search Console's URL Inspection tool and log-file analysis. Googlebot crawls the HTML first, then queues the page for rendering. Across 30 test sites, the gap between crawling and rendering ranged from 3 to 9 days.

Which frameworks caused problems?

React and Vue sites rendered fine when server-side rendering (SSR) or static generation was in place. Pure client-side rendering delayed indexing significantly: one React site with no SSR took 14 days for its content to appear in Google's index, while the same site rebuilt with Next.js SSR was indexed in 2 days.
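The difference comes down to what the first crawl actually sees. A minimal sketch (the product data and markup are made up for illustration) of the HTML Googlebot receives before rendering, under SSR versus pure client-side rendering:

```javascript
// Hypothetical product page, illustrating what Googlebot sees on the
// first (pre-render) crawl under SSR versus pure client-side rendering.
const product = { name: "Trail Shoe", description: "Lightweight hiking shoe." };

// SSR / static generation: content is in the initial HTML response,
// so it is indexable before the page ever reaches the render queue.
function renderServerSide(p) {
  return `<html><body><h1>${p.name}</h1><p>${p.description}</p></body></html>`;
}

// Pure CSR: the initial response is an empty shell; content only
// exists after the JS bundle runs, i.e. after Google's render queue.
function renderClientShell() {
  return `<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>`;
}
```

With SSR the description is in the response body on day one; with the shell, nothing indexable exists until rendering happens days later.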

What about single-page applications?

SPAs need extra attention. Google struggles with hash-based routing and infinite scroll. I tested an Angular SPA where only the homepage was indexed initially; after switching to History API routing and adding an XML sitemap, the full site was indexed within 3 weeks.
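The sitemap half of that fix is easy to script. A minimal sketch (the route table and domain are hypothetical) that turns a SPA's route list into sitemap XML, so Google can discover pages it would otherwise only reach by executing the router:

```javascript
// Build an XML sitemap from a SPA's route table. Without this, a crawler
// that stalls on JS-driven navigation may never find the deeper routes.
function buildSitemap(origin, routes) {
  const urls = routes
    .map((path) => `  <url><loc>${origin}${path}</loc></url>`)
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`;
}

// Hypothetical route table for illustration.
const sitemap = buildSitemap("https://example.com", ["/", "/pricing", "/blog"]);
```

Note that the routes must be real History API paths; hash fragments (`/#/pricing`) are stripped by crawlers, which is part of why hash routing fails.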

How can I test this myself?

Use Search Console's URL Inspection tool and compare the crawled HTML to the rendered HTML. Large differences mean you are relying on JavaScript to deliver content. I found one site where product descriptions only appeared after rendering: zero chance of that content ranking without fixing it.
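You can automate the same comparison outside Search Console. A rough sketch (the tag stripping is naive and for illustration only; a real check would use a proper HTML parser) that reports words present only in the rendered version:

```javascript
// Strip tags and collapse whitespace to approximate visible text.
// Crude regex approach; fine for a quick diff, not for production parsing.
function visibleText(html) {
  return html.replace(/<script[\s\S]*?<\/script>/gi, "")
             .replace(/<[^>]+>/g, " ")
             .replace(/\s+/g, " ")
             .trim();
}

// Words that only exist after rendering, i.e. content Google cannot see
// until the page leaves the render queue.
function renderOnlyWords(rawHtml, renderedHtml) {
  const raw = new Set(visibleText(rawHtml).split(" "));
  return visibleText(renderedHtml).split(" ").filter((w) => w && !raw.has(w));
}
```

Feed it the raw response (e.g. from `curl`) and a rendered snapshot (e.g. from a headless browser): a long output list means your indexable content depends on JavaScript.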

Does rendering affect crawl budget?

Absolutely. Rendering is expensive for Google, and sites requiring heavy JavaScript processing were crawled less frequently. One site dropped from 400 pages crawled daily to 150 after adopting a JavaScript-heavy framework; once critical content was moved back into the initial HTML, the crawl rate recovered to 380 pages daily.

What is the fastest fix?

Server-side rendering or static site generation. If that is impossible, use dynamic rendering: serve pre-rendered HTML specifically to bots. Google now describes dynamic rendering as a workaround rather than a long-term solution, but it is better than nothing for busy teams.
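The bot-detection side of dynamic rendering can be sketched in a few lines. The user-agent list below is illustrative, not exhaustive, and in a real setup a prerender service or headless Chrome would sit behind this check:

```javascript
// Crawler user-agent substrings worth routing to pre-rendered HTML.
// Illustrative list only; keep it in sync with the crawlers you care about.
const BOT_PATTERNS = ["googlebot", "bingbot", "duckduckbot", "baiduspider"];

function isBot(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  return BOT_PATTERNS.some((p) => ua.includes(p));
}

// In Express-style middleware you would branch on this flag: serve a
// pre-rendered snapshot to bots, the normal SPA shell to everyone else.
function chooseResponse(userAgent) {
  return isBot(userAgent) ? "prerendered-html" : "spa-shell";
}
```

Serve identical content in both branches; showing bots different content than users crosses into cloaking.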