html5-ninja/page-replica

Google has been able to index and render JS-based websites for about a decade

yairEO opened this issue · 2 comments

We used to do such things back in the day, but for many years now Google SEO has had no problem indexing pages that require JS to load content (from an API, for example). So why would anyone want to pre-render in 2024? It was a huge thing back in 2013...

@yairEO
Yes, Google can render your content handled by JavaScript, but Googlebot allows only a few milliseconds for rendering. If your page isn't rendered within that time frame, it may be penalized.

For this reason, many news and broadcasting media outlets still use prerendering services. I speak from experience, having worked at a large Canadian media company.

Another important factor to consider is that the SEO world isn't limited to Google. Various bots, including those from other search engines and platforms like Facebook, require correctly rendered pages for optimal sharing and visibility.
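For context, this is roughly how those prerendering setups work. A minimal sketch, assuming an Express app sitting in front of the client-side bundle; the bot list, the snapshots/ directory, and the getPrerenderedHtml() helper are illustrative, not part of page-replica's API:

```js
const express = require('express');
const fs = require('fs/promises');
const path = require('path');

const app = express();

// Rough UA patterns for common crawlers; real prerendering services
// maintain a much longer list.
const BOT_PATTERN = /googlebot|bingbot|facebookexternalhit|twitterbot|linkedinbot/i;

// Hypothetical helper: load an HTML snapshot saved earlier by a
// prerendering step (e.g. a headless-browser crawl of the app).
async function getPrerenderedHtml(urlPath) {
  const name = urlPath === '/' ? 'index' : urlPath.replace(/\//g, '_');
  return fs.readFile(path.join(__dirname, 'snapshots', `${name}.html`), 'utf8');
}

app.use(async (req, res, next) => {
  if (!BOT_PATTERN.test(req.get('user-agent') || '')) return next();
  try {
    res.send(await getPrerenderedHtml(req.path)); // crawlers get finished HTML
  } catch {
    next(); // no snapshot for this path: fall back to the client-side app
  }
});

app.use(express.static('dist')); // regular visitors get the normal CSR bundle

app.listen(3000);
```

The point is simply that crawlers receive a finished HTML snapshot immediately, while regular visitors get the usual client-side app.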

Lastly, the choice between client-side rendering (CSR) and server-side rendering (SSR) depends on your specific needs. Google Search Console provides valuable metrics and information about your app, so it might be worth considering SSR if that better aligns with your requirements.
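To make the CSR/SSR distinction concrete, here is a minimal SSR sketch, assuming React and Express (neither is implied by this repo). With SSR the crawler's very first response already contains the finished markup, so no rendering budget comes into play:

```js
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');

// A trivial component; in a real app this would be your page tree.
const App = () => React.createElement('h1', null, 'Rendered on the server');

const app = express();
app.get('/', (req, res) => {
  const markup = renderToString(React.createElement(App));
  // The crawler receives complete HTML without executing any JavaScript.
  res.send(`<!doctype html><html><body><div id="root">${markup}</div></body></html>`);
});

app.listen(3000);
```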

@yairEO
You can also try this command to check whether a web app is using SSR; you'll be surprised how many web apps do:

```sh
curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://example.com
```
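If you'd rather script that comparison, here's the same idea in Node (18+, run as an ESM module, e.g. `node check.mjs`); the URL is a placeholder. A large size or content difference between the two responses usually means the site serves prerendered or server-rendered HTML to bots:

```js
// Compare what a crawler UA receives versus a browser UA (Node 18+ ESM).
const UA_BOT = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';
const UA_BROWSER = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36';

for (const ua of [UA_BOT, UA_BROWSER]) {
  const res = await fetch('https://example.com', { headers: { 'User-Agent': ua } });
  const html = await res.text();
  console.log(`${ua.slice(0, 30)}... -> ${html.length} bytes`);
}
```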