Compared to regular HTML/CSS based web pages, a React app (without server-side rendering, at least) is at a slight disadvantage from an SEO perspective, though this might change in the future.
Sure, having a fast, responsive website with a good URL structure and all that is important. But the most important thing is that search engines can crawl your content in the first place.
Here’s the thing: most search engines have crawlers that first look at the HTML/CSS content to index your website. With sites built with React, most of the content is generated by JavaScript, and the only HTML the crawler can immediately see is a single div tag with nothing in it.
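For example, the initial HTML a typical client-rendered React app serves looks something like this (a minimal sketch; the file names and the `root` id are just the common convention and vary by setup):

```html
<!-- What a crawler that doesn't run JavaScript sees: an empty mount point -->
<!DOCTYPE html>
<html>
  <head>
    <title>My React App</title>
  </head>
  <body>
    <!-- React injects all page content into this div at runtime -->
    <div id="root"></div>
    <script src="/static/js/bundle.js"></script>
  </body>
</html>
```

Everything a human visitor sees gets injected into that div only after the JavaScript bundle downloads and runs.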
Sure, search engines have JS renderers too, but rendering JavaScript is expensive for them, and most search engines other than Google don’t bother (at least for now).
So you pretty much lose all your traffic from Bing, Yahoo, and other search engines to competitors who don’t use SPAs.
In the case of Google, things are a bit different. Google’s crawler is advanced enough to render the JavaScript and read and index your website’s content.
But it does that in two waves.
In the first wave, Googlebot requests the source code, crawls and indexes any HTML and CSS that is present, adds any links it finds to the crawl queue, and records the page’s response code.
The second wave can occur anywhere from a few hours to a few weeks later: Google returns to the page when additional resources are available to fully render and index the JS-generated content.
This two-wave process was revealed at Google I/O 2018.
This means client-rendered SPAs are at a disadvantage even with a search giant like Google: regular HTML content gets crawled and indexed much more often than JS-rendered content, which hurts your site from an SEO perspective. Just my thoughts.