I've just coded my first website using React, but when I check how Google sees my website, I receive the following result:
My HTML file looks like this:
Try adding browser shims. Note that it doesn't matter if you use Babel to compile your code; you still need polyfills for older browsers and for headless browsers such as Google Bot or PhantomJS.
npm install --save es5-shim es6-shim
// in your frontend/index.js, as early as possible
import 'es5-shim';
import 'es6-shim';
You can read more here
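For context, here is a minimal sketch of how the full entry point could look with the shims loaded first (the App component and the 'root' element id are my assumptions, not from the original question):
// frontend/index.js
import 'es5-shim';
import 'es6-shim';
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App'; // hypothetical root component

// Shims must come before any code that relies on ES5/ES6 features
ReactDOM.render(<App />, document.getElementById('root'));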
I wouldn't be sure that is exactly how Google sees your website, as most simulators just strip out JavaScript.
Did you use https://www.google.com/webmasters/tools/googlebot-fetch ?
In general, JavaScript support is limited for search engines, so if you really want crawlers to index your site you will have to implement server-side rendering for React.
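To illustrate, here is a minimal sketch of server-side rendering with Express and ReactDOMServer (the Express server, port, and App component path are assumptions, not part of the question's setup):
// server.js
import express from 'express';
import React from 'react';
import ReactDOMServer from 'react-dom/server';
import App from './src/App'; // hypothetical root component

const app = express();

app.get('*', (req, res) => {
  // Render the app to an HTML string so crawlers receive real markup
  const markup = ReactDOMServer.renderToString(<App />);
  res.send(`<!doctype html>
<html>
  <body>
    <div id="root">${markup}</div>
    <script src="/bundle.js"></script>
  </body>
</html>`);
});

app.listen(3000);
The client bundle then renders into the same #root element on load to attach event handlers to the server-generated markup.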
I've used https://github.com/kriasoft/react-starter-kit to generate http://gifhub.net. It was a somewhat complicated experience, but it worked in the end.
There are also frameworks like Next.js (https://github.com/zeit/next.js/) that you can leverage to ensure you serve server-rendered content; see the sketch below.
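A minimal Next.js page might look like this (the page content and getInitialProps data source are hypothetical); Next.js renders it on the server for the initial request, so crawlers receive full HTML:
// pages/index.js
import React from 'react';

const Home = ({ message }) => <div>{message}</div>;

// Runs on the server for the first request, so the crawler sees the data
Home.getInitialProps = async () => {
  return { message: 'Hello from the server' };
};

export default Home;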
A third option is to use Google's headless Chrome browser to generate content for crawlers: https://github.com/GoogleChrome/puppeteer
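As a sketch, a prerender script with Puppeteer could look like this (the local URL and output handling are assumptions; you would serve the resulting HTML to crawlers yourself):
// prerender.js
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // Wait until network activity settles so client-side rendering has finished
  await page.goto('http://localhost:3000', { waitUntil: 'networkidle0' });
  const html = await page.content(); // the fully rendered markup
  console.log(html);
  await browser.close();
})();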
Implementing one of the options above ensures crawlers see everything you want them to see. Relying on client-side JavaScript rendering alone will not give you the expected results.
Add babel-polyfill to your project:
npm install --save babel-polyfill
And then import it in your index.js (entry point):
import 'babel-polyfill';
Hopefully, this will solve your problem.
This appears to be a known issue with Google Bot's JavaScript engine. I'm still trying to understand exactly what the problem is, but it seems that adding babel-polyfill to your app solves it.
Medium post detailing a fix
In one of my legacy projects I use Angular.js to insert dynamic content into a backend-rendered page. Google's crawler is smart enough to render the dynamic JavaScript content and index it (e.g. a table rendered entirely from Ajax data).
So I strongly doubt that it is related to server-side rendering issues.
I wouldn't suggest spending time on SSR as @AlexGvozden suggested; it's quite tedious, especially the Webpack setup, probably even with Next.js and Create React App.
I had the same issue with blank pages in "Fetch as Google". The advice above about babel-polyfill didn't solve the problem, so I did more digging into it.
Long story short, here's the fix that worked for me:
npm install --save babel-polyfill
npm install --save url-search-params-polyfill
npm install --save whatwg-fetch
import 'babel-polyfill';
import 'url-search-params-polyfill';
import 'whatwg-fetch';
import React from 'react';
import ReactDOM from 'react-dom';
...