When the URL is shared on social media, no meta data is rendered

Backend · Unresolved · 3 answers · 1386 views
伪装坚强ぢ · 2020-12-01 21:25

We have built a project (web application) in React on .NET Core, using client-side rendering.

We've used react-helmet to assign meta tags dynamically.

3 Answers
  • 2020-12-01 21:31

    I had the same issue today, with two React web applications that needed this. Here is how I solved it:

    1. Put your preview image in the public folder.
    2. Still in the public folder, open index.html and add the line <meta property="og:image" content="preview.png"/> (or <meta property="og:image" content="%PUBLIC_URL%/preview.png"/>).
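
    For reference, a typical static Open Graph block in public/index.html might look like the following; the title, description, and file name are placeholders, not values from the question:

    ```html
    <!-- Static Open Graph tags; all content values here are placeholders -->
    <meta property="og:title" content="My App" />
    <meta property="og:description" content="Short description shown in link previews" />
    <meta property="og:image" content="%PUBLIC_URL%/preview.png" />
    ```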

    Go to https://www.linkedin.com/post-inspector/ to check if it works.

    I hope this helps!

  • 2020-12-01 21:46

    The meta tags for Open Graph need to be present in the HTML that is sent back to the client when fetching a URL. Browsers and bots will not wait until the app is rendered on the client side to determine what the meta tags are - they only look at the initially loaded HTML.

    If you need the content of your Open Graph metadata to be dynamic (showing different content depending on the URL, device, browser, etc.), you need to add something like react-meta-tags into your server code.
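
    A minimal sketch of that idea, assuming a Node/Express-style server that returns index.html; the function names and tag values below are illustrative, not from the question's actual codebase:

    ```javascript
    // Sketch only: inject Open Graph tags into the HTML shell before the
    // server sends it, so crawlers see them in the initial response.

    // Build Open Graph meta tags for a given page (values are placeholders).
    function buildOgTags({ title, image, url }) {
      return [
        `<meta property="og:title" content="${title}"/>`,
        `<meta property="og:image" content="${image}"/>`,
        `<meta property="og:url" content="${url}"/>`,
      ].join('\n');
    }

    // Insert the tags just before </head> in the HTML string.
    function injectOgTags(html, meta) {
      return html.replace('</head>', `${buildOgTags(meta)}\n</head>`);
    }

    // An Express handler would read index.html, call injectOgTags with
    // per-URL values, and send the result instead of the static file.
    ```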

    There are no type definitions available for any of the react meta tags libraries, but you can add your own. It can be a bit tricky, but check out the official documentation and the templates they have provided to get started.

    If you don't need it to be dynamic, you could add the tags to the static <head> section of your index.html.

  • 2020-12-01 21:47

    Prerender is the only solution. I used a node dependency called "prerender" -> https://github.com/prerender/prerender

    It works by running a web server that handles HTTP requests. Setting a boolean, window.prerenderReady = true;, in your website tells that server when the page is ready to "take the photo", and it returns the HTML at that point. You then need to write a simple script that walks all of your site's URLs and saves the resulting HTML to files. Upload those to your server and, using .htaccess or similar, target the crawler user agents (facebookexternalhit, Twitterbot, Googlebot, etc.) so they are served the prerendered version, while all other user agents get the real site.
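
    The crawler-targeting step could be sketched in .htaccess roughly as follows, assuming mod_rewrite is enabled and the snapshots live under a /prerendered directory; the user-agent list and paths are examples, not a complete set:

    ```apache
    # Sketch: serve prerendered HTML snapshots to known crawlers only.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (facebookexternalhit|Twitterbot|Googlebot|LinkedInBot) [NC]
    RewriteCond %{DOCUMENT_ROOT}/prerendered%{REQUEST_URI}/index.html -f
    RewriteRule ^(.*)$ /prerendered/$1/index.html [L]
    ```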

    It worked for me.
