How to update Facebook Open Graph image

轻奢々 2020-12-23 17:16

Say you have set the Facebook image for your webpage via the Open Graph protocol meta tag, like this:
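
A typical tag, with a placeholder URL, looks like:

    <meta property="og:image" content="https://www.example.com/images/rock.jpg" />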

4 Answers
  • 2020-12-23 17:33

    This is the most consistent answer I've found to this problem: https://stackoverflow.com/a/21468084/339698

    If you're too lazy to go to that link, you can POST an AJAX request with the URL you are trying to clear:

    // Ask Facebook's Graph API to re-scrape the URL, which refreshes its
    // cached Open Graph data (including the image)
    $.post(
        'https://graph.facebook.com',
        {
            id: 'http://www.site.com/my/share/url/', // the URL whose cache you want cleared
            scrape: true
        },
        function(response){
            console.log(response); // the freshly scraped Open Graph data
        }
    );
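
    The same request in plain JavaScript, as a minimal sketch (it reuses the example URL above; treat the access_token line as an assumption to verify, since newer Graph API versions may require it):

    // Node 18+ or any modern browser: trigger a re-scrape via fetch
    fetch('https://graph.facebook.com', {
        method: 'POST',
        // URLSearchParams encodes the body as application/x-www-form-urlencoded
        body: new URLSearchParams({
            id: 'http://www.site.com/my/share/url/', // page to re-scrape
            scrape: 'true'
            // access_token: 'YOUR_APP_TOKEN' // may be required on newer API versions
        })
    })
        .then(function (response) { return response.json(); })
        .then(function (data) { console.log(data); });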
    
  • 2020-12-23 17:34

    If you are using any cache plugin, make sure you clear all your caches. Also make sure the image you are using has the recommended Facebook size: 1200 × 630 or 600 × 315 pixels.
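
    If you can, also declare the image dimensions in the markup: Facebook's sharing docs note that og:image:width and og:image:height let the scraper render the image on the very first share without having to fetch it asynchronously (the URL here is a placeholder):

    <meta property="og:image" content="https://www.example.com/share.jpg" />
    <meta property="og:image:width" content="1200" />
    <meta property="og:image:height" content="630" />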

  • 2020-12-23 17:35

    Facebook copies your image to its own servers and then caches that copy, currently for 24 hours. The cache delay might change in the future; to check it, open the image Facebook created from yours and look at the "max-age" value in its HTTP headers. So if you change your image, Facebook will not update its version for up to 24 hours, even if you use http://developers.facebook.com/tools/debug to force a fresh fetch.
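
    A quick way to check that max-age value, as a sketch (Node 18+ with built-in fetch; the image URL is a placeholder you would copy from Facebook's cached version of your image):

    // Inspect the Cache-Control header of Facebook's cached copy of your image
    fetch('https://cached-image-host.example/your-cached-image.jpg', { method: 'HEAD' })
        .then(function (res) {
            console.log(res.headers.get('cache-control')); // e.g. "max-age=86400"
        });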

    To see the effect of your change right away, rename your image: if the old version was rock.jpg, name the new one rock2.jpg, then use http://developers.facebook.com/tools/debug to get Facebook to create a new image from the updated file. This immediately updates your webpage's image in Facebook shares.
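
    In the markup, that change looks like this (placeholder domain):

    <!-- before: Facebook keeps serving its cached copy for up to 24h -->
    <meta property="og:image" content="https://www.example.com/images/rock.jpg" />
    <!-- after: the new filename forces the scraper to fetch a fresh copy -->
    <meta property="og:image" content="https://www.example.com/images/rock2.jpg" />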

  • 2020-12-23 17:38

    Years after this post was made, this is still a problem, but it's often not Facebook's cache: it is quite often human error (allow me to elaborate).

    OG:TYPE affects your image scrape:

    1. https://ogp.me/#type_article is not the same as https://ogp.me/#type_website

    Be aware that og:type=website will cause any /sub-pages/ of that URL to be treated as "canonical". This means you will have trouble getting your images to update using the scraper, no matter what you do.

    Consider this assumption and common mistake:

    - <meta property="og:type" content="website" /> => https://www.example.org (parent)
    - <meta property="og:type" content="website" /> => https://www.example.org/sub-page/
    - <meta property="og:type" content="website" /> => https://www.example.org/sub-page/child-2/
    - Ergo: /sub-page/ and /child-2/ will inherit the og:image of the parent

    Those are not all "websites": the first is a website; the others are articles.

    If you do that, Facebook will treat all of them as canonical and put the FIRST og:image into all of them (try it, you'll see). Likewise, if you set the og:url to your root or parent domain, you have told Facebook they are all canonical. (There is a good reason for that, but it's off topic.)

    Consider this solution instead (which is what most people really want):

    - <meta property="og:type" content="article" /> => https://www.example.org/sub-page/
    - <meta property="og:type" content="article" /> => https://www.example.org/sub-page/child-2/

    If you do that, Facebook will give you far fewer problems when scraping your NEW images.
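
    To make that concrete, here is a minimal sketch of the tag set for one sub-page, reusing the example.org URLs from above (the title and image values are assumed placeholders):

    <meta property="og:type" content="article" />
    <meta property="og:url" content="https://www.example.org/sub-page/" />
    <meta property="og:title" content="Sub-page title" />
    <meta property="og:image" content="https://www.example.org/images/sub-page.jpg" />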

    In closing: YES, the cache busters, random vars, changing URLs and other suggestions here can work, but they will seem like "intermittent voodoo" if og:type is not specified correctly.

    PS: remember that a CDN or server-side cache will serve its cached page to Facebook's scraper even if you think you are seeing the most recent version. (I won't spend any time on this other than to point out that it will waste colossal amounts of your time if not double-checked.)
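
    One way to double-check what your CDN actually serves to the scraper, as a sketch (Node 18+ fetch; facebookexternalhit/1.1 is the user agent Facebook documents for its crawler, and the URL is a placeholder):

    // Request the page the way Facebook's scraper does and print the og:image
    // tag it would actually see (which may differ from what your browser gets)
    fetch('https://www.example.org/sub-page/', {
        headers: { 'User-Agent': 'facebookexternalhit/1.1' }
    })
        .then(function (res) { return res.text(); })
        .then(function (html) {
            var match = html.match(/<meta[^>]+og:image[^>]*>/);
            console.log(match ? match[0] : 'no og:image tag found');
        });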
