Facebook and Crawl-delay in Robots.txt?

Asked by 旧时难觅i on 2021-01-02 03:39

Do Facebook's web-crawling bots respect the Crawl-delay: directive in robots.txt files?

5 Answers

  •  小鲜肉 (OP), answered 2021-01-02 04:15

    If you are running on an Ubuntu server with the ufw firewall, you may want to try

    ufw limit proto tcp from 31.13.24.0/21 to any port 80

    for each of these IP ranges: 31.13.24.0/21, 31.13.64.0/18, 66.220.144.0/20, 69.63.176.0/20, 69.171.224.0/19, 74.119.76.0/22, 103.4.96.0/22, 173.252.64.0/18, 204.15.20.0/22

    as shown here: What's the IP address range of Facebook's Open Graph crawler?
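    A minimal sketch of applying that rule to every range in one pass (assuming bash, a root shell, and that ufw is installed and enabled; the subnet list is copied from above and may change over time):

    # Rate-limit inbound HTTP connections (TCP port 80) from Facebook's crawler ranges.
    # ufw's "limit" rule denies a source that opens 6 or more connections within 30 seconds.
    for net in 31.13.24.0/21 31.13.64.0/18 66.220.144.0/20 69.63.176.0/20 \
               69.171.224.0/19 74.119.76.0/22 103.4.96.0/22 173.252.64.0/18 \
               204.15.20.0/22; do
        ufw limit proto tcp from "$net" to any port 80
    done

    If the site also serves HTTPS, the loop would need a second rule per range for port 443.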
