robots.txt to disallow all pages except one? Do they override and cascade?

Backend · Unresolved · 4 answers · 1269 views

春和景丽 2021-02-05 00:17

I want one page of my site to be crawled and no others.

Also, if it's any different from the answer above, I would also like to know the syntax for disallowing everything.

4 Answers
  •  醉酒成梦
    2021-02-05 01:10

    If you log into Google Webmaster Tools, go to Crawl in the left panel, then Fetch as Google. There you can test how Google will crawl each page.

    In the case of blocking everything but the homepage:

    User-agent: *
    Allow: /$
    Disallow: /
    

    will work. Note that `Allow` and the `$` end-of-URL anchor are Google's extensions to the original robots.txt spec; Google resolves conflicts by applying the most specific (longest) matching rule, with `Allow` winning ties, so `Allow: /$` (the homepage only) takes precedence over `Disallow: /`.
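    To see why this rule pair blocks everything except the homepage, here is a minimal sketch of Google-style rule matching (longest matching pattern wins, `Allow` wins ties, `*` and `$` supported). This is an illustrative model written for this answer, not Google's actual crawler code:

    ```python
    import re

    def pattern_to_regex(pattern):
        # '$' at the end anchors the match; '*' matches any character run.
        anchored = pattern.endswith('$')
        if anchored:
            pattern = pattern[:-1]
        parts = [re.escape(p) for p in pattern.split('*')]
        return '.*'.join(parts) + ('$' if anchored else '')

    def allowed(path, rules):
        # Google picks the most specific (longest) matching rule;
        # on a length tie, Allow beats Disallow. Default is allowed.
        best_pattern, best_allowed = '', True
        for directive, pattern in rules:
            if re.match(pattern_to_regex(pattern), path):
                if (len(pattern) > len(best_pattern)
                        or (len(pattern) == len(best_pattern) and directive == 'Allow')):
                    best_pattern, best_allowed = pattern, directive == 'Allow'
        return best_allowed

    rules = [('Allow', '/$'), ('Disallow', '/')]
    print(allowed('/', rules))       # True  -> homepage is crawlable
    print(allowed('/about', rules))  # False -> everything else is blocked
    ```

    `/` matches both rules, but `/$` is longer and more specific, so the homepage is allowed; any deeper path like `/about` only matches `Disallow: /` and is blocked.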
