robots.txt to disallow all pages except one? Do they override and cascade?

春和景丽 2021-02-05 00:17

I want one page of my site to be crawled and no others.

Also, if it's any different from the answer above, I would also like to know the syntax for disallowing everything.

4 Answers
  •  粉色の甜心
    2021-02-05 01:10

    You can use either of the configurations below; both will work:

    User-agent: *
    Allow: /$
    Disallow: /
    

    or

    User-agent: *
    Allow: /index.php
    Disallow: /
    

    Place the Allow before the Disallow: some crawlers apply the first rule that matches as they read the file from top to bottom. (Google, by contrast, applies the most specific rule, i.e. the longest matching path, regardless of order, so both placements work for Googlebot.)

    Disallow: / says "disallow anything that starts with a slash." So that means everything on the site.

    The $ means "end of string," as in regular expressions. So Allow: /$ matches only the URL whose path is exactly /, i.e. your homepage, and nothing else.
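    You can sanity-check the second variant with Python's standard-library robots.txt parser (a minimal sketch; note that urllib.robotparser does not implement the $ end-of-string wildcard, so only the /index.php variant is testable this way, and example.com is a placeholder domain):

    ```python
    from urllib.robotparser import RobotFileParser

    # The second robots.txt variant from the answer above.
    rules = """\
    User-agent: *
    Allow: /index.php
    Disallow: /
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # Only /index.php is crawlable; everything else is blocked.
    print(rp.can_fetch("*", "https://example.com/index.php"))   # True
    print(rp.can_fetch("*", "https://example.com/other-page"))  # False
    ```

    Here the Allow line is checked before the Disallow line, which is why the order shown above matters for parsers that take the first matching rule.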
