Question
Using robots.txt, is it possible to restrict robot access for specific query string (parameter) values?
e.g.
http://www.url.com/default.aspx #allow
http://www.url.com/default.aspx?id=6 #allow
http://www.url.com/default.aspx?id=7 #disallow
Answer 1:
User-agent: *
Disallow: /default.aspx?id=7 # disallow
Disallow: /default.aspx?id=9 # disallow
Disallow: /default.aspx?id=33 # disallow
etc...
You only need to specify the URLs that are disallowed. Everything else is allowed by default.
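You can check rules like these with Python's standard-library parser as a quick sanity test. Note that `urllib.robotparser` does plain prefix matching (like the original robots.txt standard, with no `*` wildcards), which also means a rule for `id=7` blocks `id=70`, `id=77`, etc.:

```python
import urllib.robotparser

# The rules from Answer 1, fed to the stdlib parser directly.
rules = """\
User-agent: *
Disallow: /default.aspx?id=7
Disallow: /default.aspx?id=9
Disallow: /default.aspx?id=33
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://www.url.com/default.aspx"))       # True  (allowed)
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=6"))  # True  (allowed)
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=7"))  # False (disallowed)
# Caveat: rules are prefix matches, so the id=7 rule also blocks id=70.
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=70")) # False
```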
Answer 2:
You can also target just the query parameter, such as
Disallow: /default.aspx?id=*
or, to block the id parameter on any page:
Disallow: /*?id=
(The `*` wildcard is a crawler extension, supported by Googlebot and later standardized in RFC 9309, not part of the original robots.txt standard. Note that a plain `Disallow: /?id=` would only match the site root, since matching is prefix-based.)
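For crawlers that support wildcards, `*` matches any run of characters and a trailing `$` anchors the end of the URL; everything else is a literal prefix. A minimal sketch of that matching behavior (a hand-rolled illustration, not any crawler's actual implementation):

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Google-style robots.txt path matching:
    '*' matches any sequence of characters, a trailing '$'
    anchors the end of the URL, and everything else is a
    literal prefix."""
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape literals, turn '*' into '.*', anchor the start.
    pattern = "^" + ".*".join(re.escape(part) for part in rule.split("*"))
    if anchored:
        pattern += "$"
    return re.search(pattern, path) is not None

print(rule_matches("/default.aspx?id=*", "/default.aspx?id=6"))   # True
print(rule_matches("/*?id=", "/default.aspx?id=6"))               # True
print(rule_matches("/?id=", "/default.aspx?id=6"))                # False: only matches the root
print(rule_matches("/default.aspx?id=7", "/default.aspx?id=70"))  # True: prefix match over-blocks
print(rule_matches("/default.aspx?id=7$", "/default.aspx?id=70")) # False: '$' pins the exact value
```

The last two lines show why a `$`-terminated rule like `Disallow: /default.aspx?id=7$` is safer when you want to block exactly one parameter value.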
Source: https://stackoverflow.com/questions/1177111/restrict-robot-access-for-specific-query-string-parameter-values