How to prevent search engines from indexing a single page of my website?

抹茶落季 2020-12-08 14:13

I don't want the search engines to index my imprint page. How could I do that?

7 Answers
  • 2020-12-08 14:38

    Create a robots.txt file and set the controls there.

    Here are the docs for Google: http://code.google.com/web/controlcrawlindex/docs/robots_txt.html

  • 2020-12-08 14:47

    You can set up a robots.txt file to ask search engines to ignore certain directories or files.

    See here for more info.

    Basically:

    User-agent: *
    Disallow: /[directory or file here]
    
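    For example, assuming the imprint page lives at /imprint.html (a made-up path; adjust it to match your site), a minimal robots.txt in the site root could look like this:

    User-agent: *
    Disallow: /imprint.html

    Keep in mind that robots.txt is only a crawling hint: well-behaved crawlers respect it, but the page itself stays publicly reachable.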
  • 2020-12-08 14:47

    A robot that wants to visit a URL on your site, say http://www.example.com/welcome.html, first checks for http://www.example.com/robots.txt. In that file you can explicitly disallow a page:

    User-agent: *
    Disallow: /~joe/junk.html
    

    See the robots.txt documentation for details.

  • 2020-12-08 14:49

    You can also add the following meta tag in the HEAD of that page:

    <meta name="robots" content="noindex,nofollow" />
    
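    As a rough sketch of where the tag goes (the title and body below are placeholders), it belongs in the <head> of the imprint page itself:

    <!DOCTYPE html>
    <html>
    <head>
      <title>Imprint</title>
      <!-- tell crawlers not to index this page or follow its links -->
      <meta name="robots" content="noindex,nofollow" />
    </head>
    <body>
      ...
    </body>
    </html>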
  • 2020-12-08 14:52
    <meta name="robots" content="noindex, nofollow">
    

    Just include this line inside the <head> of your HTML. The reason I recommend it over robots.txt is that a robots.txt file used to hide URLs, such as login pages or other protected URLs, lists exactly the addresses you don't want to show to anyone, let alone to search engines.

    Anyone can request the robots.txt file directly from your website and see which URLs you consider secret. So what is the point of that robots.txt file?

    The better approach is to include the meta tag above, which keeps the page out of the index without advertising it.
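    To see the concern, imagine a robots.txt like the one below (the paths are made-up examples). Anyone can open /robots.txt in a browser and read exactly which URLs you were trying to hide:

    User-agent: *
    Disallow: /admin/
    Disallow: /secret-login.html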

  • 2020-12-08 14:54

    Nowadays, the best method is to use a robots meta tag and set it to noindex,follow:

    <meta name="robots" content="noindex, follow">
    