I don't want search engines to index my imprint page. How can I do that?
Create a robots.txt file and set the crawl controls there.
Here are Google's docs on it: http://code.google.com/web/controlcrawlindex/docs/robots_txt.html
You can set up a robots.txt file to ask search engines to ignore certain directories.
See here for more info.
Basically:
User-agent: *
Disallow: /[directory or file here]
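For the imprint page from the question, the file might look like this (assuming the page lives at /imprint — adjust the path to wherever your page actually is):

```
User-agent: *
Disallow: /imprint
```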
A robot that wants to visit a URL, say http://www.example.com/welcome.html, first checks for http://www.example.com/robots.txt. There you can explicitly disallow individual files:
User-agent: *
Disallow: /~joe/junk.html
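To see this check in action, here is a small sketch using Python's standard-library urllib.robotparser, which implements the same matching a well-behaved crawler performs. The rules are parsed from an inline string rather than fetched from a live site:

```python
# Sketch: how a crawler consults robots.txt rules before fetching a URL.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /~joe/junk.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The disallowed page is blocked for every user agent...
print(parser.can_fetch("*", "http://www.example.com/~joe/junk.html"))  # False
# ...while other pages on the site remain crawlable.
print(parser.can_fetch("*", "http://www.example.com/welcome.html"))    # True
```

In a real crawler you would call parser.set_url("http://www.example.com/robots.txt") followed by parser.read() instead of parsing an inline string.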
See the robots.txt documentation for details.
You can also add the following meta tag to the HEAD of that page:
<meta name="robots" content="noindex, nofollow" />
Just include that line inside your page's <head> tag. The reason I'm telling you this: if you use a robots.txt file to hide URLs such as login pages or other protected URLs that you don't want to show to anyone or to search engines, then anyone can fetch robots.txt directly from your website and see exactly which URLs you are trying to keep secret. That defeats the purpose of hiding them. The better way is to use the meta tag above and keep those URLs out of robots.txt.
Nowadays, the best method is to use a robots meta tag set to noindex, follow:
<meta name="robots" content="noindex, follow">