I want to stop search engines from crawling my whole website.
I have a web application for members of a company to use. It is hosted on a web server so that the employees can access it.
This is best handled with a robots.txt file, although it only affects bots that respect the file.
To block the whole site, add this to a robots.txt file in the root directory of your site:

```
User-agent: *
Disallow: /
```
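For illustration, here is a minimal Python sketch of how a well-behaved crawler honors those rules, using the standard library's urllib.robotparser (the domain is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain; substitute your own site.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# With "User-agent: *" / "Disallow: /" in place, this returns False
# for every path, so a compliant bot will skip the whole site.
print(parser.can_fetch("SomeBot", "https://www.example.com/any/page"))
```

Keep in mind this is purely a convention: a bot that ignores robots.txt can still crawl everything, which is why server-side restrictions are the stronger option.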
To restrict access for everyone else as well, .htaccess is the better tool, but you will need to define access rules, for example by IP address.
Below are the .htaccess rules to restrict everyone except visitors from your company IP. With Order deny,allow, the Deny rules are evaluated first and the Allow entry then carves out an exception for your address (the other ordering, Order allow,deny, would let Deny from all override the Allow and lock everyone out):

```
Order deny,allow
Deny from all
# Enter your company's IP address here (203.0.113.10 is a placeholder)
Allow from 203.0.113.10
```
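One caveat: Order, Allow, and Deny are the Apache 2.2 syntax. If your server runs Apache 2.4 or later, the equivalent is a single Require directive (again, the IP is a placeholder):

```
# Apache 2.4+ equivalent of the rules above
Require ip 203.0.113.10
```

Anything not matching the Require condition is denied by default, so no explicit "deny all" line is needed.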