Question
I am using CodeIgniter with sessions stored in my database. Over a short period of time, a large number of sessions are created by bots, spiders, etc.
Is there a way of preventing this? Perhaps via .htaccess?
Answer 1:
First and foremost, create a robots.txt file in the web root of the domain. It addresses two issues: it controls the rate at which the site is crawled, which helps keep a bot or spider from opening a large number of database connections at once, and it blocks specific bots from crawling the site entirely. Use the following defaults as a starting point; you may want to add or remove the denied user agents and adjust the crawl delay (note that Crawl-delay is a non-standard directive that not all crawlers honor).
Sample Code:
User-agent: *
Crawl-delay: 10
User-agent: Baiduspider
Disallow: /
User-agent: Sosospider
Disallow: /
There are two important considerations when using /robots.txt:
- Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email-address harvesters used by spammers, will pay no attention to it.
- The /robots.txt file is publicly available. Anyone can see which sections of your server you don't want robots to use.
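Because well-behaved crawlers are the only ones that respect robots.txt, a server-side guard is also useful: check the User-Agent header and skip loading the session library for known bots. This is a minimal sketch; the `is_bot()` helper and its signature list are illustrative assumptions, not part of CodeIgniter itself.

```php
<?php
// Hypothetical helper (not a CodeIgniter API): decide whether the
// request looks like a bot based on common User-Agent substrings.
function is_bot(string $userAgent): bool
{
    // Illustrative list of crawler signatures; extend as needed.
    $botSignatures = ['bot', 'spider', 'crawl', 'slurp', 'wget', 'curl'];
    $ua = strtolower($userAgent);
    foreach ($botSignatures as $sig) {
        if (strpos($ua, $sig) !== false) {
            return true;
        }
    }
    return false;
}

// In a controller or hook, you could then guard session creation,
// e.g. (assuming CodeIgniter 2/3-style loading):
//
// if (!is_bot($_SERVER['HTTP_USER_AGENT'] ?? '')) {
//     $this->load->library('session');
// }
```

A User-Agent string is trivially spoofed, so this filters only honest crawlers; pair it with a periodic cleanup of stale session rows in the database.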
Source: https://stackoverflow.com/questions/12241701/how-to-prevent-bots-from-creating-sessions-in-codeigniter