How to protect/monitor your site from crawling by malicious users


Situation:

  • Site with content protected by username/password (not all controlled since they can be trial/test users)
  • A normal search engine can't get at it
9 Answers
  •  别那么骄傲
    2021-02-06 19:13

    Point 1 has the problem you mentioned yourself. It also doesn't help against a slower crawl of the site; and if it does, it may be even worse for legitimate heavy users.
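To make the trade-off concrete, a per-user request budget over a sliding window is one common way to catch fast crawls. This is only a sketch; the window length and budget here are invented values you would have to tune so legitimate heavy users stay under the limit.

```python
import time
from collections import defaultdict, deque
from typing import Optional

WINDOW_SECONDS = 60   # look-back window (assumed value)
MAX_REQUESTS = 120    # per-user budget within the window (assumed value)

_requests = defaultdict(deque)  # username -> timestamps of recent requests

def is_rate_limited(username: str, now: Optional[float] = None) -> bool:
    """Return True if this user has exceeded the per-window request budget."""
    now = time.time() if now is None else now
    q = _requests[username]
    # Drop timestamps that have fallen out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    q.append(now)
    return len(q) > MAX_REQUESTS
```

Note that a deliberately slow crawler simply stays under any budget you pick, which is exactly the weakness the answer points out.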

    You could turn point 2 around and only allow the user-agents you trust. Of course this won't help against a tool that fakes a standard user-agent.
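An inverted point 2 might look like the following sketch. The prefixes in the allowlist are placeholders, not a recommendation, and as the answer warns, any crawler can trivially fake these strings.

```python
# Assumed allowlist of user-agent prefixes you trust; replace with
# the clients your real users actually use.
TRUSTED_UA_PREFIXES = (
    "Mozilla/5.0",
    "Googlebot",
)

def is_trusted_user_agent(ua: str) -> bool:
    """Allowlist check: only user-agents we recognise get through.

    Caveat: a tool that fakes a standard user-agent passes this check,
    so treat it as one weak signal, not a defence on its own."""
    return ua.startswith(TRUSTED_UA_PREFIXES)
```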

    A variation on point 3 would just be to send a notification to the site owners, then they can decide what to do with that user.

    Similarly for my variation on point 2, you could make this a softer action, and just notify that somebody is accessing the site with a weird user agent.
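The softer action described above could be as simple as logging a warning instead of rejecting the request, so the site owners can review it and decide. A minimal sketch, again with an invented allowlist:

```python
import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger("site.monitor")

# Assumed allowlist; substitute whatever your real clients send.
TRUSTED_UA_PREFIXES = ("Mozilla/5.0", "Googlebot")

def flag_weird_user_agent(username: str, ua: str) -> bool:
    """Soft action: never block, just record that somebody is accessing
    the site with an unrecognised user-agent. Returns True if flagged."""
    if ua.startswith(TRUSTED_UA_PREFIXES):
        return False
    logger.warning("Unrecognised user-agent for user %s: %r", username, ua)
    return True
```

The design choice here is that a false positive costs you only a noisy log line, not a locked-out paying user, which makes this much safer than hard blocking on such a weak signal.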

    edit: Related, I once had a weird issue when I was accessing a URL of my own that was not public (I was just staging a site that I hadn't announced or linked anywhere). Although nobody but me should even have known this URL, all of a sudden I noticed hits in the logs. When I tracked this down, I saw it was from some content-filtering site. It turned out that my mobile ISP used a third party to block content, and it intercepted my own requests: since it didn't know the site, it then fetched the page I was trying to access and (I assume) did some keyword analysis in order to decide whether or not to block it. This kind of thing might be an edge case you need to watch out for.
