How to password protect Scrapyd UI?

Submitted by 和自甴很熟 on 2019-12-22 00:49:47

Question


I have my website available to the public, and Scrapyd is running on port 6800, e.g. http://website.com:6800/

I do not want anyone to see the list of my crawlers. I know anyone can easily type in port 6800 and see what's going on.

I have a few questions; answering any of them will help me.

  1. Is there a way to password-protect the Scrapyd UI?
  2. Can I password-protect a specific port on Linux? I know it can be done with iptables to ONLY ALLOW PARTICULAR IPs, but that's not a good solution.
  3. Should I make changes to Scrapyd's source code?
  4. Can I password-protect a specific port via .htaccess alone?

Answer 1:


You should bind Scrapyd to the address of the machine that is going to make the calls.

If it's only localhost that will call the endpoints, just bind it to 127.0.0.1 and voilà: the address is not reachable from external IPs.
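In Scrapyd this is controlled by the `bind_address` option in `scrapyd.conf`; a minimal sketch (the port shown matches the question, but adjust to your setup):

```ini
[scrapyd]
# Listen only on the loopback interface, so the UI and JSON API
# are unreachable from outside the machine.
bind_address = 127.0.0.1
http_port    = 6800
```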




Answer 2:


As of Scrapyd version 1.2.0, the default bind address is 127.0.0.1.

To add password protection, use this gist, which uses nginx as a reverse proxy to add basic authentication to Scrapyd.
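The reverse-proxy approach can be sketched roughly as follows; the listen port, server name, and htpasswd path here are illustrative assumptions, not values from the gist:

```nginx
# Sketch: nginx in front of a Scrapyd bound to 127.0.0.1:6800.
# Create the credentials file first, e.g.:
#   htpasswd -c /etc/nginx/scrapyd.htpasswd someuser
server {
    listen 6801;                 # public port (assumption)
    server_name website.com;     # assumption

    location / {
        auth_basic           "Scrapyd";                    # realm shown in the login prompt
        auth_basic_user_file /etc/nginx/scrapyd.htpasswd;  # path is an assumption
        proxy_pass           http://127.0.0.1:6800/;       # forward to the local Scrapyd
    }
}
```

With this in place, Scrapyd itself stays bound to localhost, and only the authenticated proxy is exposed.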

You may also check the scrapyd-authenticated repository.
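Once basic auth is in front of Scrapyd, clients must send an `Authorization` header. A minimal sketch of building such a request with the standard library (the host, endpoint, and credentials are made-up placeholders):

```python
import base64
import urllib.request

# Hypothetical credentials for illustration only.
username, password = "scrapyd", "s3cret"

# HTTP Basic auth: base64-encode "user:password".
token = base64.b64encode(f"{username}:{password}".encode()).decode()

# Build (but do not send) a request against a hypothetical protected endpoint.
req = urllib.request.Request(
    "http://website.com:6801/listprojects.json",
    headers={"Authorization": f"Basic {token}"},
)

print(req.get_header("Authorization"))  # the header the proxy will check
```

`urllib.request.urlopen(req)` would then perform the call; tools like curl do the same with `curl -u user:password`.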



Source: https://stackoverflow.com/questions/43114728/how-to-password-protect-scrapyd-ui
