Question
Hey, so I have about 50 spiders in my project, and I'm currently running them via a scrapyd server. I'm running into an issue where some of the resources I use get locked, making my spiders fail or run really slowly. I was hoping there was some way to tell scrapyd to run only one spider at a time and leave the rest in the pending queue. I didn't see a configuration option for this in the docs. Any help would be much appreciated!
Answer 1:
This can be controlled by the scrapyd settings. Set max_proc to 1:

max_proc
The maximum number of concurrent Scrapy processes that will be started.
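As a minimal sketch, assuming scrapyd is reading its configuration from a `scrapyd.conf` file (scrapyd looks in several locations, including `/etc/scrapyd/scrapyd.conf` and a `scrapyd.conf` in the directory it is launched from), the setting would go under the `[scrapyd]` section:

```ini
[scrapyd]
# Run at most one Scrapy process at a time;
# all other scheduled jobs remain in the pending queue.
max_proc = 1
```

Note that when max_proc is 0 (the default), the limit is instead derived from max_proc_per_cpu multiplied by the number of available CPUs, so setting max_proc to a nonzero value is what pins the concurrency to an exact number.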
Source: https://stackoverflow.com/questions/24960303/change-number-of-running-spiders-scrapyd