Running multiple spiders using scrapyd

Posted on 2020-01-02 07:24:07

Question


I have multiple spiders in my project, so I decided to run them by uploading the project to a scrapyd server. I uploaded my project successfully, and I can see all the spiders when I run the command

curl http://localhost:6800/listspiders.json?project=myproject

When I run the following command

curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider2

only one spider runs, because only one spider is given. But I want to run multiple spiders here. Is the following command the right way to run multiple spiders in scrapyd?

curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider1,spider2,spider3........

Later I will run this command from a cron job, i.e. I will schedule it to run frequently.


Answer 1:


If you want to run multiple spiders using scrapyd, schedule them one by one. scrapyd will run them in the order they were scheduled, but not at the same time.
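A minimal sketch of scheduling spiders one by one, using the spider names from the question (the host, port, and project name are taken from the question and may need adjusting):

```shell
#!/bin/sh
# Send one schedule.json request per spider; scrapyd queues them
# and runs them sequentially in the order they were submitted.
for spider in spider1 spider2 spider3; do
  curl http://localhost:6800/schedule.json \
    -d project=myproject \
    -d "spider=$spider"
done
```

Note that scrapyd's `schedule.json` endpoint accepts a single spider name per request, which is why a comma-separated list like `spider=spider1,spider2,spider3` will not work.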

See also: Scrapy's Scrapyd too slow with scheduling spiders
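For the cron part of the question, a loop like the one above can be saved to a script and scheduled from a crontab entry. This is a sketch assuming a hypothetical script path `/home/user/schedule_spiders.sh`; the schedule itself is an example and should be adjusted as needed:

```shell
# crontab entry: run the scheduling script every day at 02:00
# (edit with `crontab -e`; path is an assumption for illustration)
0 2 * * * /home/user/schedule_spiders.sh
```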



Source: https://stackoverflow.com/questions/11390888/running-multiple-spiders-using-scrapyd
