Running multiple spiders in Scrapy

滥情空心 2021-01-05 04:27
  In Scrapy, for example, suppose I have two URLs that contain different HTML. I want to write two individual spiders, one for each URL, and run both spiders at the same time.

4 Answers
  •  太阳男子 2021-01-05 05:27

    You can try using CrawlerProcess, which runs multiple spiders in a single process:

    from scrapy.utils.project import get_project_settings
    from scrapy.crawler import CrawlerProcess
    
    from myproject.spiders import spider1, spider2
    
    # Pass the spider classes (here Spider1 and Spider2) to the same process
    process = CrawlerProcess(get_project_settings())
    process.crawl(spider1.Spider1)
    process.crawl(spider2.Spider2)
    process.start()  # blocks until both crawls are finished
    
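    The spider1/spider2 modules above are just placeholders for your own spider files. A minimal sketch of what one of them could look like (the class name, spider name, and URL are illustrative):

    # myproject/spiders/spider1.py -- spider2.py would follow the same pattern
    import scrapy

    class Spider1(scrapy.Spider):
        name = "spider1"
        start_urls = ["https://example.com/page-a"]  # placeholder URL

        def parse(self, response):
            # Extract whatever this page's HTML exposes; adjust selectors to your site
            yield {"title": response.css("title::text").get()}

    Put the CrawlerProcess snippet in a standalone script (for example run_spiders.py at the project root) and start it with python run_spiders.py; process.start() returns only after both spiders have finished.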

    If you want to see the full log of the crawl, set LOG_FILE in your settings.py.

    LOG_FILE = "logs/mylog.log"
    
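    If the combined log of two spiders is too verbose, you can also raise the log threshold in settings.py (LOG_LEVEL is a standard Scrapy setting; the value below is just an example):

    LOG_LEVEL = "INFO"  # Scrapy defaults to DEBUG, which gets noisy with two spiders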
