In Scrapy, for example, suppose I have two URLs that contain different HTML. I want to write two individual spiders, one for each URL, and run both spiders at the same time.
You can try using CrawlerProcess:
from scrapy.utils.project import get_project_settings
from scrapy.crawler import CrawlerProcess
from myproject.spiders import spider1, spider2
process = CrawlerProcess(get_project_settings())
# Pass the spider classes; CrawlerProcess instantiates them itself.
process.crawl(spider1.Spider1)
process.crawl(spider2.Spider2)
process.start()
If you want to see the full log of the crawl, set LOG_FILE in your settings.py:
LOG_FILE = "logs/mylog.log"