In Scrapy, for example, suppose I have two URLs that contain different HTML. I want to write two individual spiders, one for each URL, and run both spiders at the same time.
It is probably easiest to run the two Scrapy spiders as separate processes at the OS level; both can save to the same database. Create a shell script that calls both Scrapy commands so they run at the same time:
#!/bin/sh
scrapy runspider foo &
scrapy runspider bar &
wait
Be sure to make the script executable with chmod +x script_name.
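The background-and-wait pattern can be sketched with placeholder commands (the run_foo/run_bar functions are hypothetical stand-ins; in the real script each would be the corresponding scrapy runspider call):

```shell
#!/bin/sh
# Placeholders for the real crawls, e.g. `scrapy runspider foo`
# and `scrapy runspider bar`.
run_foo() { echo "foo spider finished"; }
run_bar() { echo "bar spider finished"; }

run_foo &   # & sends the job to the background, so the next line starts immediately
run_bar &
wait        # block until both background jobs have exited
echo "both spiders finished"
```

Because both jobs are backgrounded, they run concurrently; `wait` makes the script exit only after both crawls are done, which keeps any cron log output complete.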
To schedule a cron job every 6 hours, run crontab -e
in your terminal and add the following line:
0 */6 * * * path/to/shell/script_name >> path/to/file.log
The fields are minute, hour, day of month, month, and day of week; an asterisk is a wildcard. The leading 0 together with */6 in the hour field runs the script at minute 0 of every hour divisible by 6, i.e. every six hours. (Note that * */6 in the first two fields would instead run it every single minute during those hours.)
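Equivalently, the hours can be listed explicitly. This crontab fragment (a sketch reusing the placeholder paths above, with 2>&1 added so errors land in the log too) fires at 00:00, 06:00, 12:00, and 18:00:

```shell
# min  hour        dom month dow  command
0      0,6,12,18   *   *     *    path/to/shell/script_name >> path/to/file.log 2>&1
```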