I am having trouble with logging in Scrapy, and most of what I can find is out of date.
I have set LOG_FILE = 'log.txt' in settings.py.
For logging I just put this on the spider class:

import logging
import scrapy
from scrapy.utils.log import configure_logging

class SomeSpider(scrapy.Spider):
    configure_logging(install_root_handler=False)
    logging.basicConfig(
        filename='log.txt',
        format='%(levelname)s: %(message)s',
        level=logging.INFO,
    )
This will put all Scrapy output into a log.txt file in the project root directory.
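Here is a stand-alone sketch of what that basicConfig call does, using only the standard library (no Scrapy required); the temp-file path and log messages are made up for illustration, and force=True is added only so the demo works even if logging was already configured:

```python
import logging
import os
import tempfile

# Write the log to a throwaway file for this demo.
log_path = os.path.join(tempfile.mkdtemp(), 'log.txt')

logging.basicConfig(
    filename=log_path,
    format='%(levelname)s: %(message)s',
    level=logging.INFO,
    force=True,  # reset any existing handlers (Python 3.8+); demo-only
)

logging.info('spider opened')
# Below the INFO threshold, so this line never reaches the file.
logging.debug('this is filtered out')
```

After running this, log.txt contains the INFO line but not the DEBUG one, which is exactly how the spider's output ends up in the file.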
If you want to log something manually, avoid the old scrapy.log module, which is deprecated; just use the standard Python logging module:
import logging
logging.error("Some error")
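For anything beyond one-off calls, the usual stdlib pattern is a module-level logger named after the module, so log lines show where they came from. A minimal sketch (the scrape_item function and URLs are invented for illustration):

```python
import logging

# Module-level logger; its name shows up in the log output,
# so you can tell which module emitted each line.
logger = logging.getLogger(__name__)

def scrape_item(url):
    # Reject anything that is not an http(s) URL.
    if not url.startswith('http'):
        logger.error('Bad URL: %s', url)
        return None
    logger.info('Fetching %s', url)
    return url

scrape_item('ftp://example.com')  # logged as an error, returns None
```

Inside a spider callback you can also use Scrapy's built-in per-spider self.logger, which is the same stdlib machinery under a spider-specific name.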
I was unable to make @Rafael Almeda's solution work until I added the following to the import section of my spider.py code:
from scrapy.utils.log import configure_logging
It seems that you're never calling your parse_page method. Try commenting out your parse method and you'll get a NotImplementedError, because Scrapy starts the spider with parse as the default callback, and the base class's parse refuses to do anything. If you call parse_page from your parse method, it should work:
def parse(self, response):
    self.logger.info('Parse function called on %s', response.url)
    self.parse_page(response)
Hope it helps you.
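That NotImplementedError can be reproduced without Scrapy. In this sketch the Spider class is a plain-Python stand-in for scrapy.Spider (whose parse raises the same error when not overridden); the spider names and response string are invented:

```python
# Stand-in mimicking scrapy.Spider: the default parse callback
# raises NotImplementedError if a subclass doesn't override it.
class Spider:
    def parse(self, response):
        raise NotImplementedError(
            f'{self.__class__.__name__}.parse callback is not defined')

class BrokenSpider(Spider):
    # parse not overridden: calling it raises NotImplementedError.
    pass

class WorkingSpider(Spider):
    def parse(self, response):
        # Delegate to the custom callback, as suggested above.
        return self.parse_page(response)

    def parse_page(self, response):
        return f'parsed {response}'

try:
    BrokenSpider().parse('http://example.com')
except NotImplementedError as e:
    print(e)

print(WorkingSpider().parse('http://example.com'))
```

This is why commenting out parse makes the error appear: the crawl still dispatches responses to parse, and only the subclass's override prevents the base class from raising.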