Scrapy - logging to file and stdout simultaneously, with spider names

Asked by 悲&欢浪女 on 2021-01-30 18:28

I've decided to use the Python logging module because the messages generated by Twisted on std error are too long, and I want meaningful INFO-level messages, such as those produced by the spiders, to be written to a separate log file while keeping the on-screen output.
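
For reference, a minimal sketch of that idea with the standard logging module alone (the logger name and file name below are illustrative, not taken from the post): attach both a FileHandler and a StreamHandler to one logger so every record reaches the log file and stdout.

    import logging
    import sys

    # Illustrative logger; in a spider this would typically be named per spider.
    logger = logging.getLogger('myspider')
    logger.setLevel(logging.INFO)

    # INFO and above go to the log file...
    logger.addHandler(logging.FileHandler('spider.log'))
    # ...and the same records stay visible on stdout.
    logger.addHandler(logging.StreamHandler(sys.stdout))

    logger.info('written to spider.log and echoed to stdout')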

7 Answers
  •  别那么骄傲
    2021-01-30 18:47

    I know this is old, but it was a really helpful post, since ScrapyFileLogObserver still isn't properly documented in the Scrapy docs. Also, you can skip importing logging and use Scrapy's log module directly. Thanks all!

    # Pre-1.0 Scrapy API: scrapy.log wraps Twisted's logging machinery.
    from scrapy import log

    # Append every record at DEBUG level and above to testlog.log.
    logfile = open('testlog.log', 'a')
    log_observer = log.ScrapyFileLogObserver(logfile, level=log.DEBUG)
    log_observer.start()
    
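    To get the file-and-stdout behaviour from the question title with this same pre-1.0 API, a second observer can be pointed at sys.stdout; ScrapyFileLogObserver only needs a file-like object, so this is a sketch of the idea rather than something taken from the docs:

    import sys
    from scrapy import log

    # One observer appends DEBUG and above to the log file...
    log.ScrapyFileLogObserver(open('testlog.log', 'a'), level=log.DEBUG).start()
    # ...and a second keeps INFO and above visible on stdout.
    log.ScrapyFileLogObserver(sys.stdout, level=log.INFO).start()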
