Scrapy - logging to file and stdout simultaneously, with spider names

Asked by 悲&欢浪女 · 2021-01-30 18:28

I've decided to use the Python logging module because the messages generated by Twisted on std error are too long, and I want INFO-level meaningful messages such …

7 Answers
  •  栀梦 (OP) · 2021-01-30 19:09

    You want to use the ScrapyFileLogObserver.

    import logging
    from scrapy.log import ScrapyFileLogObserver

    # Open a log file and attach a Scrapy log observer to it:
    # everything logged by Scrapy/Twisted at DEBUG and above goes into testlog.log.
    logfile = open('testlog.log', 'w')
    log_observer = ScrapyFileLogObserver(logfile, level=logging.DEBUG)
    log_observer.start()
    

    I'm glad you asked this question, I've been wanting to do this myself.
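
    For the "file and stdout simultaneously" part of the question, here is a minimal sketch under the same assumption as above (the old pre-1.0 scrapy.log API, where ScrapyFileLogObserver wraps any file-like object): start a second observer on sys.stdout next to the file observer. The level values are only illustrative.

    import sys
    import logging
    from scrapy.log import ScrapyFileLogObserver

    # Observer 1: write everything (DEBUG and above) to the log file.
    logfile = open('testlog.log', 'w')
    ScrapyFileLogObserver(logfile, level=logging.DEBUG).start()

    # Observer 2: echo only INFO and above to the console.
    ScrapyFileLogObserver(sys.stdout, level=logging.INFO).start()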
