logging with filters

Front-end · unresolved · 4 answers · 1816 views
难免孤独 asked 2020-12-01 05:31

I'm using the logging module (import logging) to log messages.

Within a single module, I am logging messages at the debug level: my_logger.debug('msg')

4 Answers
  • 2020-12-01 05:39

    Just implement a subclass of logging.Filter: http://docs.python.org/library/logging.html#filter-objects. It will have one method, filter(record), that examines the log record and returns True to log it or False to discard it. Then you can install the filter on either a Logger or a Handler by calling its addFilter(filter) method.

    Example:

    class NoParsingFilter(logging.Filter):
        def filter(self, record):
            return not record.getMessage().startswith('parsing')
    
    logger.addFilter(NoParsingFilter())
    

    Or something like that, anyway.
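    A minimal, self-contained sketch of the handler variant described above (the logger name and messages here are made up for illustration):

```python
import io
import logging

class NoParsingFilter(logging.Filter):
    def filter(self, record):
        # Return False to drop the record, True to keep it
        return not record.getMessage().startswith("parsing")

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.addFilter(NoParsingFilter())  # installed on the handler instead of the logger

log = logging.getLogger("nofilter.demo")
log.setLevel(logging.DEBUG)
log.addHandler(handler)

log.debug("parsing started")  # dropped by the filter
log.debug("work finished")    # kept

print(stream.getvalue())  # -> work finished
```

    Installing the filter on a handler only affects records passing through that handler; installing it on the logger affects every handler attached to it.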

  • 2020-12-01 05:47

    I found a simpler way using functions in your main script:

    # remove 2to3 (lib2to3) messages
    def filter_grammar_messages(record):
        if record.funcName == 'load_grammar':
            return False
        return True
    
    def filter_import_messages(record):
        if record.funcName == 'init' and record.msg.startswith('Importing '):
            return False
        return True
    
    logging.getLogger().addFilter(filter_grammar_messages)  # root
    logging.getLogger('PIL.Image').addFilter(filter_import_messages)
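
    This works because, since Python 3.2, addFilter accepts any callable that takes a record and returns a truthy/falsy value. A self-contained sketch with a made-up logger name:

```python
import io
import logging

def drop_noise(record):
    # Plain function used as a filter (Python 3.2+): a falsy return drops the record
    return not record.getMessage().startswith("noise")

stream = io.StringIO()
handler = logging.StreamHandler(stream)

log = logging.getLogger("callable.filter.demo")
log.setLevel(logging.DEBUG)
log.addHandler(handler)
log.addFilter(drop_noise)

log.info("noise: ignore me")  # dropped
log.info("signal")            # kept

print(stream.getvalue())  # -> signal
```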
    
  • 2020-12-01 05:57

    I found a slightly easier way to filter the default logging configuration for the following problem: suppressing INFO-level messages while using the sshtunnel module.

    The default output, with the first two undesired records, looked as follows:

    2020-11-10 21:53:28,114  INFO       paramiko.transport: Connected (version 2.0, client OpenSSH_7.9p1)
    2020-11-10 21:53:28,307  INFO       paramiko.transport: Authentication (password) successful!
    2020-11-10 21:53:28,441  INFO       |-->QuerySSH: Query execution successful.
    

    Logger configuration update:

    logging.basicConfig(
        level=logging.INFO,
        format='%(asctime)s  %(levelname)-10s %(name)s: %(message)s',
        handlers=[
            logging.StreamHandler(),
            logging.FileHandler(self.logging_handler)
        ]
    )

    # Filter paramiko.transport debug and info out of the basic logging configuration
    logger_descope = logging.getLogger('paramiko.transport')
    logger_descope.setLevel(logging.WARN)
    

    And the result I am happy with looks like this:

    2020-11-10 22:00:48,755  INFO       |-->QuerySSH: Query execution successful.
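
    The same pattern in isolation (the logger names below are stand-ins, not the real paramiko ones; force=True requires Python 3.8+):

```python
import io
import logging

stream = io.StringIO()
# force=True (Python 3.8+) resets any prior root configuration
logging.basicConfig(level=logging.INFO, stream=stream,
                    format="%(levelname)-10s %(name)s: %(message)s",
                    force=True)

noisy = logging.getLogger("thirdparty.transport")
noisy.setLevel(logging.WARNING)  # this logger now ignores INFO and below

noisy.info("Connected")                             # suppressed
logging.getLogger("app").info("Query successful.")  # still shown

print(stream.getvalue())
```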
    
  • 2020-12-01 05:58

    Do not use global. It's an accident waiting to happen.

    You can give your loggers any "."-separated names that are meaningful to you.

    You can control them as a hierarchy. If you have loggers named a.b.c and a.b.d, you can check the logging level for a.b and alter both loggers.

    You can have any number of loggers -- they're inexpensive.

    The most common design pattern is one logger per module. See Naming Python loggers

    Do this.

    import sys
    import logging

    logger = logging.getLogger("module_name")
    logger_a = logging.getLogger("module_name.function_a")
    logger_b = logging.getLogger("module_name.function_b")
    
    def function_a( ... ):
        logger_a.debug( "a message" )
    
    def function_b( ... ):
        logger_b.debug( "another message" )
    
    if __name__ == "__main__":
        logging.basicConfig( stream=sys.stderr, level=logging.DEBUG )
        logger_a.setLevel( logging.DEBUG )
        logger_b.setLevel( logging.WARN )
    
        ... etc ...
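
    A small runnable check of the hierarchy claim above, that altering a.b affects both a.b.c and a.b.d (the names are illustrative):

```python
import logging

parent = logging.getLogger("a.b")
child_c = logging.getLogger("a.b.c")
child_d = logging.getLogger("a.b.d")

parent.setLevel(logging.WARNING)

# Children with no explicit level inherit their effective level from a.b
print(child_c.getEffectiveLevel() == logging.WARNING)  # True
print(child_d.isEnabledFor(logging.INFO))              # False
```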
    