Python, want logging with log rotation and compression

暖寄归人 2020-11-28 01:43

Can anyone suggest a way in python to do logging with:

  • log rotation every day
  • compression of logs when they're rotated
  • optional: delete old logs
9 Answers
  • 2020-11-28 02:23
    • log rotation every day: Use a TimedRotatingFileHandler
    • compression of logs: Set the encoding='bz2' parameter. (Note this "trick" will only work for Python2. 'bz2' is no longer considered an encoding in Python3.)
    • optional - delete oldest log file to preserve X MB of free space. You could (indirectly) arrange this using a RotatingFileHandler. By setting the maxBytes parameter, the log file will roll over when it reaches a certain size. By setting the backupCount parameter, you can control how many rollovers are kept. The two parameters together allow you to control the maximum space consumed by the log files. You could probably subclass TimedRotatingFileHandler to incorporate this behavior into it as well.
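
    Since the 'bz2' encoding trick above is Python 2-only, here is a minimal Python 3 sketch using the handler's rotator/namer hooks instead. The filename "daily.log" and the fast when='s', interval=1 schedule are illustrative choices (for a real daily rotation you would use when='midnight'):

    ```python
    import bz2
    import logging
    import logging.handlers
    import os


    def bz2_namer(name):
        # called with the default rotated name; append the compression suffix
        return name + ".bz2"


    def bz2_rotator(source, dest):
        # compress the just-rotated file and remove the uncompressed original
        with open(source, "rb") as f_in:
            data = f_in.read()
        with open(dest, "wb") as f_out:
            f_out.write(bz2.compress(data, 9))
        os.remove(source)


    handler = logging.handlers.TimedRotatingFileHandler(
        "daily.log", when="s", interval=1, backupCount=7)
    handler.namer = bz2_namer      # daily.log.<timestamp> -> daily.log.<timestamp>.bz2
    handler.rotator = bz2_rotator  # compress on each rollover

    logger = logging.getLogger("bz2demo")
    logger.setLevel(logging.DEBUG)
    logger.addHandler(handler)
    logger.debug("first line")
    ```

    The rotator/namer attributes come from BaseRotatingHandler, so the same pair works on RotatingFileHandler too.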

    Just for fun, here is how you could subclass TimedRotatingFileHandler. When you run the script below, it will write log files to /tmp/log_rotate*.

    With a small value for time.sleep (such as 0.1), the log files fill up quickly, reach the maxBytes limit, and are then rolled over.

    With a large time.sleep (such as 1.0), the log files fill up slowly, the maxBytes limit is not reached, but they roll over anyway when the timed interval (of 10 seconds) is reached.

    All the code below comes from logging/handlers.py. I simply meshed TimedRotatingFileHandler with RotatingFileHandler in the most straightforward way possible.

    import time
    import re
    import os
    import stat
    import logging
    import logging.handlers as handlers
    
    
    class SizedTimedRotatingFileHandler(handlers.TimedRotatingFileHandler):
        """
        Handler for logging to a set of files, which switches from one file
        to the next when the current file reaches a certain size, or at certain
        timed intervals
        """
    
        def __init__(self, filename, maxBytes=0, backupCount=0, encoding=None,
                     delay=0, when='h', interval=1, utc=False):
            handlers.TimedRotatingFileHandler.__init__(
                self, filename, when, interval, backupCount, encoding, delay, utc)
            self.maxBytes = maxBytes
    
        def shouldRollover(self, record):
            """
            Determine if rollover should occur.
    
            Basically, see if the supplied record would cause the file to exceed
            the size limit we have.
            """
            if self.stream is None:                 # delay was set...
                self.stream = self._open()
            if self.maxBytes > 0:                   # are we rolling over?
                msg = "%s\n" % self.format(record)
                # due to non-posix-compliant Windows feature
                self.stream.seek(0, 2)
                if self.stream.tell() + len(msg) >= self.maxBytes:
                    return 1
            t = int(time.time())
            if t >= self.rolloverAt:
                return 1
            return 0
    
    
    def demo_SizedTimedRotatingFileHandler():
        log_filename = '/tmp/log_rotate'
        logger = logging.getLogger('MyLogger')
        logger.setLevel(logging.DEBUG)
        handler = SizedTimedRotatingFileHandler(
            log_filename, maxBytes=100, backupCount=5,
            when='s', interval=10,
            # encoding='bz2',  # uncomment for bz2 compression
        )
        logger.addHandler(handler)
        for i in range(10000):
            time.sleep(0.1)
            logger.debug('i=%d' % i)
    
    demo_SizedTimedRotatingFileHandler()
    
  • 2020-11-28 02:24

    Here is my solution (modified from evgenek); it is simple and does not block Python code while gzipping huge log files:

    import os
    import subprocess


    class GZipRotator:
        def __call__(self, source, dest):
            os.rename(source, dest)
            subprocess.Popen(['gzip', dest])  # compress in the background
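
    A sketch of wiring such a rotator into a TimedRotatingFileHandler (the filename "app.log" and the 1-second interval are illustrative). This version compresses in-process with Python's gzip module so it is self-contained; the subprocess version above instead avoids blocking on huge files. The callable is handed the same (source, dest) pair on every rollover:

    ```python
    import gzip
    import logging
    import logging.handlers
    import os
    import shutil


    def gzip_rotator(source, dest):
        # same (source, dest) contract as GZipRotator.__call__, but compressing
        # in-process with the gzip module instead of spawning gzip(1)
        with open(source, "rb") as f_in, gzip.open(dest + ".gz", "wb") as f_out:
            shutil.copyfileobj(f_in, f_out)
        os.remove(source)


    handler = logging.handlers.TimedRotatingFileHandler(
        "app.log", when="s", interval=1, backupCount=3)
    handler.rotator = gzip_rotator  # a GZipRotator() instance works the same way

    logger = logging.getLogger("gzdemo")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)
    logger.info("hello")
    ```

    Note that on older Python versions the backupCount pruning only matched uncompressed rotated names; newer versions also accept an extra extension such as .gz, so check your version if old archives pile up.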
    
  • 2020-11-28 02:30

    In addition to unutbu's answer: here's how to modify the TimedRotatingFileHandler to compress using zip files.

    import logging
    import logging.handlers
    import zipfile
    import codecs
    import sys
    import os
    import time
    import glob
    
    
    class TimedCompressedRotatingFileHandler(logging.handlers.TimedRotatingFileHandler):
        """
        Extended version of TimedRotatingFileHandler that compresses logs on rollover.
        """
        def doRollover(self):
            """
            do a rollover; in this case, a date/time stamp is appended to the filename
            when the rollover happens.  However, you want the file to be named for the
            start of the interval, not the current time.  If there is a backup count,
            then we have to get a list of matching filenames, sort them and remove
            the one with the oldest suffix.
            """
    
            self.stream.close()
            # get the time that this sequence started at and make it a TimeTuple
            t = self.rolloverAt - self.interval
            timeTuple = time.localtime(t)
            dfn = self.baseFilename + "." + time.strftime(self.suffix, timeTuple)
            if os.path.exists(dfn):
                os.remove(dfn)
            os.rename(self.baseFilename, dfn)
            if self.backupCount > 0:
                # find the oldest log file and delete it
                s = glob.glob(self.baseFilename + ".20*")
                if len(s) > self.backupCount:
                    s.sort()
                    os.remove(s[0])
            #print "%s -> %s" % (self.baseFilename, dfn)
            if self.encoding:
                self.stream = codecs.open(self.baseFilename, 'w', self.encoding)
            else:
                self.stream = open(self.baseFilename, 'w')
            self.rolloverAt = self.rolloverAt + self.interval
            if os.path.exists(dfn + ".zip"):
                os.remove(dfn + ".zip")
            zf = zipfile.ZipFile(dfn + ".zip", "w")
            zf.write(dfn, os.path.basename(dfn), zipfile.ZIP_DEFLATED)
            zf.close()
            os.remove(dfn)
    
    if __name__=='__main__':
        ## Demo of using TimedCompressedRotatingFileHandler() to log every 5 seconds,
        ##     to one uncompressed file and five rotated and compressed files
    
        os.nice(19)   # I always nice test code
    
        logHandler = TimedCompressedRotatingFileHandler("mylog", when="S",
            interval=5, backupCount=5) # Total of six rotated log files, rotating every 5 secs
        logFormatter = logging.Formatter(
            fmt='%(asctime)s.%(msecs)03d %(message)s', 
            datefmt='%Y-%m-%d %H:%M:%S'
            )
        logHandler.setFormatter(logFormatter)
        mylogger = logging.getLogger('MyLogRef')
        mylogger.addHandler(logHandler)
        mylogger.setLevel(logging.DEBUG)
    
        # Write lines non-stop into the logger and rotate every 5 seconds
        ii = 0
        while True:
            mylogger.debug("Test {0}".format(ii))
            ii += 1
    
  • 2020-11-28 02:32

    Be warned: the class signatures have changed in Python 3. Here is my working example for Python 3.6:

    import gzip
    import logging.handlers
    import os


    def namer(name):
        return name + ".gz"


    def rotator(source, dest):
        print(f'compressing {source} -> {dest}')
        # zlib.compress would emit a raw zlib stream that gzip(1) cannot read,
        # so use gzip.compress to produce a real .gz file
        with open(source, "rb") as sf:
            data = sf.read()
        with open(dest, "wb") as df:
            df.write(gzip.compress(data, 9))
        os.remove(source)
    
    
    err_handler = logging.handlers.TimedRotatingFileHandler('/data/errors.log', when="M", interval=1,
                                                            encoding='utf-8', backupCount=30, utc=True)
    err_handler.rotator = rotator
    err_handler.namer = namer
    
    logger = logging.getLogger("Rotating Log")
    logger.setLevel(logging.ERROR)
    
    logger.addHandler(err_handler)
    
  • 2020-11-28 02:36

    I guess it's too late to join the party, but here is what I did. I created a new class inheriting from logging.handlers.RotatingFileHandler and added a couple of lines to gzip the file during rollover.

    https://github.com/rkreddy46/python_code_reference/blob/master/compressed_log_rotator.py

    #!/usr/bin/env python
    
    # Import all the needed modules
    import logging.handlers
    import sys
    import time
    import gzip
    import os
    import shutil
    import random
    import string
    
    __version__ = 1.0
    __descr__ = "This logic is written keeping in mind UNIX/LINUX/OSX platforms only"
    
    
    # Create a new class that inherits from RotatingFileHandler. This is where we add the new feature to compress the logs
    class CompressedRotatingFileHandler(logging.handlers.RotatingFileHandler):
        def doRollover(self):
            """
            Do a rollover, as described in __init__().
            """
            if self.stream:
                self.stream.close()
            if self.backupCount > 0:
                for i in range(self.backupCount - 1, 0, -1):
                    sfn = "%s.%d.gz" % (self.baseFilename, i)
                    dfn = "%s.%d.gz" % (self.baseFilename, i + 1)
                    if os.path.exists(sfn):
                        # print "%s -> %s" % (sfn, dfn)
                        if os.path.exists(dfn):
                            os.remove(dfn)
                        os.rename(sfn, dfn)
                dfn = self.baseFilename + ".1.gz"
                if os.path.exists(dfn):
                    os.remove(dfn)
                # These two lines below are the only new lines. I commented out the os.rename(self.baseFilename, dfn) and
                #  replaced it with these two lines.
                with open(self.baseFilename, 'rb') as f_in, gzip.open(dfn, 'wb') as f_out:
                    shutil.copyfileobj(f_in, f_out)
                # os.rename(self.baseFilename, dfn)
                # print "%s -> %s" % (self.baseFilename, dfn)
            self.mode = 'w'
            self.stream = self._open()
    
    # Specify which file will be used for our logs
    log_filename = "/Users/myname/Downloads/test_logs/sample_log.txt"
    
    # Create a logger instance and set the facility level
    my_logger = logging.getLogger()
    my_logger.setLevel(logging.DEBUG)
    
    # Create a handler using our new class that rotates and compresses
    file_handler = CompressedRotatingFileHandler(filename=log_filename, maxBytes=1000000, backupCount=10)
    
    # Create a stream handler that shows the same log on the terminal (just for debug purposes)
    view_handler = logging.StreamHandler(stream=sys.stdout)
    
    # Add all the handlers to the logging instance
    my_logger.addHandler(file_handler)
    my_logger.addHandler(view_handler)
    
    # This is optional to beef up the logs
    random_huge_data = "".join(random.choice(string.ascii_letters) for _ in range(10000))
    
    # All this code is user-specific, write your own code if you want to play around
    count = 0
    while True:
        my_logger.debug("This is the message number %s" % str(count))
        my_logger.debug(random_huge_data)
        count += 1
        if count % 100 == 0:
            count = 0
            time.sleep(2)
    
  • 2020-11-28 02:37

    Copy the file, gzip the copy (named with the epoch time), and then clear out the existing file in a way that won't upset the logging module:

    import gzip
    import logging
    import os
    from shutil import copy2
    from time import time
    
    def logRoll(logfile_name):
        log_backup_name = logfile_name + '.' + str(int(time()))
        try:
            copy2(logfile_name, log_backup_name)
        except IOError:
            logging.debug('No logfile to roll')
            return
        with open(log_backup_name, 'rb') as f_in:
            with gzip.open(log_backup_name + '.gz', 'wb') as f_out:
                f_out.writelines(f_in)
        os.remove(log_backup_name)
        # truncate the live log file without replacing it
        open(logfile_name, 'w').close()
    