Constantly print Subprocess output while process is running

庸人自扰 2020-11-22 06:51

To launch programs from my Python scripts, I'm using the following method:

def execute(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)


        
13 Answers
  • 2020-11-22 07:23

    For anyone trying the answers to this question to get the stdout from a Python script, note that Python buffers its stdout, so it may take a while to see the output.

    This can be rectified by adding the following after each stdout write in the target script:

    sys.stdout.flush()
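
    As an illustration, a minimal child script with the flush in place (the file name and loop are made up for this example):

    # child.py -- hypothetical target script
    import sys
    import time

    for i in range(5):
        sys.stdout.write("tick %d\n" % i)
        sys.stdout.flush()  # without this, lines may sit in the buffer until exit
        time.sleep(1)

    Alternatively, run the child with python -u, or set PYTHONUNBUFFERED=1 in its environment, to disable its stdout buffering without editing the script.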
    
  • 2020-11-22 07:24

    @tokland

    I tried your code and corrected it for Python 3.4 and Windows. dir.cmd is a simple dir command, saved as a cmd file.

    import subprocess
    c = "dir.cmd"
    
    def execute(command):
        popen = subprocess.Popen(command, stdout=subprocess.PIPE, bufsize=1)
        lines_iterator = iter(popen.stdout.readline, b"")
        while popen.poll() is None:
            for line in lines_iterator:
                nline = line.rstrip()
                print(nline.decode("latin"), end="\r\n", flush=True)  # yield line
    
    execute(c)
    
  • 2020-11-22 07:34

    To print a subprocess's output line by line as soon as its stdout buffer is flushed, in Python 3:

    from subprocess import Popen, PIPE, CalledProcessError
    
    with Popen(cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
        for line in p.stdout:
            print(line, end='') # process line here
    
    if p.returncode != 0:
        raise CalledProcessError(p.returncode, p.args)
    

    Note: you do not need p.poll() -- the loop ends when EOF is reached. And you do not need iter(p.stdout.readline, '') -- the read-ahead bug is fixed in Python 3.

    See also: Python: read streaming input from subprocess.communicate().
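
    As a variation on the same pattern (a sketch, not part of the original answer): a helper that echoes the output live and also returns it to the caller. text=True is the Python 3.7+ spelling of universal_newlines=True, and the function name is made up:

    from subprocess import Popen, PIPE, CalledProcessError

    def run_and_stream(cmd):
        """Echo a command's stdout line by line and return the collected text."""
        collected = []
        with Popen(cmd, stdout=PIPE, bufsize=1, text=True) as p:
            for line in p.stdout:
                print(line, end='')     # echo live
                collected.append(line)  # keep a copy for the caller
        if p.returncode != 0:
            raise CalledProcessError(p.returncode, p.args)
        return ''.join(collected)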

  • 2020-11-22 07:37

    This PoC constantly reads the output from a process so that it can be accessed when needed. Only the last result is kept; all other output is discarded, which prevents the PIPE buffer from filling up and blocking the child:

    import subprocess
    import time
    import threading
    import queue  # 'Queue' on Python 2
    
    
    class FlushPipe(object):
        def __init__(self):
            self.command = ['python', './print_date.py']
            self.process = None
            self.process_output = queue.LifoQueue(0)
            self.capture_output = threading.Thread(target=self.output_reader)
    
        def output_reader(self):
            for line in iter(self.process.stdout.readline, b''):
                self.process_output.put_nowait(line)
    
        def start_process(self):
            self.process = subprocess.Popen(self.command,
                                            stdout=subprocess.PIPE)
            self.capture_output.start()
    
        def get_output_for_processing(self):
            line = self.process_output.get()
>>" + line">
            print(">>>" + line.decode().rstrip())
    
    
    if __name__ == "__main__":
        flush_pipe = FlushPipe()
        flush_pipe.start_process()
    
        now = time.time()
        while time.time() - now < 10:
            flush_pipe.get_output_for_processing()
            time.sleep(2.5)
    
        flush_pipe.capture_output.join(timeout=0.001)
        flush_pipe.process.kill()
    

    print_date.py

    #!/usr/bin/env python
    import time
    
    if __name__ == "__main__":
        while True:
            print(time.time(), flush=True)  # flush so the parent sees each line promptly
            time.sleep(0.01)
    

    Output: you can clearly see that there is output only at the ~2.5 s interval, and nothing in between.

    >>>1520535158.51
    >>>1520535161.01
    >>>1520535163.51
    >>>1520535166.01
    
  • 2020-11-22 07:37

    None of the answers here addressed all of my needs.

    1. No threads for stdout (no Queues, etc, either)
    2. Non-blocking as I need to check for other things going on
    3. Use PIPE as I needed to do multiple things, e.g. stream output, write to a log file and return a string copy of the output.

    A little background: I am using a ThreadPoolExecutor to manage a pool of threads, each launching a subprocess and running them concurrently (in Python 2.7, but this should work in newer 3.x as well). I don't want to use threads just for output gathering, as I want as many available as possible for other things (a pool of 20 processes would be using 40 threads just to run: one for the process and one for stdout... and more if you want stderr, I guess).

    I'm stripping out a lot of exception handling and such here, so this is based on code that works in production. Hopefully I didn't ruin it in the copy and paste. Also, feedback is very much welcome!

    import os
    import sys
    import time
    import fcntl
    import subprocess
    
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    
    # Make stdout non-blocking when using read/readline
    proc_stdout = proc.stdout
    fl = fcntl.fcntl(proc_stdout, fcntl.F_GETFL)
    fcntl.fcntl(proc_stdout, fcntl.F_SETFL, fl | os.O_NONBLOCK)
    
    def handle_stdout(proc_stream, my_buffer, echo_streams=True, log_file=None):
        """A little inline function to handle the stdout business. """
        # fcntl makes readline non-blocking so it raises an IOError when empty
        try:
            for s in iter(proc_stream.readline, ''):   # replace '' with b'' for Python 3
                my_buffer.append(s)
    
                if echo_streams:
                    sys.stdout.write(s)
    
                if log_file:
                    log_file.write(s)
        except IOError:
            pass
    
    # The main loop while subprocess is running
    stdout_parts = []
    while proc.poll() is None:
        handle_stdout(proc_stdout, stdout_parts)
    
        # ...Check for other things here...
        # For example, check a multiprocessing.Value('b') to decide whether to proc.kill()
    
        time.sleep(0.01)
    
    # Not sure if this is needed, but run it again just to be sure we got it all?
    handle_stdout(proc_stdout, stdout_parts)
    
    stdout_str = "".join(stdout_parts)  # Just to demo
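
    As a side note (an alternative, not part of the original answer): on Python 3.5+ and POSIX, the fcntl calls above can be replaced with os.set_blocking():

    import os

    # same effect as the F_GETFL/F_SETFL dance above
    os.set_blocking(proc.stdout.fileno(), False)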
    

    I'm sure there is overhead being added here, but it is not a concern in my case. Functionally it does what I need. The only thing I haven't solved is why this works perfectly for log messages but some print messages show up later, all at once (most likely the child process block-buffers its stdout when it is piped; see the sys.stdout.flush() note above).

  • 2020-11-22 07:38

    OK, I managed to solve it without threads (any suggestions as to why using threads would be better are appreciated) by using a snippet from this question: Intercepting stdout of a subprocess while it is running

    import subprocess
    import sys

    def execute(command):
        process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    
        # Poll process for new output until finished
        while True:
            nextline = process.stdout.readline()
            if nextline == '' and process.poll() is not None:  # on Python 3, compare with b'' instead
                break
            sys.stdout.write(nextline)
            sys.stdout.flush()
    
        output = process.communicate()[0]
        exitCode = process.returncode
    
        if (exitCode == 0):
            return output
        else:
            raise ProcessException(command, exitCode, output)  # ProcessException: a custom exception class (not defined here)
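
    A quick usage sketch (the command line is only an example; adjust for your platform):

    if __name__ == '__main__':
        out = execute("ping -c 4 127.0.0.1")  # echoes live, then returns all output
        print("captured %d bytes" % len(out))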
    