A non-blocking read on a subprocess.PIPE in Python

醉酒成梦 2020-11-21 04:49

I'm using the subprocess module to start a subprocess and connect to its output stream (standard output). I want to be able to execute non-blocking reads on its standard output.

29 Answers
  • 2020-11-21 04:56

    You can do this really easily in Twisted. Depending upon your existing code base, this might not be that easy to use, but if you are building a twisted application, then things like this become almost trivial. You create a ProcessProtocol class, and override the outReceived() method. Twisted (depending upon the reactor used) is usually just a big select() loop with callbacks installed to handle data from different file descriptors (often network sockets). So the outReceived() method is simply installing a callback for handling data coming from STDOUT. A simple example demonstrating this behavior is as follows:

    from twisted.internet import protocol, reactor
    
    class MyProcessProtocol(protocol.ProcessProtocol):
    
        def outReceived(self, data):
            print(data)
    
    proc = MyProcessProtocol()
    reactor.spawnProcess(proc, './myprogram', ['./myprogram', 'arg1', 'arg2', 'arg3'])
    reactor.run()
    

    The Twisted documentation has some good information on this.

    If you build your entire application around Twisted, it makes asynchronous communication with other processes, local or remote, really elegant like this. On the other hand, if your program isn't built on top of Twisted, this isn't really going to be that helpful. Hopefully this can be helpful to other readers, even if it isn't applicable for your particular application.

  • 2020-11-21 04:56

This version of non-blocking read doesn't require special modules and works out of the box on the majority of Linux distros.

    import os
    import sys
    import time
    import fcntl
    import subprocess

    def async_read(fd):
        # set the non-blocking flag while preserving the old flags
        fl = fcntl.fcntl(fd, fcntl.F_GETFL)
        fcntl.fcntl(fd, fcntl.F_SETFL, fl | os.O_NONBLOCK)
        # read a byte at a time until EOF
        while True:
            try:
                ch = os.read(fd.fileno(), 1)
                if not ch:  # EOF
                    break
                sys.stdout.write(ch.decode(errors='replace'))
            except BlockingIOError:
                # no data available yet; don't spin at full speed
                time.sleep(0.01)

    # the parameter is named `nonblocking` because `async` became
    # a reserved word in Python 3.7
    def shell(args, nonblocking=True):
        # merge stderr into stdout
        proc = subprocess.Popen(args, shell=False,
                                stdout=subprocess.PIPE,
                                stderr=subprocess.STDOUT)
        if nonblocking:
            async_read(proc.stdout)
        sout, serr = proc.communicate()
        return (sout, serr)

    if __name__ == '__main__':
        cmd = 'ping 8.8.8.8'
        sout, serr = shell(cmd.split())
    
  • 2020-11-21 04:56

My problem is a bit different, as I wanted to collect both stdout and stderr from a running process, but ultimately the same, since I wanted to render the output in a widget as it's generated.

    I did not want to resort to many of the proposed workarounds using Queues or additional Threads as they should not be necessary to perform such a common task as running another script and collecting its output.

After reading the proposed solutions and the Python docs, I resolved my issue with the implementation below. Yes, it only works on POSIX, as I'm using the select function call.

    I agree that the docs are confusing and the implementation is awkward for such a common scripting task. I believe that older versions of python have different defaults for Popen and different explanations so that created a lot of confusion. This seems to work well for both Python 2.7.12 and 3.5.2.

The key was to set bufsize=1 for line buffering, together with universal_newlines=True to process the stream as text rather than binary (line buffering only takes effect in text mode).

    class workerThread(QThread):
       def __init__(self, cmd):
          QThread.__init__(self)
          self.cmd = cmd
          self.result = None           ## return code
          self.error = None            ## flag indicates an error
          self.errorstr = ""           ## info message about the error
    
       def __del__(self):
          self.wait()
          DEBUG("Thread removed")
    
       def run(self):
          cmd_list = self.cmd.split(" ")   
          try:
             cmd = subprocess.Popen(cmd_list, bufsize=1, stdin=None
                                            , universal_newlines=True
                                            , stderr=subprocess.PIPE
                                            , stdout=subprocess.PIPE)
          except OSError:
             self.error = 1
             self.errorstr = "Failed to execute " + self.cmd
             ERROR(self.errorstr)
             return                       ## cmd is unbound here, so bail out
          VERBOSE("task started...")
          import select
          while True:
             try:
                r,w,x = select.select([cmd.stdout, cmd.stderr],[],[])
                if cmd.stderr in r:
                   line = cmd.stderr.readline()
                   if line != "":
                      line = line.strip()
                      self.emit(SIGNAL("update_error(QString)"), line)
                if cmd.stdout in r:
                   line = cmd.stdout.readline()
                   if line == "":
                      break
                   line = line.strip()
                   self.emit(SIGNAL("update_output(QString)"), line)
             except IOError:
                pass
          cmd.wait()
          self.result = cmd.returncode
          if self.result < 0:
             self.error = 1
             self.errorstr = "Task terminated by signal " + str(self.result)
             ERROR(self.errorstr)
             return
          if self.result:
             self.error = 1
             self.errorstr = "exit code " + str(self.result)
             ERROR(self.errorstr)
             return
          return
    

    ERROR, DEBUG and VERBOSE are simply macros that print output to the terminal.

    This solution is IMHO 99.99% effective as it still uses the blocking readline function, so we assume the sub process is nice and outputs complete lines.
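    To relax the complete-lines assumption, one option is to pair select() readiness with raw os.read() chunks instead of readline(). A minimal POSIX-only sketch (the read_chunks helper is illustrative, not part of the solution above; it leaves line reassembly to the caller):

    ```python
    import os
    import select
    import subprocess

    def read_chunks(proc, chunk_size=4096):
        """Yield raw chunks from proc.stdout as they arrive.

        select() waits until the pipe is readable, and os.read() returns
        whatever is available (up to chunk_size), so a child that emits
        partial lines can never stall the reader.
        """
        fd = proc.stdout.fileno()
        while True:
            readable, _, _ = select.select([fd], [], [])
            if fd in readable:
                chunk = os.read(fd, chunk_size)
                if not chunk:  # empty read means the child closed its stdout
                    break
                yield chunk

    proc = subprocess.Popen(
        ["python3", "-c", "print('hello'); print('world')"],
        stdout=subprocess.PIPE,
    )
    data = b"".join(read_chunks(proc))
    proc.wait()
    ```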

    I welcome feedback to improve the solution as I am still new to Python.

  • 2020-11-21 04:57

    I have often had a similar problem; Python programs I write frequently need to have the ability to execute some primary functionality while simultaneously accepting user input from the command line (stdin). Simply putting the user input handling functionality in another thread doesn't solve the problem because readline() blocks and has no timeout. If the primary functionality is complete and there is no longer any need to wait for further user input I typically want my program to exit, but it can't because readline() is still blocking in the other thread waiting for a line. A solution I have found to this problem is to make stdin a non-blocking file using the fcntl module:

    import fcntl
    import os
    import sys
    
    # make stdin a non-blocking file
    fd = sys.stdin.fileno()
    fl = fcntl.fcntl(fd, fcntl.F_GETFL)
    fcntl.fcntl(fd, fcntl.F_SETFL, fl | os.O_NONBLOCK)
    
    # user input handling thread
    while mainThreadIsRunning:
        try:
            line = sys.stdin.readline()
        except IOError:
            # nothing available yet on the non-blocking fd
            continue
        handleInput(line)
    

    In my opinion this is a bit cleaner than using the select or signal modules to solve this problem but then again it only works on UNIX...

  • 2020-11-21 04:57

    On Unix-like systems and Python 3.5+ there's os.set_blocking which does exactly what it says.

    import os
    import time
    import subprocess
    
    cmd = 'python3', '-c', 'import time; [(print(i), time.sleep(1)) for i in range(5)]'
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    os.set_blocking(p.stdout.fileno(), False)
    start = time.time()
    while True:
        # first iteration always produces empty byte string in non-blocking mode
        for i in range(2):    
            line = p.stdout.readline()
            print(i, line)
            time.sleep(0.5)
        if time.time() > start + 5:
            break
    p.terminate()
    

    This outputs:

    0 b''
    1 b'0\n'
    0 b''
    1 b'1\n'
    0 b''
    1 b'2\n'
    0 b''
    1 b'3\n'
    0 b''
    1 b'4\n'
    

    With os.set_blocking commented it's:

    0 b'0\n'
    1 b'1\n'
    0 b'2\n'
    1 b'3\n'
    0 b'4\n'
    1 b''
    
  • 2020-11-21 04:57

    Here is a module that supports non-blocking reads and background writes in python:

    https://pypi.python.org/pypi/python-nonblock

    It provides a function, nonblock_read, which reads data from the stream if available, and otherwise returns an empty string (or None if the stream is closed on the other side and all possible data has been read).

    You may also consider the python-subprocess2 module,

    https://pypi.python.org/pypi/python-subprocess2

which extends the subprocess module: the object returned by subprocess.Popen gains an additional method, runInBackground. This starts a thread and returns an object that is automatically populated as the process writes to stdout/stderr, without blocking your main thread.
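    For comparison, the runInBackground idea can be approximated with just the standard library. A rough sketch, assuming a helper thread that drains the pipe (BackgroundReader is a hypothetical name for this sketch, not the module's actual API):

    ```python
    import subprocess
    import threading

    class BackgroundReader:
        """Collect a pipe's output on a helper thread.

        `self.output` fills up as the child writes, so the main thread
        can inspect it at any time without ever blocking on a read.
        """
        def __init__(self, stream):
            self.output = []  # populated by the reader thread
            self._thread = threading.Thread(target=self._drain, args=(stream,))
            self._thread.daemon = True
            self._thread.start()

        def _drain(self, stream):
            # readline() still blocks, but only on this helper thread
            for line in iter(stream.readline, b""):
                self.output.append(line)
            stream.close()

        def join(self):
            self._thread.join()

    proc = subprocess.Popen(
        ["python3", "-c", "print('one'); print('two')"],
        stdout=subprocess.PIPE,
    )
    reader = BackgroundReader(proc.stdout)
    proc.wait()
    reader.join()
    ```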

    Enjoy!
