I have the following chunk of Python code (running on Python 2.7) that throws MemoryError
exceptions when I work with large (several-GB) files:
What I would probably do instead, if I needed to read stdout for something that large, is send it to a file when creating the process:
from subprocess import Popen

with open(my_large_output_path, 'w') as fo:
    with open(my_large_error_path, 'w') as fe:
        myProcess = Popen(myCmd, shell=True, stdout=fo, stderr=fe)
        myProcess.wait()  # finish before the with-blocks close the files
Edit: If you need to stream, note that Popen generally wants something with a real file descriptor for stdout and stderr, so a plain in-memory file-like object won't work directly. The usual approach is to pass stdout=PIPE and read from the pipe incrementally as the process writes to it.
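Here's a minimal sketch of that streaming approach: a generator that yields the child's stdout one line at a time, so memory use stays bounded no matter how large the total output is. The function name `stream_lines` and the demo command are my own illustration, not anything from the original question.

```python
import sys
from subprocess import Popen, PIPE

def stream_lines(cmd):
    """Yield a child process's stdout one line at a time.

    Only the current line is buffered, instead of accumulating
    several GB of output in a single string.
    """
    proc = Popen(cmd, stdout=PIPE)
    try:
        for line in proc.stdout:  # blocks until each line arrives
            yield line
    finally:
        proc.stdout.close()
        proc.wait()

# Hypothetical usage: count lines without holding them all in memory.
n = sum(1 for _ in stream_lines(
    [sys.executable, '-c', 'for i in range(100000): print(i)']))
```

One caveat: if the child also writes heavily to stderr and you pipe that too, you'd need to drain both streams (e.g. with `communicate()` or threads) to avoid deadlock; here stderr is simply inherited.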