Python memory allocation error using subprocess.Popen

Backend · Unresolved · 4 answers · 1,197 views
清歌不尽, asked 2020-12-06 12:08

I am doing some bioinformatics work. I have a Python script that at one point calls a program to do an expensive process (sequence alignment, which uses a lot of computational power).

4 Answers
  • 2020-12-06 12:43

    This doesn't have anything to do with Python or the subprocess module. subprocess.Popen is merely reporting to you the error that it is receiving from the operating system. (What operating system are you using, by the way?) From man 2 fork on Linux:

    ENOMEM    fork()  failed  to  allocate  the  necessary  kernel  structures
              because memory is tight.
    

    Are you calling subprocess.Popen multiple times? If so then I think the best you can do is make sure that the previous invocation of your process is terminated and reaped before the next invocation.
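A minimal sketch of that advice, assuming a hypothetical list of external commands to run (here stubbed with `echo`): wait on each child before launching the next, so the kernel can reap its process structures.

```python
import subprocess

# Hypothetical stand-ins for repeated calls to an external aligner.
commands = [["echo", "run1"], ["echo", "run2"]]

for cmd in commands:
    proc = subprocess.Popen(cmd, stdout=subprocess.DEVNULL)
    proc.wait()  # block until the child exits, reaping it and freeing kernel structures
```

Without the `wait()`, exited children linger as zombies until the parent collects them, which can exhaust kernel resources over many invocations.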

  • 2020-12-06 12:50

    I feel really sorry for the OP. Six years later and no one has mentioned that this is a very common problem on Unix, and it actually has nothing to do with Python or bioinformatics. A call to os.fork() temporarily doubles the memory commitment of the parent process (the parent's memory must be made available to the child) before throwing it all away to do an exec(). While this memory isn't always actually copied, the system must have enough memory to allow for it to be copied, so if your parent process is using more than half of the system's memory and you subprocess out even "wc -l", you're going to run into a memory error.

    The solution is to use posix_spawn, or to create all your subprocesses at the beginning of the script while memory consumption is low, then use them later on after the parent process has done its memory-intensive work.

    A Google search using the keywords "os.fork" and "memory" will show several Stack Overflow posts on the topic that can further explain what's going on :)
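The posix_spawn route is exposed directly in the standard library as os.posix_spawn (POSIX systems, Python 3.8+). A minimal sketch, using /bin/echo as a stand-in for the real program:

```python
import os

# Spawn a child without fork()'s copy-on-write accounting of the whole
# parent address space; /bin/echo stands in for the expensive aligner.
pid = os.posix_spawn("/bin/echo", ["echo", "hello"], dict(os.environ))

# Reap the child and collect its exit status.
_, status = os.waitpid(pid, 0)
```

Note that subprocess itself uses posix_spawn internally on recent Python versions when the call is simple enough, so upgrading Python may be enough on its own.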

  • 2020-12-06 12:53

    Do you use subprocess.PIPE? I ran into problems, and read about others having them, when it was used with large amounts of output. Redirecting to temporary files usually solved the problem.
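A short sketch of the temporary-file workaround, with `echo` standing in for the real command: the child writes to a temp file instead of a pipe, so output size is bounded by disk rather than by pipe buffers.

```python
import subprocess
import tempfile

# Send the child's stdout to a temporary file instead of subprocess.PIPE,
# then read the result back once the child has exited.
with tempfile.TemporaryFile() as out:
    subprocess.check_call(["echo", "hello"], stdout=out)
    out.seek(0)          # rewind before reading what the child wrote
    data = out.read()
```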

  • 2020-12-06 13:02

    I'd run a 64-bit Python on a 64-bit OS.

    With a 32-bit process, you can only really get about 3 GB of address space before the OS starts telling you no more.

    Another alternative might be to use memory mapped files to open the file:

    http://docs.python.org/library/mmap.html
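    A minimal mmap sketch (the file name and contents below are made up for illustration): the file is mapped into virtual memory and paged in on demand, rather than read into the process heap all at once.

    ```python
    import mmap

    # Create a small example file to map (illustrative data only).
    with open("data.txt", "wb") as f:
        f.write(b"ACGTACGT\nTTGCA\n")

    # Map it read-only; pages are loaded lazily as they are accessed.
    with open("data.txt", "rb") as f:
        mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        first_line = mm.readline()  # only touches the first page(s)
        mm.close()
    ```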

    Edit: Ah, you're on 64-bit. Possibly the cause is that you're running out of RAM + swap; a fix might be to increase the amount of swap.
