multiprocessing returns “too many open files” but using `with…as` fixes it. Why?

误落风尘 2021-02-05 05:31

I was using this answer in order to run parallel commands with multiprocessing in Python on a Linux box.

My code did something like:

import multiprocessing
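Based on the title and the first answer below, the pattern was presumably along these lines; the names cycle, offsets and process_per_cycle are taken from that answer, and everything else (the placeholder work, the sizes) is an assumption:

import multiprocessing

def cycle(offset):
    return offset * 2              # placeholder for the real per-task work

if __name__ == "__main__":
    process_per_cycle = [2, 4, 8]  # assumed process counts tried on each cycle
    offsets = list(range(8))       # assumed work items handed to the pool

    # Failing pattern: a new Pool on every iteration that is never closed,
    # so each pool's pipes (file descriptors) stay open until the OS limit
    # is reached and "Too many open files" is raised.
    for nprocess in process_per_cycle:
        pool = multiprocessing.Pool(nprocess)
        pool.map(cycle, offsets)

    # The fix from the title: Pool supports the with statement, and leaving
    # the block terminates the workers and releases their descriptors.
    for nprocess in process_per_cycle:
        with multiprocessing.Pool(nprocess) as pool:
            pool.map(cycle, offsets)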
3 Answers
  • 2021-02-05 05:52

    You're creating a new pool of processes inside the loop and never closing it once you're done with it. Each pool holds open pipes (file descriptors) to its worker processes, so the descriptors accumulate until the OS limit is reached and you get "Too many open files". This is a bad idea.

    You could fix this by using a context manager, which automatically calls pool.terminate for you (a sketch appears at the end of this answer), or by calling pool.terminate yourself. Alternatively, why not create the pool just once, outside the loop, and then send tasks to its processes from inside the loop?

    pool = multiprocessing.Pool(nprocess)  # initialise your pool once, outside the loop
    for nprocess in process_per_cycle:     # note: the pool size is now fixed, chosen above
        ...
        pool.map(cycle, offsets)           # delegate work inside your loop

    pool.close()  # shut down the pool when the loop is done
    pool.join()   # wait for the worker processes to exit
    

    For more information, you could peruse the multiprocessing.Pool documentation.
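    A minimal sketch of the context-manager route mentioned above, assuming the same cycle, offsets and process_per_cycle names as in the question; Pool's __exit__ calls terminate(), so each cycle's workers and pipes are released as soon as the block ends:

    for nprocess in process_per_cycle:
        with multiprocessing.Pool(nprocess) as pool:  # __exit__ calls pool.terminate()
            pool.map(cycle, offsets)                  # delegate this cycle's work
        # the pool's file descriptors are released here, on every iteration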

  • 2021-02-05 06:10

    It is a context manager. Using with ensures that the resource is set up and cleaned up properly, even if an exception is raised partway through. To understand this in detail, I'd recommend this article: https://jeffknupp.com/blog/2016/03/07/python-with-context-managers/
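    As a rough illustration (not taken from the article), a with block behaves like a try/finally whose cleanup always runs, which is why handles cannot leak even when an exception is raised; process_file here is a made-up placeholder:

    # the two forms below are equivalent in effect
    with open("data.txt") as f:     # __enter__ returns the open file
        process_file(f)             # __exit__ closes it, success or error

    f = open("data.txt")
    try:
        process_file(f)
    finally:
        f.close()                   # always runs, even if process_file raised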

  • 2021-02-05 06:13

    This can also happen when you use numpy.load: make sure you close those files too, or avoid it and use pickle, or torch.save/torch.load, etc.
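    For the numpy case, a hedged sketch: np.load on an .npz archive returns an object that keeps the file open, so close it explicitly or use it as a context manager (the file name and key below are made up):

    import numpy as np

    data = np.load("arrays.npz")   # the archive's file handle stays open
    x = data["x"]
    data.close()                   # release the underlying descriptor

    # or let the context manager close it for you
    with np.load("arrays.npz") as data:
        x = data["x"]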
