multiprocessing returns “too many open files” but using `with…as` fixes it. Why?

Backend · Open · 3 answers · 2073 views
误落风尘 2021-02-05 05:31

I was using this answer in order to run parallel commands with multiprocessing in Python on a Linux box.

My code did something like:

import multiprocessing
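The snippet above is truncated, but the usual pattern behind this error is creating `multiprocessing.Pool` objects in a loop without closing them. Each pool opens several pipe file descriptors; leaving the pool to the garbage collector lets those descriptors pile up until the OS limit is hit, while `with … as` releases them deterministically. A minimal sketch of both patterns (the names `work`, `leaky`, and `fixed` are illustrative, not from the original post):

```python
import multiprocessing

def work(x):
    return x * x

def leaky(cycles):
    # Anti-pattern: every Pool opens several pipes (file descriptors).
    # Without close()/join() they stay open until the pool object is
    # garbage-collected, so enough iterations can exhaust the per-process
    # limit on open files and raise "Too many open files".
    for _ in range(cycles):
        pool = multiprocessing.Pool(2)
        pool.map(work, range(10))
        # missing: pool.close(); pool.join()

def fixed(cycles):
    # Pool is a context manager: __exit__ calls pool.terminate(),
    # releasing the pipes immediately, so the descriptor count
    # stays bounded no matter how many cycles run.
    for _ in range(cycles):
        with multiprocessing.Pool(2) as pool:
            results = pool.map(work, range(10))
    return results

if __name__ == "__main__":
    print(fixed(3))
```

Calling `pool.close()` followed by `pool.join()` at the end of each iteration fixes the leak just as well; the `with` form simply guarantees cleanup even when an exception escapes the loop body.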
3 Answers
  •  青春惊慌失措
    2021-02-05 06:13

    This can happen when you use `numpy.load` too. Make sure you close those files as well, or avoid `numpy.load` and use `pickle` or `torch.save`/`torch.load` instead.
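    To illustrate the answer above: `numpy.load` on an `.npz` archive returns an `NpzFile` that keeps the underlying file open until it is closed. `NpzFile` is a context manager, so the same `with … as` pattern closes it deterministically. A minimal sketch (the temp-file path and array contents are illustrative):

```python
import os
import tempfile
import numpy as np

# Write a small .npz archive to a temporary location.
path = os.path.join(tempfile.mkdtemp(), "data.npz")
np.savez(path, arr=np.array([1, 2, 3]))

# np.load on .npz returns an NpzFile holding an open file handle;
# the with-statement closes it on exit, so no descriptor leaks.
with np.load(path) as data:
    loaded = data["arr"].copy()  # copy so the data outlives the handle

print(loaded)
```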
