Question
I have huge array objects that are pickled with the Python pickler.
I am trying to unpickle them and read out the data in a for loop. Every time I am done reading and assessing, I delete all references to those objects.
After deletion, I even call gc.collect() along with time.sleep() to see if the heap memory shrinks.
The heap memory doesn't shrink, which suggests the data is still referenced somewhere inside the pickle loading. After 15 data files (I have 250+ files to process, 1.6 GB each) I hit a MemoryError.
I have seen many other questions here pointing out a memory-leak issue that was supposedly solved. I don't understand what exactly is happening in my case.
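The loop described above might look roughly like this (a minimal sketch; the file paths, the assessment step, and the return value are hypothetical, since the original code is not shown):

```python
import gc
import pickle
import time

def process_files(paths):
    """Unpickle each file, assess it, then try to release the memory."""
    processed = 0
    for path in paths:
        with open(path, "rb") as f:
            data = pickle.load(f)   # load the large array objects
        # ... read and assess `data` here ...
        del data                    # drop the only reference
        gc.collect()                # force a collection pass
        time.sleep(1)               # wait to see whether heap usage drops
        processed += 1
    return processed
```

Even with `del`, `gc.collect()`, and the sleep, the process's resident memory may not drop, which is exactly the behavior the question reports.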
Answer 1:
Python's memory management does not return freed memory to the OS while the process is still running.
Running the for loop in a subprocess that calls the script solved the issue for me. Thanks for the feedback.
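One way to sketch this fix: handle each file in a fresh Python subprocess, so that all memory is returned to the OS when the subprocess exits. The worker script and its `print(len(data))` result are illustrative assumptions, not the asker's actual code:

```python
import subprocess
import sys

# Hypothetical worker: unpickle one file, assess it, emit a result on stdout.
WORKER = """\
import pickle, sys
with open(sys.argv[1], "rb") as f:
    data = pickle.load(f)
# ... assess `data` here ...
print(len(data))
"""

def process_in_subprocess(path):
    # A fresh interpreter handles each file; when it exits, the OS
    # reclaims all of its memory, sidestepping CPython's habit of
    # keeping freed heap blocks around for reuse within the process.
    result = subprocess.run(
        [sys.executable, "-c", WORKER, path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()
```

Calling `process_in_subprocess(path)` once per data file keeps the parent process's memory footprint flat, at the cost of interpreter startup and re-reading overhead per file.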
Source: https://stackoverflow.com/questions/61091634/memory-leak-on-pickle-inside-a-for-loop-forcing-a-memory-error