My Python process's memory usage keeps growing as it stores dynamic data in lists, dictionaries, and tuples wherever necessary. Even though all of that dynamic data is eventually cleared, the process's memory footprint never shrinks.
How big are we talking? Python itself takes up some amount of memory, maybe 30 or 40 MB I believe. If it's bigger than that and not getting collected, you have a memory leak. Only garbage with no references can be collected; somehow your extra stuff is still being referenced. Do a memory profile and see what is going on.
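One way to do that profile with nothing but the standard library is `tracemalloc`, which records where allocations happen. A minimal sketch (here `build_cache` is a hypothetical stand-in for whatever code in your program accumulates the data):

```python
import tracemalloc

def build_cache():
    # Stand-in for real work: the reference kept alive by the returned
    # list is exactly the kind of thing that prevents collection.
    return [str(i) * 10 for i in range(50_000)]

tracemalloc.start()
data = build_cache()
snapshot = tracemalloc.take_snapshot()

# Print the top three allocation sites, with sizes and line numbers.
for stat in snapshot.statistics("lineno")[:3]:
    print(stat)
```

If the top entries keep growing between snapshots taken at the same point in your program, the tracebacks tell you which lines are holding on to the memory.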
It's very hard, in general, for a process to "give memory back to the OS" (until the process terminates and the OS reclaims all its memory, of course) because, in most implementations, what malloc returns is carved out of big blocks for efficiency, and a whole block can't be given back if any part of it is still in use -- so most C standard libraries don't even try.
For a decent discussion in a Python context, see e.g. here. Evan Jones fixed some Python-specific issues, as described here and here, but his patch has been in the trunk since Python 2.5, so the problems you're observing are definitely with the system malloc package, not with Python per se. A 2.6-specific explanation is here and here.
An SO thread is here, where Hugh Allen's answer quotes Firefox programmers to the effect that Mac OS X is a system where it's basically impossible for a process to give memory back to the OS.
So, only by terminating a process can you be sure to release its memory. For example, a long-running server could, once in a while, snapshot its state to disk and shut down (with a tiny watchdog process, system-supplied or custom, watching over it and restarting it). If you know that the next operation will need a lot of memory for a short time, you can often os.fork, do the memory-hungry work in the child process, and have the results (if any) returned to the parent process via a pipe as the child terminates. And so on, and so forth.
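The fork-and-discard pattern could be sketched like this (Unix only; `heavy_work` is a hypothetical placeholder for your memory-hungry computation, and the result is sent back pickled over a pipe):

```python
import os
import pickle

def heavy_work():
    # Stand-in for a computation that allocates a lot of memory.
    big = list(range(1_000_000))
    return sum(big)

def run_in_child(fn):
    """Run fn() in a forked child; return its result via a pipe.

    All memory the child allocates goes back to the OS when it exits.
    """
    read_fd, write_fd = os.pipe()
    pid = os.fork()
    if pid == 0:                      # child process
        os.close(read_fd)
        with os.fdopen(write_fd, "wb") as f:
            pickle.dump(fn(), f)
        os._exit(0)                   # exit immediately, skip cleanup hooks
    os.close(write_fd)                # parent process
    with os.fdopen(read_fd, "rb") as f:
        result = pickle.load(f)
    os.waitpid(pid, 0)                # reap the child
    return result

print(run_in_child(heavy_work))
```

Note the child uses `os._exit`, not `sys.exit`, so it doesn't run the parent's atexit handlers or flush inherited buffers twice; and pickling keeps the pattern general, at the cost of requiring the result to be picklable.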