I have a large series of raster datasets representing monthly rainfall over several decades. I've written a script in Python that loops over each raster and does the following:
You don't need to concern yourself with manual memory management, and especially not with the garbage collector, which has one very specific job (breaking reference cycles) that your code most likely doesn't even create. CPython frees an object the moment its last reference disappears and reuses that memory.
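You can watch this happen with a minimal sketch; `Raster` here is just a hypothetical stand-in for one month's data, not a real library class:

```python
import weakref

class Raster:
    """Hypothetical stand-in for one month's raster held in memory."""
    def __init__(self, month):
        self.month = month

for month in range(3):
    data = Raster(month)
    weakref.finalize(data, print, f"raster {month} collected")
    # ... per-raster processing would happen here ...
# Each time `data` is rebound, the previous Raster loses its last
# reference and CPython frees it on the spot -- no gc call needed.
```

Every iteration prints that the previous raster was collected, without any explicit cleanup on your part.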
There are only two plausible causes for your problem: either the data you load is simply too large to fit into memory, or your calculations store something between iterations (a list, a dict, anything persistent) and that storage grows with every pass. A memory profiler can help you find the culprit.
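Here is a sketch of the second failure mode and how to spot it with the standard-library `tracemalloc` module; the per-raster computation is a placeholder for whatever your script actually does:

```python
import tracemalloc

tracemalloc.start()

results = []  # persistent storage between iterations: the usual suspect
for month in range(12):
    grid = [0.0] * 1_000_000           # stand-in for one month's raster
    mean = sum(grid) / len(grid)       # the small result you actually need
    results.append(grid)               # BUG: keeps every full grid alive
    # results.append(mean)             # fix: keep only the reduced value

snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:3]:
    print(stat)  # the line that allocated the still-live grids tops the list
```

If the report shows one allocation site whose total keeps growing run over run, you have found your leak; if instead a single load already exceeds RAM, you need to process the rasters in chunks rather than whole.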