I have a very large Python script, 200K, and I would like it to use as little memory as possible. It looks something like:
# a lot of data structures
r = [34, 78,
The advice on generator expressions and making use of modules is good. Premature optimization causes problems, but you should always spend a few minutes thinking about your design before sitting down to write code, particularly if that code is meant to be reused.
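For reference, here is a minimal sketch of the generator-expression idea from the other answers. A list comprehension materializes every element at once, while a generator expression hands items to the consumer one at a time, so only the current item is ever held in memory:

```python
# List comprehension: all ten million squared values exist in memory at once.
total = sum([n * n for n in range(10000000)])

# Generator expression: values are produced one at a time as sum() consumes
# them, so memory use stays roughly constant regardless of the range size.
total = sum(n * n for n in range(10000000))
```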
Incidentally, you mention that you have a lot of data structures defined at the top of your script, which implies that they're all loaded into memory at the start. If this is a very large dataset, consider moving specific datasets into separate files and loading each one only when it is actually needed (using the csv module, numpy.loadtxt(), etc.), as sketched below.
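A rough sketch of that idea, assuming the numbers live in a hypothetical data/measurements.csv file instead of being written out in the script:

```python
import csv

def load_measurements(path="data/measurements.csv"):
    # Hypothetical file name; the point is that the rows are read at the
    # moment they are needed, not defined at the top of the script.
    with open(path, newline="") as f:
        return [[float(x) for x in row] for row in csv.reader(f)]

# For purely numeric tables, numpy can do the parsing in one call instead:
# import numpy as np
# data = np.loadtxt("data/measurements.csv", delimiter=",")
```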
Separate from loading less data at a time, also look into ways to store it more efficiently. For example, for large sets of numeric data, numpy arrays hold the raw values in a single contiguous block instead of as individual Python objects, which uses far less memory and also speeds up calculations. There is some slightly dated advice at http://wiki.python.org/moin/PythonSpeed/PerformanceTips
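As a rough illustration of the difference (the exact numbers depend on your platform and Python/NumPy versions):

```python
import sys
import numpy as np

values = list(range(1000000))
arr = np.arange(1000000, dtype=np.int64)

# A Python list stores pointers to separate int objects; the numpy array
# stores one million raw 8-byte integers in a single contiguous buffer.
list_bytes = sys.getsizeof(values) + sum(sys.getsizeof(v) for v in values)
array_bytes = arr.nbytes

print("list : ~%d MB" % (list_bytes // 2**20))   # typically ~30+ MB
print("array: ~%d MB" % (array_bytes // 2**20))  # ~8 MB
```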