I have the following class:

    class C1:
        STORE = []
        TYPE = []
        ITEMS = []
        PRICE = []
        def __init__(self, STORE, TYPE, ITEMS, PRICE):
            self.STORE = STORE
            self.TYPE = TYPE
            self.ITEMS = ITEMS
            self.PRICE = PRICE
You have an 800 x 30,000 matrix. That's 24,000,000 elements per array, which is already about 100 MB of space if they're 4-byte floats, and more in practice because of per-object overhead. And you have six of these beasts?
If 1.8 GB is too much then you'll have to store less. Sorry, but this is why real number crunching can be hard. Make sure you only have the data you need, and that's it.
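A quick back-of-envelope check on those figures (the pointer size assumes a 32-bit build, which was typical for Python 2.3; the per-object overhead on top of this is what pushes the real total higher):

```python
# One 800 x 30,000 matrix stored as a Python list of lists.
rows, cols = 800, 30000
elements = rows * cols              # elements per matrix

pointer_bytes_32bit = 4             # each list slot holds a pointer
pointers_total = elements * pointer_bytes_32bit

print(elements)        # 24000000
print(pointers_total)  # 96000000 bytes, i.e. ~92 MiB of pointers alone,
                       # before counting the float objects they point to
```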
If most of that matrix is empty then I'd suggest looking at some sparse matrix libraries. SciPy/NumPy have the most common ones, but I'm sure someone else provides something workable with Python 2.3. Maybe an old NumPy?
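If a library is not an option, even a plain dict keyed by `(row, col)` acts as a crude sparse matrix and needs nothing newer than Python 2.3. This is only a minimal sketch: cells that were never set fall back to a default value, so memory scales with the number of non-empty cells rather than with rows * cols:

```python
# Minimal sparse-matrix sketch: store only non-default cells in a dict.
class SparseMatrix:
    def __init__(self, default=0.0):
        self.cells = {}        # maps (row, col) -> value
        self.default = default

    def set(self, row, col, value):
        if value == self.default:
            self.cells.pop((row, col), None)  # never store defaults
        else:
            self.cells[(row, col)] = value

    def get(self, row, col):
        return self.cells.get((row, col), self.default)

m = SparseMatrix()
m.set(5, 29999, 3.14)
print(m.get(5, 29999))  # 3.14
print(m.get(0, 0))      # 0.0 -- unset cells cost no memory
print(len(m.cells))     # 1
```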
It's hard to say much without knowing more about your algorithm, but perhaps the struct module (http://docs.python.org/library/struct.html) would be an option? Or Cython? Pyrex?
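As a sketch of the struct idea: one matrix row packed as raw bytes costs about 4 bytes per value instead of a pointer plus a full float object. Everything below exists in Python 2.3 (newer versions also offer `struct.unpack_from`, which avoids the slicing):

```python
# Pack a row of 30,000 values as raw 4-byte C floats.
import struct

row = [0.0] * 30000
row[7] = 1.5

packed = struct.pack('%df' % len(row), *row)   # 'f' = 4-byte C float
print(len(packed))                             # 120000 bytes for the row

# Read back a single element by slicing out its 4 bytes:
value, = struct.unpack('f', packed[7 * 4:(7 + 1) * 4])
print(value)                                   # 1.5
```

The trade-off is that you pay a pack/unpack cost on every access, so this suits data you mostly hold rather than constantly mutate.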
Using Python 2.3 is going to limit your options (including excluding the option of going to 64-bit). That's also the main reason the memory is not being released back to the OS: the internal object allocator in CPython didn't gain the ability to return no longer used memory to the OS until 2.5.
If you can, try running the algorithm on 2.7 and see what gains you're able to achieve purely by using a more recent version of the interpreter (or what compatibility problems would arise in such a migration).
And, as others have suggested, optimise your data structures. Check the algorithmic complexity of the operations you perform regularly, and see if there is a way to convert O(n*n) operations to O(n*logn) and O(n) to O(logn) or O(1).
Even if the underlying data structures can't change, you may be able to use the bisect module to speed up some operations on your lists.
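Since the question says the items are alphanumerically ordered, bisect (in the stdlib long before 2.3) gives O(log n) lookups on a sorted list instead of an O(n) scan. The item names below are invented for illustration:

```python
import bisect

items = ['apple', 'banana', 'cherry', 'mango']  # kept sorted at all times

def index_of(sorted_list, value):
    """Binary-search a sorted list; return the index or -1 if absent."""
    i = bisect.bisect_left(sorted_list, value)
    if i != len(sorted_list) and sorted_list[i] == value:
        return i
    return -1

print(index_of(items, 'cherry'))  # 2
print(index_of(items, 'papaya'))  # -1

bisect.insort(items, 'durian')    # O(n) insert that preserves ordering
print(items.index('durian'))      # 3
```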
You might try the array module; it was already around in Python 2.3. Other than that, you may want to look into using a proper database for this stuff.
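A short sketch of what the array module buys you: it stores numbers as raw C values in one contiguous block, so a row of 30,000 single-precision floats costs about 120 KB of payload instead of 30,000 separate float objects plus a pointer for each:

```python
import array

row = array.array('f', [0.0] * 30000)  # 'f' = 4-byte C float
row[7] = 1.5                           # indexing works like a list

print(row.itemsize)                    # 4 bytes per element
print(row.itemsize * len(row))         # 120000 bytes of payload
print(row[7])                          # 1.5
```

Unlike the struct approach, an array stays mutable and indexable in place, which makes it the easier drop-in replacement for a list of numbers.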
Items are alphanumerically ordered [...]
With all this data in a class, I run 'reports' with some 'complex' computation involved.
Are there data structures other than lists which would consume less memory and improve my performance?
I'm just guessing at your algorithms here: linear-time searching? If so, switching to a dict keyed by item may improve performance greatly (collections.OrderedDict, if you can move to Python 2.7, would also preserve your alphanumeric ordering, provided you insert in that order).
That won't solve the memory issue, though; consider using a proper database package instead, e.g. SQLite + SQLAlchemy or plain old bsddb with B-trees.
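A minimal sketch of the database route. The sqlite3 module entered the stdlib in Python 2.5 (on 2.3 the third-party pysqlite package, or bsddb, plays the same role); the table and column names below are invented to mirror the class in the question:

```python
import sqlite3

conn = sqlite3.connect(':memory:')  # use a filename to keep data on disk
conn.execute('''CREATE TABLE inventory
                (store TEXT, type TEXT, item TEXT, price REAL)''')
conn.execute('CREATE INDEX idx_item ON inventory (item)')  # fast lookups

rows = [('S1', 'fruit', 'apple', 1.25),
        ('S1', 'fruit', 'mango', 2.00),
        ('S2', 'veg',   'leek',  0.80)]
conn.executemany('INSERT INTO inventory VALUES (?, ?, ?, ?)', rows)

# "Reports" become SQL queries; the full dataset never has to sit in RAM.
cur = conn.execute('SELECT store, price FROM inventory WHERE item = ?',
                   ('mango',))
print(cur.fetchall())  # [('S1', 2.0)]
```

With the data on disk and an index on the search column, both the memory footprint and the linear-search problem go away at once.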