Very large matrices using Python and NumPy

难免孤独 2020-11-22 13:51

NumPy is an extremely useful library, and from using it I've found that it's capable of handling matrices which are quite large (10000 x 10000) easily, but begins to struggle with anything much larger, since the memory requirements become enormous. Is there a way to create and work with such huge matrices natively in NumPy, without terabytes of RAM?

11 Answers
  • 2020-11-22 14:30

    As far as I know about numpy, no, but I could be wrong.

    I can propose an alternative solution: write the matrix to disk and access it in chunks. I suggest the HDF5 file format. If you need transparent access, you can reimplement the ndarray interface to page your disk-stored matrix into memory. Be careful to sync modified data back to disk.
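
    A minimal sketch of that idea using the h5py package (the file name, dataset name, chunk size, and shape below are illustrative assumptions; note that a 50000 x 50000 float64 dataset occupies about 20 GB on disk):

    ```python
    import numpy as np
    import h5py

    n, step = 50000, 1000

    with h5py.File("matrix.h5", "w") as f:
        # Chunked dataset: I/O happens per chunk, so the whole
        # matrix never has to be in memory at once.
        dset = f.create_dataset("m", shape=(n, n), dtype="float64",
                                chunks=(step, step))
        # Write one slab of rows at a time.
        for i in range(0, n, step):
            dset[i:i + step, :] = np.random.rand(step, n)

    # Later, read back only the block you need.
    with h5py.File("matrix.h5", "r") as f:
        block = f["m"][0:step, 0:step]
    ```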

  • 2020-11-22 14:33

    You should be able to use numpy.memmap to memory-map a file on disk. With a newer Python and a 64-bit machine, you should have the necessary address space without loading everything into memory; the OS will keep only part of the file resident at a time.
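
    A minimal sketch (the file name, dtype, and shape are placeholder assumptions):

    ```python
    import numpy as np

    # Create a file-backed array; mode "w+" creates/overwrites the file.
    m = np.memmap("bigmatrix.dat", dtype="float64",
                  mode="w+", shape=(50000, 50000))

    # Use it like an ordinary ndarray; the OS pages data in and out
    # as slices are touched.
    m[0, :] = 1.0
    sub = np.array(m[:1000, :1000])  # copy one block into real memory

    m.flush()  # write dirty pages back to the file
    ```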

  • 2020-11-22 14:35

    numpy.arrays are meant to live in memory. If you want to work with matrices larger than your RAM, you have to work around that. There are at least two approaches you can follow:

    1. Try a more efficient matrix representation that exploits any special structure your matrices have. For example, as others have already pointed out, there are efficient data structures for sparse matrices (matrices with lots of zeros), such as scipy.sparse.csc_matrix; see the first sketch after this list.
    2. Modify your algorithm to work on submatrices. You can read from disk only the matrix blocks that are currently being used in a computation. Algorithms designed to run on clusters usually work blockwise, since the data is scattered across different computers and passed around only when needed: for example, the Fox algorithm for matrix multiplication. The second sketch after this list shows a naive blocked product.
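
    For approach 1, a minimal scipy.sparse sketch (the size and number of nonzeros are illustrative assumptions; a dense 100000 x 100000 float64 matrix would need about 80 GB, while the sparse version stores only the nonzero entries):

    ```python
    import numpy as np
    from scipy import sparse

    n, nnz = 100_000, 1_000_000
    rows = np.random.randint(0, n, size=nnz)
    cols = np.random.randint(0, n, size=nnz)
    vals = np.random.rand(nnz)

    # Compressed sparse column matrix: memory scales with nnz, not n*n.
    m = sparse.csc_matrix((vals, (rows, cols)), shape=(n, n))

    # The usual linear algebra still works, e.g. a matrix-vector product.
    x = np.random.rand(n)
    y = m @ x
    ```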
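
    For approach 2, a naive blocked matrix product over memory-mapped files (this is not the Fox algorithm itself; the file names and sizes are assumptions, and A.dat and B.dat must already exist on disk). Only a few b x b blocks are in RAM at any time:

    ```python
    import numpy as np

    n, b = 20_000, 1_000  # matrix size and block size; b must divide n
    A = np.memmap("A.dat", dtype="float64", mode="r", shape=(n, n))
    B = np.memmap("B.dat", dtype="float64", mode="r", shape=(n, n))
    C = np.memmap("C.dat", dtype="float64", mode="w+", shape=(n, n))

    for i in range(0, n, b):
        for j in range(0, n, b):
            acc = np.zeros((b, b))
            for k in range(0, n, b):
                # Pull one block of A and one of B from disk, accumulate.
                acc += A[i:i + b, k:k + b] @ B[k:k + b, j:j + b]
            C[i:i + b, j:j + b] = acc

    C.flush()
    ```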
  • 2020-11-22 14:36

    Make sure you're using a 64-bit operating system and a 64-bit version of Python/NumPy. Note that on 32-bit architectures you can typically address only about 3GB of memory (with about 1GB lost to memory-mapped I/O and the like).

    With a 64-bit build, arrays larger than the available RAM can get by on virtual memory, though things will slow down if you have to swap. Also, memory maps (see numpy.memmap) are a way to work with huge files on disk without loading them into memory, but again you need a 64-bit address space for this to be of much use. PyTables will do most of this for you as well.
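
    A minimal PyTables sketch (the file and array names are arbitrary assumptions):

    ```python
    import tables

    with tables.open_file("big.h5", mode="w") as f:
        # A CArray is a chunked on-disk array with NumPy-style slicing.
        atom = tables.Float64Atom()
        arr = f.create_carray(f.root, "m", atom, shape=(50000, 50000))
        arr[0:1000, :] = 1.0  # only this slab is ever held in memory
    ```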

  • 2020-11-22 14:37

    Stefano Borini's post got me to look into how far along this sort of thing already is.

    It appears to do basically what you want: HDF5 will let you store very large datasets, and then access and use them in the same ways NumPy does.
