How much memory in numpy array? Is RAM a limiting factor?

Asked by 一整个雨季 · 2020-12-28 10:21 · 2 answers · 1782 views

I'm using numpy to create a cube array with sides of length 100, thus containing 1 million entries total. For each of the million entries, I am inserting a 100x100 matrix w

2 Answers
  • 2020-12-28 10:57

    A couple points:

    • The size in memory of numpy arrays is easy to calculate: it's simply the number of elements times the size of each element, plus a small constant overhead. For example, if your cube.dtype is int64, and it has 1,000,000 elements, it will require 1,000,000 * 64 / 8 = 8,000,000 bytes (8 MB).
    • However, as @Gabe notes, 100 * 100 * 1,000,000 doubles will require about 80 GB.
    • This will not cause anything to "break", per se, but operations will be ridiculously slow because of all the swapping your computer will need to do.
    • Your loops will not do what you expect. Instead of replacing the element in cube, element = matrix will simply rebind the element variable, leaving the cube unchanged. The same goes for entry = random.rand() * 100.
    • Instead, see: http://docs.scipy.org/doc/numpy/reference/arrays.nditer.html#modifying-array-values
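
    The points above can be sketched in a few lines. This is a minimal illustration (not the asker's code), with the shapes scaled down from 100x100x100 cells of 100x100 matrices to something that fits in memory:

    ```python
    import numpy as np

    # Scaled-down stand-in for the cube: a 10x10x10 grid where each
    # cell holds a 4x4 matrix, stored as one 5-D float64 array.
    cube = np.zeros((10, 10, 10, 4, 4), dtype=np.float64)

    # Memory use: number of elements times bytes per element.
    print(cube.size * cube.itemsize)   # 10*10*10*4*4 * 8 = 128000 bytes
    print(cube.nbytes)                 # same figure, computed by numpy

    # In-place assignment must go through indexing; rebinding a loop
    # variable (element = matrix) would leave the array untouched.
    matrix = np.random.random((4, 4)) * 100
    cube[0, 0, 0] = matrix             # this actually modifies cube
    ```

    At the question's full scale the same arithmetic gives 100*100*100 * 100*100 * 8 bytes = 80 GB, which is why the swapping concern dominates.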
  • 2020-12-28 11:11

    for the "inner" part of your function, look at the numpy.random module

    import numpy as np
    matrix = np.random.random((100,100))*100
    
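    Taking this one step further, the whole nested structure can be generated in a single vectorized call instead of a loop. A minimal sketch, with the shapes scaled down from the question's 100x100x100 cube of 100x100 matrices:

    ```python
    import numpy as np

    # One draw fills every cell's matrix at once: a 10x10x10 cube
    # of 4x4 matrices, with entries uniform in [0, 100).
    cube = np.random.random((10, 10, 10, 4, 4)) * 100

    print(cube.shape)  # (10, 10, 10, 4, 4)
    ```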