I'm using Python to analyse some large files and I'm running into memory issues, so I've been using sys.getsizeof() to try and keep track of the usage, but its behaviour is confusing me.
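(A minimal sketch of the kind of surprise I mean, added here just for illustration: getsizeof() only counts the container object itself, not the objects it references, so a structure holding lots of data can look tiny.)

import sys

data = [list(range(1000)) for _ in range(100)]   # roughly 100,000 ints live in memory
print(sys.getsizeof(data))                       # well under 1 KB: only the outer list's pointer array
print(sum(sys.getsizeof(row) for row in data))   # the inner lists alone are far bigger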
You can use array.nbytes for numpy arrays, for example:
>>> import numpy as np
>>> from sys import getsizeof
>>> a = [0] * 1024
>>> b = np.array(a)
>>> getsizeof(a)
8264
>>> b.nbytes
8192
The nbytes attribute will give you the size in bytes of all the elements of a numpy.ndarray:

size_in_bytes = my_numpy_array.nbytes

Notice that this does not measure "non-element attributes of the array object", so the actual size in bytes can be a few bytes larger than this.
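Here is a small sketch of that difference (my own addition; the exact overhead depends on the NumPy version, and getsizeof() only includes the data buffer when the array actually owns its memory):

import sys
import numpy as np

a = np.zeros(1024)            # 1024 float64 elements -> 8192 bytes of data
print(a.nbytes)               # 8192: just the element buffer
print(sys.getsizeof(a))       # slightly larger: buffer plus the ndarray header
print(sys.getsizeof(a[::2]))  # a view owns no data, so only the small header is counted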
In Python notebooks I often want to filter out 'dangling' numpy.ndarray's, in particular the ones that are stored in _1, _2, etc. that were never really meant to stay alive. I use this code to get a listing of all of them and their size. Not sure if locals() or globals() is better here.
import numpy
from humanize import naturalsize

# Print every ndarray bound in the current namespace, smallest first.
for size, name in sorted(
        (value.nbytes, name)
        for name, value in locals().items()
        if isinstance(value, numpy.ndarray)):
    print("{:>30}: {:>8}".format(name, naturalsize(size)))
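As a follow-up sketch of my own (not part of the snippet above): wrapping the listing in a helper makes the locals()/globals() question concrete. At the top level of a notebook cell the two refer to the same dictionary, but inside a function locals() only sees the function's own variables, so the namespace to inspect has to be passed in explicitly.

import numpy
from humanize import naturalsize

def list_ndarrays(namespace):
    # Print every ndarray bound in the given namespace, smallest first.
    for size, name in sorted(
            (value.nbytes, name)
            for name, value in namespace.items()
            if isinstance(value, numpy.ndarray)):
        print("{:>30}: {:>8}".format(name, naturalsize(size)))

# At the top level of a notebook, globals() and locals() are the same dict, so either works;
# from inside a function you have to pass the caller's namespace in.
list_ndarrays(globals())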