Question
Is there a reference for the memory size of Python data structures on 32- and 64-bit platforms?
If not, it would be nice to have one on SO. The more exhaustive the better! So how many bytes are used by the following Python structures (depending on the length
and the content type when relevant)?
- int
- float
- reference
- str
- unicode string
- tuple
- list
- dict
- set
- array.array
- numpy.array
- deque
- new-style classes object
- old-style classes object
- ... and everything I am forgetting!
(For containers that keep only references to other objects, we obviously do not want to count the size of the items themselves, since they might be shared.)
Furthermore, is there a way to get the memory used by an object at runtime (recursively or not)?
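(Since the numbers depend on whether the interpreter is a 32- or 64-bit build, a quick way to check that at runtime, using only the standard library, is:)
import struct
import sys

print(struct.calcsize("P") * 8)   # pointer size in bits: 32 or 64
print(sys.maxsize > 2**32)        # True on a 64-bit build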
Answer 1:
The recommendation from an earlier question on this was to use sys.getsizeof(), quoting:
>>> import sys
>>> x = 2
>>> sys.getsizeof(x)
14
>>> sys.getsizeof(sys.getsizeof)
32
>>> sys.getsizeof('this')
38
>>> sys.getsizeof('this also')
48
You could take this approach:
>>> import sys
>>> import decimal
>>>
>>> d = {
... "int": 0,
... "float": 0.0,
... "dict": dict(),
... "set": set(),
... "tuple": tuple(),
... "list": list(),
... "str": "a",
... "unicode": u"a",
... "decimal": decimal.Decimal(0),
... "object": object(),
... }
>>> for k, v in sorted(d.iteritems()):
... print k, sys.getsizeof(v)
...
decimal 40
dict 140
float 16
int 12
list 36
object 8
set 116
str 25
tuple 28
unicode 28
Update (2012-09-30):
Python 2.7 (Linux, 32-bit):
decimal 36
dict 136
float 16
int 12
list 32
object 8
set 112
str 22
tuple 24
unicode 32
Python 3.3 (Linux, 32-bit):
decimal 52
dict 144
float 16
int 14
list 32
object 8
set 112
str 26
tuple 24
unicode 26
Update (2016-08-01):
OSX, Python 2.7.10 (default, Oct 23 2015, 19:19:21) [GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.0.59.5)] on darwin
decimal 80
dict 280
float 24
int 24
list 72
object 16
set 232
str 38
tuple 56
unicode 52
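For anyone rerunning the snippet on Python 3, only iteritems() and the print statement need to change; the exact numbers you get will vary by interpreter version and platform:
import sys
import decimal

d = {
    "int": 0,
    "float": 0.0,
    "dict": dict(),
    "set": set(),
    "tuple": tuple(),
    "list": list(),
    "str": "a",
    "unicode": u"a",
    "decimal": decimal.Decimal(0),
    "object": object(),
}
for k, v in sorted(d.items()):
    print(k, sys.getsizeof(v))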
Answer 2:
I've been happily using pympler for such tasks. It's compatible with many versions of Python -- the asizeof
module in particular goes back to 2.2!
For example, using hughdbrown's example but with from pympler import asizeof
at the start and print asizeof.asizeof(v)
at the end, I see (system Python 2.5 on MacOSX 10.5):
$ python pymp.py
set 120
unicode 32
tuple 32
int 16
decimal 152
float 16
list 40
object 0
dict 144
str 32
Clearly there is some approximation here, but I've found it very useful for footprint analysis and tuning.
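As a rough sketch of how that looks on your own nested data (assuming pympler is installed, e.g. via pip install pympler; the example dict is made up):
from pympler import asizeof

data = {"name": "example", "values": list(range(1000))}
print(asizeof.asizeof(data))             # deep size, following references
print(asizeof.asizeof(data["values"]))   # deep size of just the inner list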
Answer 3:
These answers all collect shallow size information. I suspect that visitors to this question will end up here looking to answer the question, "How big is this complex object in memory?"
There's a great answer here: https://goshippo.com/blog/measure-real-size-any-python-object/
The punchline:
import sys

def get_size(obj, seen=None):
    """Recursively finds size of objects"""
    size = sys.getsizeof(obj)
    if seen is None:
        seen = set()
    obj_id = id(obj)
    if obj_id in seen:
        return 0
    # Important: mark as seen *before* entering recursion to gracefully handle
    # self-referential objects
    seen.add(obj_id)
    if isinstance(obj, dict):
        size += sum([get_size(v, seen) for v in obj.values()])
        size += sum([get_size(k, seen) for k in obj.keys()])
    elif hasattr(obj, '__dict__'):
        size += get_size(obj.__dict__, seen)
    elif hasattr(obj, '__iter__') and not isinstance(obj, (str, bytes, bytearray)):
        size += sum([get_size(i, seen) for i in obj])
    return size
Used like so:
In [1]: get_size(1)
Out[1]: 24
In [2]: get_size([1])
Out[2]: 104
In [3]: get_size([[1]])
Out[3]: 184
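Because of the hasattr(obj, '__dict__') branch, it also picks up instance attributes; for example, with a small made-up class:
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(1.0, 2.0)
print(get_size(p))   # counts the instance, its __dict__, the key strings and the two floats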
If you want to know Python's memory model more deeply, there's a great article here that has a similar "total size" snippet of code as part of a longer explanation: https://code.tutsplus.com/tutorials/understand-how-much-memory-your-python-objects-use--cms-25609
Answer 4:
Try the memory_profiler module. It reports memory usage line by line:
Line #    Mem usage    Increment   Line Contents
================================================
     3                             @profile
     4      5.97 MB      0.00 MB   def my_func():
     5     13.61 MB      7.64 MB       a = [1] * (10 ** 6)
     6    166.20 MB    152.59 MB       b = [2] * (2 * 10 ** 7)
     7     13.61 MB   -152.59 MB       del b
     8     13.61 MB      0.00 MB       return a
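To reproduce that kind of line-by-line report, a minimal sketch (the file name is made up; install with pip install memory-profiler):
# profile_demo.py  (hypothetical file name)
from memory_profiler import profile

@profile
def my_func():
    a = [1] * (10 ** 6)
    b = [2] * (2 * 10 ** 7)
    del b
    return a

if __name__ == "__main__":
    my_func()
Running python profile_demo.py (or python -m memory_profiler profile_demo.py) prints a table like the one above.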
Answer 5:
You can also use the guppy module.
>>> from guppy import hpy; hp=hpy()
>>> hp.heap()
Partition of a set of 25853 objects. Total size = 3320992 bytes.
Index Count % Size % Cumulative % Kind (class / dict of class)
0 11731 45 929072 28 929072 28 str
1 5832 23 469760 14 1398832 42 tuple
2 324 1 277728 8 1676560 50 dict (no owner)
3 70 0 216976 7 1893536 57 dict of module
4 199 1 210856 6 2104392 63 dict of type
5 1627 6 208256 6 2312648 70 types.CodeType
6 1592 6 191040 6 2503688 75 function
7 199 1 177008 5 2680696 81 type
8 124 0 135328 4 2816024 85 dict of class
9 1045 4 83600 3 2899624 87 __builtin__.wrapper_descriptor
<90 more rows. Type e.g. '_.more' to view.>
And:
>>> hp.iso(1, [1], "1", (1,), {1:1}, None)
Partition of a set of 6 objects. Total size = 560 bytes.
Index Count % Size % Cumulative % Kind (class / dict of class)
0 1 17 280 50 280 50 dict (no owner)
1 1 17 136 24 416 74 list
2 1 17 64 11 480 86 tuple
3 1 17 40 7 520 93 str
4 1 17 24 4 544 97 int
5 1 17 16 3 560 100 types.NoneType
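Note that the original guppy targets Python 2; on Python 3 the guppy3 fork keeps the same hpy() interface, so a hedged sketch would be:
from guppy import hpy   # provided by the guppy3 package on Python 3

hp = hpy()
hp.setrelheap()                        # baseline: ignore everything allocated so far
data = [str(i) for i in range(10000)]
print(hp.heap())                       # heap snapshot relative to that baseline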
Answer 6:
When you use the dir([object]) built-in function, you can see that every object has a __sizeof__ built-in method, which you can call directly:
>>> a = -1
>>> a.__sizeof__()
24
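Note that __sizeof__() returns the size reported by the type itself, while sys.getsizeof() adds garbage-collector overhead for objects tracked by the GC, so the two can differ slightly:
import sys

x = [1, 2, 3]
print(x.__sizeof__())    # size reported by the list type
print(sys.getsizeof(x))  # the same plus GC tracking overhead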
Source: https://stackoverflow.com/questions/1331471/in-memory-size-of-a-python-structure