Does PEP 412 make __slots__ redundant?


No, PEP 412 does not make __slots__ redundant.


First, Armin Rigo is right that you're not measuring it properly. What you need to measure is the size of the object, plus the values, plus the __dict__ itself (for NoSlots only) and the keys (for NoSlots only).

Or you could do what he suggests:

import sys
import tracemalloc

# Slots and NoSlots are the two test classes from the question.
cls = Slots if len(sys.argv) > 1 else NoSlots

def f():
    tracemalloc.start()
    objs = [cls() for _ in range(100000)]
    print(tracemalloc.get_traced_memory())

f()

When I run this on 64-bit CPython 3.4 on OS X, I get 8824968 for Slots and 25624872 for NoSlots. So, it looks like a NoSlots instance takes 256 bytes, while a Slots instance takes only 88.


How is this possible?

Because there are still two differences between __slots__ and a key-split __dict__.

First, the hash tables used by dictionaries are kept below 2/3rds full, and they grow exponentially and have a minimum size, so you're going to have some extra space. And it's not hard to work out how much space by looking at the nicely-commented source: you're going to have 8 hash buckets instead of 5 slots pointers.
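A back-of-the-envelope sketch of that storage difference, assuming 8-byte pointers on 64-bit CPython (the 8-bucket figure follows from the power-of-two table sizes, the minimum table size of 8, and the 2/3rds load limit):

```python
POINTER = 8  # bytes per pointer on 64-bit CPython (assumption)

# 5 attributes stored directly as slot pointers:
slots_storage = 5 * POINTER

# A hash table kept below 2/3rds full needs at least 8 buckets to hold
# 5 keys (table sizes are powers of two, with a minimum of 8):
dict_buckets = 8 * POINTER

print(slots_storage, dict_buckets)
```

So even before counting the dict object itself, the hash table spends 64 bytes where the slots spend 40.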

Second, the dictionary itself isn't free; it has a standard object header, a count, and two pointers. That might not sound like a lot, but when you're talking about an object that's only got a few attributes (note that most objects only have a few attributes…), the dict header can make as much difference as the hash table.
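You can see both effects at once by summing what `sys.getsizeof` reports for an instance and its `__dict__`, then comparing against a slotted instance. This is a rough sketch (the exact numbers vary by CPython version, and the shared key table isn't counted), with two hypothetical five-attribute classes:

```python
import sys

class NoSlots:
    def __init__(self):
        self.a, self.b, self.c, self.d, self.e = 1, 2, 3, 4, 5

class Slots:
    __slots__ = ('a', 'b', 'c', 'd', 'e')
    def __init__(self):
        self.a, self.b, self.c, self.d, self.e = 1, 2, 3, 4, 5

ns, s = NoSlots(), Slots()

# The instance alone plus the dict it drags along:
no_slots_total = sys.getsizeof(ns) + sys.getsizeof(ns.__dict__)
# The slotted instance stores its value pointers inline:
slots_total = sys.getsizeof(s)

print(no_slots_total, slots_total)
```

The dict header and its hash table push the NoSlots footprint well above the slotted one, even for a handful of attributes.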

And of course in your example the values cost the same either way, so the only cost that differs here is the object itself, plus the 5 slots or the 8 hash buckets and dict header, which makes the difference pretty dramatic. In real life, __slots__ will rarely be that much of a benefit.


Finally, notice that PEP 412 only claims:

Benchmarking shows that memory use is reduced by 10% to 20% for object-oriented programs

Think about where you use __slots__. Either the savings are so huge that not using __slots__ would be ridiculous, or you really need to squeeze out that last 15%. Or you're building an ABC or other class that you expect to be subclassed by who-knows-what and the subclasses might need the savings. At any rate, in those cases, the fact that you get half the benefit without __slots__, or even two thirds the benefit, is still rarely going to be enough; you'll still need to use __slots__.

The real win is in the cases where it isn't worth using __slots__; you'll get a small benefit for free.

(Also, there are definitely some programmers who overuse the hell out of __slots__, and maybe this change can convince some of them to put their energy into micro-optimizing something else not quite as irrelevant, if you're lucky.)

The problem is sys.getsizeof(), which rarely returns what you expect. For example in this case it counts the "size" of an object without accounting for the size of its __dict__. I suggest you retry by measuring the real memory usage of creating 100'000 instances.
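A quick way to see the problem: the size sys.getsizeof() reports for an instance does not change no matter how many attributes you add, because the attributes live in the separately allocated __dict__. A minimal sketch with a hypothetical empty class:

```python
import sys

class C:
    pass

obj = C()
empty = sys.getsizeof(obj)

# Add five attributes; they all go into obj.__dict__:
for name in 'abcde':
    setattr(obj, name, 0)

# getsizeof reports the same instance size despite the new attributes;
# the dict holding them must be measured separately.
print(empty, sys.getsizeof(obj), sys.getsizeof(obj.__dict__))
```

That's why a whole-program measurement like tracemalloc, over many instances, gives a truthful comparison where getsizeof() misleads.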

Note also that the Python 3.3 behavior was inspired by PyPy, in which __slots__ makes no difference, so I would expect it to make no difference in Python 3.3 too. As far as I can tell, __slots__ is almost never of any use now.
