Looking up an object in a cache is cheaper than creating a new object. However, if you cached every int, you would be wasting memory on objects that only get used once. Further, the larger the cache, the more expensive any lookup might become.
The choice of the interval [-5, 256] is somewhat arbitrary, but it is based on the observation that the cache stays small enough, and those values are used frequently enough, to justify creating them at startup rather than caching them only on demand.
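A quick way to see the interval's boundary, sketched here as a CPython implementation detail (object identity for ints is never a language guarantee); `int(...)` is used to build the values at runtime so the compiler's constant handling doesn't interfere:

```python
# CPython caches small ints in [-5, 256] at startup; values built at
# runtime inside that range reuse the cached object.
a = int("256")
b = int("256")
print(a is b)   # same cached object

# 257 is outside the interval, so each call creates a fresh int object.
c = int("257")
d = int("257")
print(c is d)
```

Relying on `is` for int comparisons is exactly the trap this behavior sets; always use `==` for values.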
In your example, though, the compiler can see, while compiling the code, that 267 is used twice, and so can store the value once in the compiled code object's constants and reuse it, independently of the small ints that are cached at startup, before any code is compiled.
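You can observe this constant sharing directly: `compile()` and the `co_consts` attribute are real CPython features, though the snippet itself is just an illustration:

```python
# Compile two assignments together and inspect the code object's
# constants table: 267 is stored once and reused for both names.
code = compile("a = 267\nb = 267", "<example>", "exec")
print(code.co_consts)
print(code.co_consts.count(267))  # the literal appears only once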
I can reproduce the shared use of 267 if I put the code in a file and execute it. If I enter each line individually in the interactive interpreter, however, separate objects are created: when a = 267 is executed, it isn't yet known that another use of 267 will follow, so the value isn't reused. Putting both assignments on the same line, so they are compiled together, does let the sharing happen:
>>> a = 267
>>> print(id(a))
4325771792
>>> b = 267
>>> print(id(b))
4326331344
>>> a = 267; b = 267
>>> print(id(a), id(b))
4326333264 4326333264
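The interactive behavior can be simulated by compiling each statement separately, as the REPL does; this is a sketch of the mechanism, assuming CPython's per-code-object constant handling:

```python
# Each compile() call produces its own code object with its own 267
# constant, so the two names end up bound to distinct objects.
ns = {}
exec(compile("a = 267", "<stdin>", "exec"), ns)
exec(compile("b = 267", "<stdin>", "exec"), ns)
print(ns["a"] is ns["b"])   # distinct objects, as in the REPL

# Compiling both statements together shares the constant, as in the file.
exec(compile("a = 267\nb = 267", "<stdin>", "exec"), ns)
print(ns["a"] is ns["b"])   # same object
```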