More and more features of Python are moving to be "lazy executable", like generator expressions and other kinds of iterators. Sometimes, however, I see myself wanting to roll a one-liner that consumes an iterable just for its side effects.
There is one obvious way to do it, and that is the way you should do it. There is no excuse for doing it a clever way.
a = open("numbers.txt", "w")
for i in xrange(100):
    a.write("%d " % i)
a.close()
Lazy execution gives you a serious benefit: it allows you to pass a sequence to another piece of code without having to hold the entire thing in memory. It exists for the creation of efficient sequences as data types.
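For instance, a quick sketch of that benefit: a generator expression can feed an aggregate function one value at a time, so the full sequence never has to exist as a list (range here is the Python 3 spelling):

# The generator expression hands sum() one value at a time;
# the full sequence of squares is never materialized in memory.
total = sum(i * i for i in range(1000000))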
In this case, you do not want lazy execution. You want execution. You can just ... execute. With a for loop.
There are many accumulators which have the effect of consuming the whole iterable they're given, such as min or max -- but even they don't entirely ignore the results yielded in the process (min and max, for example, will raise an exception if some of the results are complex numbers). I don't think there's a built-in accumulator that does exactly what you want -- you'll have to write (and add to your personal stash of tiny utility functions) a tiny utility function such as
def consume(iterable):
    for item in iterable: pass
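For example, the write loop from the question could then be squeezed into a one-liner around that helper (a sketch using Python 3's range; the return values of a.write are simply thrown away):

a = open("numbers.txt", "w")
consume(a.write("%d " % i) for i in range(100))
a.close()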
The main reason, I guess, is that Python has a for statement and you're supposed to use it when it fits like a glove (i.e., for the cases you'd want consume for ;-).
BTW, a.write returns None, which is falsish, so any will actually consume it (and a.writelines will do even better!). But I realize you were just giving that as an example ;-).
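A small sketch of both points; note that in Python 3 file.write returns a character count rather than None, so any would short-circuit there, but print, which returns None, shows the same effect:

# print() returns None (falsy), so any() never short-circuits
# and ends up exhausting the whole generator expression.
any(print(i) for i in range(5))

# writelines() takes the iterable directly, no wrapper needed:
a = open("numbers.txt", "w")
a.writelines("%d " % i for i in range(100))
a.close()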
It is 2019 - and this is a question from 2010 that keeps showing up. A recent thread on one of Python's mailing lists spawned over 70 e-mails on this subject, and they refused again to add a consume call to the language.
On that thread, the most efficient way to do that actually showed up, and it is far from being obvious, so I am posting it as the answer here:
from collections import deque

consume = deque(maxlen=0).extend
And then use the consume callable to process generator expressions. It turns out the deque native code in CPython is actually optimized for the maxlen=0 case, and will just consume the iterable.
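A quick illustration of the trick (the squares list and the append calls are just for demonstration): the generator is driven to exhaustion in native code, and the zero-length deque stores nothing:

from collections import deque

consume = deque(maxlen=0).extend

squares = []
# Each append() returns None; the deque discards every item
# because its maximum length is zero.
consume(squares.append(i * i) for i in range(10))
# squares is now [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]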
The any and all calls I mentioned in the question should be just as efficient, but one has to worry about the truthiness of the expression in order for the whole iterable to be consumed.
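A small illustration of that caveat: any stops at the first truthy result, so a side-effecting expression that yields truthy values is only partially consumed (results and the append calls are just for demonstration):

results = []
# append() returns None, so any() has to run the whole generator:
any(results.append(i) for i in range(5))
# results is now [0, 1, 2, 3, 4]

results = []
# A truthy return value makes any() stop at the first item:
any(results.append(i) or True for i in range(5))
# results is now [0]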
I see this still may be controversial; after all, an explicit two-line for loop can handle this. I remembered this question because I just made a commit where I create some threads, start them, and join them back - without a consume callable, that is 4 lines of mostly boilerplate, and without the benefit of cycling through the iterable in native code:
https://github.com/jsbueno/extracontext/blob/a5d24be882f9aa18eb19effe3c2cf20c42135ed8/tests/test_thread.py#L27
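The pattern described there looks roughly like this sketch (worker and the thread count are made up for illustration; the real code is in the linked commit):

import threading
from collections import deque

consume = deque(maxlen=0).extend

def worker(n):
    pass  # stand-in for the real work

# Explicit version: two for loops of boilerplate.
threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With a consume callable, starting and joining are single expressions:
threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
consume(t.start() for t in threads)
consume(t.join() for t in threads)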
If I wanted to do this specific example, I'd write
for i in xrange(100): a.write('%d ' % i)
If I often needed to consume an iterator for its effect, I'd define
def for_effect(iterable):
    for _ in iterable:
        pass
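For instance, keeping this answer's xrange spelling, the same file example becomes:

a = open('numbers.txt', 'w')
for_effect(a.write('%d ' % i) for i in xrange(100))
a.close()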