The combination of coroutines and resource acquisition seems like it could have some unintended (or unintuitive) consequences.
The basic question is whether or not …
I don't really understand what conflict you're asking about, nor the problem with the example: it's fine to have two coexisting, independent handles to the same file.
One thing I didn't know, and learned in response to your question, is that there is a new close() method on generators:
close() raises a new GeneratorExit exception inside the generator to terminate the iteration. On receiving this exception, the generator's code must either raise GeneratorExit or StopIteration.

close() is called when a generator is garbage-collected, so this means the generator's code gets one last chance to run before the generator is destroyed. This last chance means that try...finally statements in generators can now be guaranteed to work; the finally clause will now always get a chance to run. This seems like a minor bit of language trivia, but using generators and try...finally is actually necessary in order to implement the with statement described by PEP 343.

http://docs.python.org/whatsnew/2.5.html#pep-342-new-generator-features
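As a quick sketch of that behavior (modern Python 3 syntax, with a hypothetical cleaned_up list standing in for real cleanup work), close() gives a suspended generator's finally clause a chance to run:

```python
# close() raises GeneratorExit at the suspended yield,
# so the finally clause runs even though iteration never finished.
cleaned_up = []

def gen():
    try:
        yield 'spam'
        yield 'eggs'
    finally:
        cleaned_up.append(True)  # hypothetical cleanup marker

g = gen()
next(g)    # advance to the first yield
g.close()  # GeneratorExit is raised inside the generator; finally runs
```

After g.close() returns, cleaned_up holds one entry, confirming the finally clause executed.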
So that handles the situation where a with statement is used in a generator that yields in the middle of the block and never returns: the context manager's __exit__ method will be called when the generator is garbage-collected.
Edit:
With regards to the file handle issue: I sometimes forget that there exist platforms that aren't POSIX-like. :)
As far as locks go, I think Rafał Dowgird hits the nail on the head when he says "You just have to be aware that the generator is just like any other object that holds resources." I don't think the with statement is really that relevant here, since this function suffers from the same deadlock issue:
import threading

lock = threading.Lock()

def coroutine():
    lock.acquire()
    yield 'spam'
    yield 'eggs'
    lock.release()  # never reached if the generator is abandoned

generator = coroutine()
next(generator)  # runs up to the first yield; the lock is now held
lock.acquire()   # whoops! deadlock: the lock is never released
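One way to make the cleanup deterministic is to pair try...finally inside the generator with contextlib.closing at the call site, so close() is guaranteed to run. A sketch, assuming a threading.Lock:

```python
import threading
from contextlib import closing

lock = threading.Lock()

def coroutine():
    lock.acquire()
    try:
        yield 'spam'
        yield 'eggs'
    finally:
        lock.release()  # runs on normal exit, exceptions, or close()

# closing() calls generator.close() on exit, which raises GeneratorExit
# at the suspended yield and lets the finally clause release the lock.
with closing(coroutine()) as generator:
    next(generator)

acquired_again = lock.acquire(blocking=False)  # True: no deadlock
if acquired_again:
    lock.release()
```

This keeps the caller honest: however the with block is left, the generator is closed and the lock is released.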