It's a common idiom in Python to use a context manager to automatically close files:
with open('filename') as my_file:
    # do something with my_file
# my_file is automatically closed here
Is it safe to combine 'with' and 'yield' in Python?
I don't think you should do this.
Let me demonstrate by creating some files:
>>> for f in 'abc':
...     with open(f, 'w') as _: pass
Convince ourselves that the files are there:
>>> for f in 'abc':
...     with open(f) as _: pass
And here's a function that recreates your code:
def gen_abc():
    for f in 'abc':
        with open(f) as file:
            yield file
At first it looks like you can use the function just fine:
>>> [f.closed for f in gen_abc()]
[False, False, False]
But let's first collect all of the file objects with a list comprehension:
>>> l = [f for f in gen_abc()]
>>> l
[<_io.TextIOWrapper name='a' mode='r' encoding='cp1252'>, <_io.TextIOWrapper name='b' mode='r' encoding='cp1252'>, <_io.TextIOWrapper name='c' mode='r' encoding='cp1252'>]
And now we see they are all closed:
>>> c = [f.closed for f in l]
>>> c
[True, True, True]
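What's going on: each file is only open while the generator is suspended inside its 'with' block; advancing the generator exits that block and closes the file. Here is a minimal, self-contained sketch (it creates the files itself so it can run anywhere):

```python
def gen_abc():
    # same generator as above: each file is yielded from inside a with block
    for name in 'abc':
        with open(name) as file:
            yield file

# create the files so the example is self-contained
for name in 'abc':
    with open(name, 'w') as _:
        pass

gen = gen_abc()
f = next(gen)
print(f.closed)   # False: the generator is suspended inside the with block
next(gen)
print(f.closed)   # True: advancing past the with block closed the file
```

So the file object you get from the generator is only guaranteed to be open until the next call to next() (or until the generator is exhausted or garbage-collected).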
Each file only stays open while the generator is suspended inside its 'with' block; as soon as the generator advances past it (or is exhausted or garbage-collected), that file is closed. I doubt that is what you want: even with lazy evaluation, your last file will probably be closed before you're done using it.
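If you actually need several files open at the same time, one common alternative (a sketch, not the only option) is contextlib.ExitStack, which keeps everything open until the caller's own 'with' block ends:

```python
from contextlib import ExitStack

# create the files so the example is self-contained
for name in 'abc':
    with open(name, 'w') as _:
        pass

with ExitStack() as stack:
    # all three files stay open for the whole block
    files = [stack.enter_context(open(name)) for name in 'abc']
    print([f.closed for f in files])   # [False, False, False]
# leaving the block closes everything the stack entered
print([f.closed for f in files])       # [True, True, True]
```

This way the lifetime of the files is tied to a block the caller controls, instead of to the hidden state of a suspended generator.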