I want to change the following code

for directory, dirs, files in os.walk(directory_1):
    do_something()

for directory, dirs, files in os.walk(directory_2):
    do_something()

so that both walks are handled by a single loop.
An example using itertools.chain:
from itertools import chain

def generator1():
    for item in 'abcdef':
        yield item

def generator2():
    for item in '123456':
        yield item

generator3 = chain(generator1(), generator2())
for item in generator3:
    print(item)
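Applied to the os.walk loops from the question, the same idea looks roughly like this (assuming directory_1, directory_2 and do_something() are defined as in the question):

import os
from itertools import chain

# chain() yields everything from the first walk, then everything from the second
for directory, dirs, files in chain(os.walk(directory_1), os.walk(directory_2)):
    do_something()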
In Python (3.5 or greater) you can do:
def concat(a, b):
    yield from a
    yield from b
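For example, a quick check of the function above (yield from accepts any iterable, not just generators):

ab = concat('abc', '123')
print(list(ab))  # ['a', 'b', 'c', '1', '2', '3']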
Here it is using a generator expression with nested fors:
a = range(3)
b = range(5)
ab = (i for it in (a, b) for i in it)
assert list(ab) == [0, 1, 2, 0, 1, 2, 3, 4]
I would say that, as suggested in comments by user "wjandrea", the best solution is
def concat_generators(*args):
    for gen in args:
        yield from gen
It does not change the returned type and is really pythonic.
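For example, reusing generator1 and generator2 from the chain example above:

combined = concat_generators(generator1(), generator2())
print(list(combined))  # ['a', 'b', 'c', 'd', 'e', 'f', '1', '2', '3', '4', '5', '6']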
If you want to keep the generators separate but still iterate over them at the same time, you can use zip():
NOTE: Iteration stops at the shorter of the two generators.
For example:
for (root1, dir1, files1), (root2, dir2, files2) in zip(os.walk(path1), os.walk(path2)):
    for file in files1:
        ...  # do something with the first list of files
    for file in files2:
        ...  # do something with the second list of files
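A minimal illustration of the length behaviour mentioned in the note above:

print(list(zip(range(3), range(5))))  # [(0, 0), (1, 1), (2, 2)]; stops after the shorter input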
Let's say that we have two generators (gen1 and gen2) and we want to perform some extra calculation that requires the output of both. We can produce that combined result with the built-in map(), which returns an iterator that we can loop over.
In this scenario, the calculation is implemented via a lambda function (any callable would do). The tricky part is what we aim to do inside map() and its lambda function.
General form of the proposed solution:
def function(gen1, gen2):
    for item in map(lambda x, y: do_something(x, y), gen1, gen2):
        yield item
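For instance, a minimal sketch where the combined calculation is simple addition (do_something, gen1 and gen2 here are hypothetical stand-ins):

def do_something(x, y):
    return x + y

gen1 = (i for i in range(3))
gen2 = (i * 10 for i in range(3))

for item in function(gen1, gen2):
    print(item)  # prints 0, 11, 22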