Question
I am looking for an implementation of the fork-join model for Python. Like Java's ForkJoinPool, it should allow the work of a task to be split (forked) into several subtasks recursively. Once the subtasks are completed, the results are joined and returned. Ideally, it should support both threads and processes, similar to ThreadPoolExecutor and ProcessPoolExecutor in concurrent.futures, but threads are more important for now. It must allow limiting the number of threads (I want one thread per core). I am aware that this will only be useful if the code releases the GIL.
Example from Wikipedia to clarify the fork-join model:
solve(problem):
    if problem is small enough:
        solve problem directly (sequential algorithm)
    else:
        for part in subdivide(problem)
            fork subtask to solve(part)
        join all subtasks spawned in previous loop
        return combined results
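For concreteness, here is that pseudocode as plain sequential Python, with summing a list as a stand-in problem; what I am after is a pool that lets the recursive calls be forked onto threads or processes and joined afterwards:

# Sequential stand-in: summing a list recursively (the summing problem
# is only an illustration, not part of the question).
def solve(data):
    if len(data) <= 1000:                       # problem is small enough
        return sum(data)                        # solve directly (sequential algorithm)
    mid = len(data) // 2
    parts = (data[:mid], data[mid:])            # subdivide(problem)
    results = [solve(part) for part in parts]   # these calls are what I want to fork
    return sum(results)                         # join and combine results

print(solve(list(range(10_000))))               # 49995000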
Is there such a library in Python? I could not find one.
Answer 1:
I figure that what you want is to collect the results; multiprocessing.Pool.starmap() might be the choice. Here is an example:
import multiprocessing as mp

def func(x, y):
    return x + y

if __name__ == "__main__":
    # One worker process per core; starmap unpacks each tuple into arguments.
    with mp.Pool(mp.cpu_count()) as p:
        l = p.starmap(func, [(1, 2), (2, 3), (3, 4)])
    print(l)  # [3, 5, 7]
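starmap() covers the flat case where all sub-problems are known up front. If you want to keep the recursive subdivide-and-combine structure from your pseudocode, one sketch (again summing a list as a stand-in problem; THRESHOLD is an arbitrary cutoff) is to do the recursion and the joining in the calling thread and submit only the leaf chunks to a concurrent.futures executor. That sidesteps the deadlock you can hit when workers of a bounded pool block on futures of subtasks that are still sitting in the queue:

import os
from concurrent.futures import ThreadPoolExecutor

THRESHOLD = 100_000  # arbitrary cutoff: chunks at or below this size are solved directly

def fork(executor, data):
    # Recursively subdivide; submit each small-enough chunk to the pool.
    if len(data) <= THRESHOLD:
        return [executor.submit(sum, data)]     # solve the leaf directly (in a worker)
    mid = len(data) // 2
    return fork(executor, data[:mid]) + fork(executor, data[mid:])

def solve(executor, data):
    futures = fork(executor, data)              # fork phase
    return sum(f.result() for f in futures)     # join phase: combine the results

if __name__ == "__main__":
    data = list(range(1_000_000))
    # One worker per core, as requested; ProcessPoolExecutor drops in the same way.
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        print(solve(pool, data))                # 499999500000

The join here is a flat sum over the leaf futures rather than a pairwise tree combine, but the subdivision is still recursive, and only the calling thread ever blocks on a result.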
Source: https://stackoverflow.com/questions/54938449/fork-join-model-implementation-for-python-equivalent-to-javas-forkjoinpool