Question
I am doing some calculations on large collections of bytes. The process runs on chunks of bytes, and I am trying to use multiprocessing to speed it up. Initially I tried pool.map, but that only allows a single argument, so I found pool.starmap. However, pool.starmap only returns results once all the processes have finished, and I want results as they come (sort of). Then I tried pool.imap, which does provide results as processes finish, but does not allow multiple arguments (my function requires two). Also, the order of the results matters.
Some sample code below:
import multiprocessing as mp
from itertools import repeat

pool = mp.Pool(processes=4)
y = []
for x in pool.starmap(f, zip(da, repeat(db))):
    y.append(x)
The above code works, but it only gives the results once all the processes have completed, so I cannot see any progress. This is why I tried pool.imap, which works well, but only with a single argument:
pool = mp.Pool(processes=4)
y = []
for x in pool.imap(f, da):
    y.append(x)
Calling it with multiple arguments raises the following exception:
TypeError: f() missing 1 required positional argument: 'd'
I am looking for a simple way to achieve all three requirements:
- parallel processing with multiple parameters/arguments
- the ability to see progress while the processes are running
- ordered results.
Thanks!
Answer 1:
I can answer the first two questions pretty quickly. I think you should be able to handle the third after understanding the first two.
1. Parallel Processing with Multiple Arguments
I'm not sure about a true "starmap" equivalent, but here's an alternative. What I've done in the past is condense my arguments into a single data object, like a list. For example, if you want to pass three arguments to your map_function, you can bundle those arguments into a list and then use that list with the .map() or .imap() function.
from multiprocessing import Pool

def map_function(combo):
    # Unpack the bundled arguments
    a, b, c = combo
    return a + b + c

if __name__ == '__main__':
    # One bundled argument list/tuple per call to map_function
    combos = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]
    pool = Pool(processes=4)
    results = pool.map(map_function, combos)
2. Tracking Progress
A good way to do this is using multiprocessing's shared Value. I actually asked this (almost) exact same question about a month ago. A shared value allows you to manipulate the same variable from the different processes created by your map function. For the sake of learning, I'm going to let you read and figure out the shared-state solution on your own. If you're still having trouble after a few attempts, I'll be more than happy to help you, but I believe that teaching yourself how to understand something is much more valuable than me giving you the answer.
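For reference, here is one minimal sketch of that shared-state idea (the worker function, chunk count, and polling interval below are illustrative, not from the original answer): workers increment a shared Value counter, which the parent polls while map_async runs.

import multiprocessing as mp

def init_worker(shared_counter):
    # Make the shared counter visible inside each worker process
    global counter
    counter = shared_counter

def work(x):
    with counter.get_lock():   # a synchronized Value carries its own lock
        counter.value += 1
    return x * x

if __name__ == '__main__':
    counter = mp.Value('i', 0)  # shared integer, initialized to 0
    pool = mp.Pool(processes=4, initializer=init_worker, initargs=(counter,))
    async_result = pool.map_async(work, range(100))
    while not async_result.ready():
        print('progress:', counter.value, '/ 100')
        async_result.wait(0.5)
    print(async_result.get()[:5])  # ordered results: [0, 1, 4, 9, 16]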
Hope this helps!!
Answer 2:
I think this solution exactly meets your 3 requirements: https://stackoverflow.com/a/28382913/2379433
In short, p = Pool(); p.imap will enable you to see progress and maintain order. If you want map functions with multiple arguments, you can use a fork of multiprocessing that provides better serialization and multiple arguments. See the link for an example.
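As a sketch of what that looks like, assuming the fork in question is pathos (which the linked answer uses; pip install pathos), whose pools accept one iterable per function argument, like the builtin map:

from pathos.pools import ProcessPool

def f(a, b):
    return a + b

if __name__ == '__main__':
    pool = ProcessPool(nodes=4)
    # One iterable per argument of f; imap yields results in input order
    for result in pool.imap(f, [1, 2, 3], [10, 10, 10]):
        print(result)  # 11, then 12, then 13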
Answer 3:
You can simulate starmap using imap via the functools.partial() function:
import functools
import multiprocessing as mp

def my_function(constant, my_list, optional_param=None):
    print(locals())

with mp.Pool() as pool:
    list(pool.imap(functools.partial(my_function, 2, optional_param=3),
                   [1, 2, 3, 4, 5]))
Outputs (note that the lines print in completion order, but imap still yields the return values in input order):
$ python3 foo.py
{'optional_param': 3, 'my_list': 1, 'constant': 2}
{'optional_param': 3, 'my_list': 3, 'constant': 2}
{'optional_param': 3, 'my_list': 2, 'constant': 2}
{'optional_param': 3, 'my_list': 4, 'constant': 2}
{'optional_param': 3, 'my_list': 5, 'constant': 2}
Source: https://stackoverflow.com/questions/32515389/does-multiprocessing-pool-imap-has-a-variant-like-starmap-that-allows-for-mult