So I'm trying to implement multiprocessing in Python, where I want a Pool of 4-5 processes running a method in parallel. The purpose of this is to run a total of tho…
Since you're only returning state from the child process to the parent process, using a shared array and explicit locks is overkill. You can use Pool.map or Pool.starmap to accomplish exactly what you need. For example:
from multiprocessing import Pool


class Adder:
    """I'm using this class in place of a Monte Carlo simulator."""

    def add(self, a, b):
        return a + b


def setup(x, y, z):
    """Sets up the worker processes of the pool.

    Here, x, y, and z would be your global settings. They are only included
    as an example of how to pass args to setup. In this program they would
    be "some arg", "another" and 2.
    """
    global adder
    adder = Adder()


def job(a, b):
    """Wrapper function to run the job in the child process."""
    return adder.add(a, b)


if __name__ == "__main__":
    args = list(zip(range(10), range(10, 20)))
    # args == [(0, 10), (1, 11), ..., (8, 18), (9, 19)]
    with Pool(initializer=setup, initargs=["some arg", "another", 2]) as pool:
        # runs the jobs in parallel and returns when all are complete
        results = pool.starmap(job, args)
    print(results)  # prints [10, 12, ..., 26, 28]
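For jobs that take a single argument, Pool.map works the same way. Here's a minimal sketch (the square function is just an illustration, not part of the answer above):

from multiprocessing import Pool

def square(x):
    # Pool.map passes each element of the iterable as the lone argument
    return x * x

if __name__ == "__main__":
    # 4 worker processes, matching the 4-5 the question asks for
    with Pool(4) as pool:
        results = pool.map(square, range(10))
    print(results)  # prints [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]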
Not tested, but something like this should work. The array and the lock are shared between the processes.
from multiprocessing import Process, Array, Lock

def f(array, lock, n):  # n is the dedicated location in the array
    with lock:  # equivalent to lock.acquire() ... lock.release()
        array[n] = -array[n]

if __name__ == '__main__':
    arr = Array('i', [3, -7])
    lock = Lock()
    p = Process(target=f, args=(arr, lock, 0))
    q = Process(target=f, args=(arr, lock, 1))
    p.start()
    q.start()
    q.join()
    p.join()
    print(arr[:])  # prints [-3, 7]
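To see why the lock matters, consider the classic read-modify-write race: without synchronization, concurrent updates to shared memory can be lost. This sketch is not from the answer above; it's a minimal demonstration using a shared Value:

from multiprocessing import Process, Value, Lock

def increment(counter, lock, n):
    for _ in range(n):
        with lock:
            # counter.value += 1 is a read-modify-write, so it must
            # happen atomically; the lock guarantees that
            counter.value += 1

if __name__ == '__main__':
    counter = Value('i', 0, lock=False)  # raw shared int; we synchronize ourselves
    lock = Lock()
    workers = [Process(target=increment, args=(counter, lock, 10000))
               for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(counter.value)  # reliably 40000; dropping the lock can print less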
The documentation at https://docs.python.org/3.5/library/multiprocessing.html has plenty of examples to start with.