Question
Given a list() of running multiprocessing.Process instances, how can I join on all of them and return as soon as one exits, without a Process.join timeout and looping?
Example
from multiprocessing import Process
from random import randint
from time import sleep
def run():
    sleep(randint(0, 5))

running = [Process(target=run) for i in range(10)]

for p in running:
    p.start()
How can I block until at least one Process in running exits?
What I don't want to do is:
exit = False
while not exit:
    for p in running:
        p.join(0)
        if p.exitcode is not None:
            exit = True
            break
Answer 1:
You can use multiprocessing.connection.wait() (Python 3.3+) to wait on several Process.sentinels at once. A sentinel becomes ready as soon as its Process exits, which unblocks connection.wait().
multiprocessing.connection.wait(object_list, timeout=None)
Wait till an object in object_list is ready. Returns the list of those objects in object_list which are ready. If timeout is a float then the call blocks for at most that many seconds. If timeout is None then it will block for an unlimited period. A negative timeout is equivalent to a zero timeout.
For both Unix and Windows, an object can appear in object_list if it is
a readable Connection object;
a connected and readable socket.socket object; or
the sentinel attribute of a Process object.
A connection or socket object is ready when there is data available to be read from it, or the other end has been closed. ...
from multiprocessing import Process, connection, current_process
from random import randint
from time import sleep
from datetime import datetime
def run():
    sleep(randint(2, 10))
    print(f"{datetime.now()} {current_process().name} exiting")

if __name__ == '__main__':
    pool = [Process(target=run) for _ in range(4)]
    for p in pool:
        p.start()
    print(f"{datetime.now()} {current_process().name} waiting")
    connection.wait(p.sentinel for p in pool)
    print(f"{datetime.now()} {current_process().name} unblocked")
Example Output:
2019-07-22 21:54:07.061989 MainProcess waiting
2019-07-22 21:54:09.062498 Process-3 exiting
2019-07-22 21:54:09.063565 MainProcess unblocked
2019-07-22 21:54:09.064391 Process-4 exiting
2019-07-22 21:54:14.068392 Process-2 exiting
2019-07-22 21:54:17.062045 Process-1 exiting
Process finished with exit code 0
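If you also need to know which process exited, capture the return value of connection.wait(); it returns the subset of the objects you passed in that are ready. A small sketch along the same lines as the example above:
from multiprocessing import Process, connection
from random import randint
from time import sleep

def run():
    sleep(randint(2, 10))

if __name__ == '__main__':
    pool = [Process(target=run) for _ in range(4)]
    for p in pool:
        p.start()
    # wait() returns the ready objects, i.e. the sentinels of exited processes
    ready = connection.wait(p.sentinel for p in pool)
    finished = [p for p in pool if p.sentinel in ready]
    print(f"exited first: {[p.name for p in finished]}")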
Answer 2:
There really isn't a way to do exactly what you want here as specified; that's just not how the API is set up. If you can take it up a level, to where you create the list of processes, there are a number of good solutions.
Probably the best is to use multiprocessing.Pool.imap_unordered(). It takes a function and an iterable of inputs, creates a pool of worker processes, and feeds the inputs to them. It returns an iterator whose next() blocks until a result is ready, then yields each result as it becomes available.
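A minimal sketch of that approach, assuming your work can be expressed as a function plus an iterable of inputs (the work() function and the input range here are placeholders):
from multiprocessing import Pool
from random import randint
from time import sleep

def work(x):
    sleep(randint(0, 5))   # placeholder for the real work
    return x

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        # results are yielded in completion order, so the first iteration
        # returns as soon as any worker finishes a task
        for result in pool.imap_unordered(work, range(10)):
            print(f"first result ready: {result}")
            break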
If you can't work your problem into a function plus inputs, the next solution is to use a synchronization primitive. For what I am guessing you want to accomplish, I would use a semaphore:
from multiprocessing import Process, Semaphore
from random import randint
from time import sleep

def proc_target(the_sem):
    sleep(randint(0, 5))   # do_some_work
    the_sem.release()      # signal that this process has finished

if __name__ == '__main__':
    sem = Semaphore(0)
    myprocs = [Process(target=proc_target, args=(sem,)) for _ in range(10)]

    # in your code:
    for p in myprocs:
        p.start()

    done = 0
    while done < len(myprocs):
        sem.acquire()      # returns as soon as any one process finishes
        done += 1
        # do_post_processing()
If you really don't need a loop, an Event would work as well; just wait for the first process to set it (a minimal sketch follows after the thread-pool example below). If you really can't modify the function that creates the processes in any way, the final solution I could imagine is (pretty bad, haha): use a thread pool to set up a pool of waiters, one per process.
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

def waiter(proc):
    proc.join()

with ThreadPoolExecutor(max_workers=5) as executor:
    futures = [executor.submit(waiter, p) for p in processes]
    # this will return as soon as one completes
    results = wait(futures, return_when=FIRST_COMPLETED)
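For completeness, a minimal sketch of the Event alternative mentioned above, assuming the worker function can be modified to accept and set the event (the run() body here is a placeholder):
from multiprocessing import Process, Event
from random import randint
from time import sleep

def run(first_done):
    sleep(randint(0, 5))   # placeholder work
    first_done.set()       # signal the parent that one process has finished

if __name__ == '__main__':
    first_done = Event()
    procs = [Process(target=run, args=(first_done,)) for _ in range(10)]
    for p in procs:
        p.start()
    first_done.wait()      # unblocks as soon as the first process sets the event
    print("at least one process has exited")
    for p in procs:
        p.join()           # clean up the remaining processes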
Source: https://stackoverflow.com/questions/57152073/how-to-join-a-list-of-multiprocessing-process-at-the-same-time