Multiprocessing in a pipeline done right

Backend · open · 6 answers · 1194 views
长情又很酷 2021-02-04 05:06

I'd like to know how multiprocessing is done right. Assume I have a list [1,2,3,4,5] generated by a function f1, which is written to a Queue.
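For context, a queue-based version of the pipeline described above might look like the sketch below. This is a hypothetical illustration, not code from the question: the names f2 and run_pipeline and the `* 10` processing step are made up; only f1 and the list [1,2,3,4,5] come from the question. It uses one sentinel (None) per worker so every consumer knows when to stop.

```python
import multiprocessing as mp


def f1(q, n_workers):
    """Producer: write the items to the queue, then one sentinel per worker."""
    for item in [1, 2, 3, 4, 5]:
        q.put(item)
    for _ in range(n_workers):
        q.put(None)  # sentinel so each consumer knows to stop


def f2(q_in, q_out):
    """Consumer stage: read until the sentinel, apply a placeholder step."""
    while True:
        item = q_in.get()
        if item is None:
            q_out.put(None)  # forward the sentinel downstream
            break
        q_out.put(item * 10)  # hypothetical processing step


def run_pipeline(n_workers=2):
    q1, q2 = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=f2, args=(q1, q2)) for _ in range(n_workers)]
    for w in workers:
        w.start()
    f1(q1, n_workers)
    results, done = [], 0
    while done < n_workers:  # collect until every worker's sentinel has arrived
        item = q2.get()
        if item is None:
            done += 1
        else:
            results.append(item)
    for w in workers:
        w.join()
    return sorted(results)


if __name__ == "__main__":
    print(run_pipeline())  # [10, 20, 30, 40, 50]
```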

6 Answers
  •  [愿得一人]
    2021-02-04 05:48

I use concurrent.futures with three pools, which are chained together via future.add_done_callback. Then I wait for the whole pipeline to finish by calling shutdown on each pool.

    from concurrent.futures import ProcessPoolExecutor
    import time
    import random
    
    
    def worker1(arg):
        time.sleep(random.random())  # simulate a slow first stage
        return arg
    
    
    def pipe12(future):
        # Runs in the parent process when a worker1 task completes;
        # feeds the result into the next stage.
        pool2.submit(worker2, future.result()).add_done_callback(pipe23)
    
    
    def worker2(arg):
        time.sleep(random.random())
        return arg
    
    
    def pipe23(future):
        pool3.submit(worker3, future.result()).add_done_callback(spout)
    
    
    def worker3(arg):
        time.sleep(random.random())
        return arg
    
    
    def spout(future):
        print(future.result())
    
    
    if __name__ == "__main__":
        __spec__ = None  # Fix multiprocessing in Spyder's IPython
        pool1 = ProcessPoolExecutor(2)
        pool2 = ProcessPoolExecutor(2)
        pool3 = ProcessPoolExecutor(2)
        for i in range(10):
            pool1.submit(worker1, i).add_done_callback(pipe12)
        # shutdown() waits by default (wait=True), so each stage
        # drains before its pool is torn down.
        pool1.shutdown()
        pool2.shutdown()
        pool3.shutdown()
    
