Can I somehow share an asynchronous queue with a subprocess?

被撕碎了的回忆 2020-12-24 00:32

I would like to use a queue for passing data from a parent to a child process which is launched via multiprocessing.Process. However, since the parent process uses asyncio, the queue's methods need to be non-blocking.

3 Answers
  •  礼貌的吻别
    2020-12-24 01:23

    The multiprocessing library isn't particularly well-suited for use with asyncio, unfortunately. Depending on how you were planning to use multiprocessing.Queue, however, you may be able to replace it completely with a concurrent.futures.ProcessPoolExecutor:

    import asyncio
    from concurrent.futures import ProcessPoolExecutor


    def do_proc_work(stuff, stuff2):  # This runs in a separate process
        return stuff + stuff2

    async def do_work():
        loop = asyncio.get_running_loop()
        out = await loop.run_in_executor(ProcessPoolExecutor(max_workers=1),
                                         do_proc_work, 1, 2)
        print(out)

    if __name__ == "__main__":
        asyncio.run(do_work())
    

    Output:

    3
    

    If you absolutely need a multiprocessing.Queue, it seems to behave fine when combined with ProcessPoolExecutor:

    import asyncio
    import time
    import multiprocessing
    from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor


    def do_proc_work(q, stuff, stuff2):  # This runs in a separate process
        ok = stuff + stuff2
        time.sleep(5)  # Artificial delay to show that it's running asynchronously
        print("putting output in queue")
        q.put(ok)

    async def async_get(q):
        """Calls q.get() in a separate thread.

        q.get is a blocking I/O call, so it should release the GIL
        while waiting. Ideally there would be a real non-blocking
        Queue.get call that could be awaited directly, but I don't
        think one exists for multiprocessing queues.
        """
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(ThreadPoolExecutor(max_workers=1),
                                          q.get)

    async def do_work(q):
        loop = asyncio.get_running_loop()
        loop.run_in_executor(ProcessPoolExecutor(max_workers=1),
                             do_proc_work, q, 1, 2)
        coro = async_get(q)  # You could await here; deferring it shows that it's asynchronous
        print("Getting queue result asynchronously")
        print(await coro)

    if __name__ == "__main__":
        m = multiprocessing.Manager()
        q = m.Queue()  # A Manager queue can be pickled and passed to pool
                       # workers; a plain multiprocessing.Queue can't be
        asyncio.run(do_work(q))
    

    Output:

    Getting queue result asynchronously
    putting output in queue
    3
    
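    A side note not from the original answer: on Python 3.9+, the ThreadPoolExecutor dance in async_get can be replaced with asyncio.to_thread, which runs the blocking q.get in the event loop's default thread pool. A minimal sketch (the worker process is stood in for by a direct q.put, just to keep the example short):

    import asyncio
    import multiprocessing


    async def async_get(q):
        # asyncio.to_thread (Python 3.9+) runs the blocking q.get in the
        # event loop's default thread pool, keeping the loop responsive.
        return await asyncio.to_thread(q.get)

    async def main():
        m = multiprocessing.Manager()
        q = m.Queue()  # Manager queues can be shared with worker processes
        q.put(3)       # stand-in for the worker process's q.put(ok)
        print(await async_get(q))  # 3

    if __name__ == "__main__":
        asyncio.run(main())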
