Combining asyncio with a multi-worker ProcessPoolExecutor

Backend · open · 2 answers · 905 views
北海茫月 2021-02-04 17:02

Is it possible to take a blocking function such as work and have it run concurrently in a ProcessPoolExecutor that has more than one worker?
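The question's original code is not shown, so here is a minimal sketch of the setup the answers below imply. The names (`work`, `producer`, `consumer`, `num_jobs`, `queue`, `executor`) come from the answers; the sleep duration and job count are assumptions.

```python
# Hypothetical reconstruction of the question's setup; details are assumed.
import asyncio
import time
from concurrent.futures import ProcessPoolExecutor

num_jobs = 4

def work():
    # Stand-in for a blocking (synchronous) function.
    time.sleep(1)
    return "result"

async def producer(loop, executor, queue):
    for _ in range(num_jobs):
        # Awaiting each job before submitting the next serializes them.
        result = await loop.run_in_executor(executor, work)
        await queue.put(result)

async def consumer(queue):
    return [await queue.get() for _ in range(num_jobs)]

async def main():
    loop = asyncio.get_running_loop()
    queue = asyncio.Queue()
    with ProcessPoolExecutor(max_workers=num_jobs) as executor:
        _, results = await asyncio.gather(
            producer(loop, executor, queue), consumer(queue)
        )
    return results

if __name__ == "__main__":
    start = time.monotonic()
    print(asyncio.run(main()))
    # Takes roughly num_jobs seconds, not ~1s, because jobs run one at a time.
    print(f"{time.monotonic() - start:.1f}s")
```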



        
2 Answers
  • 2021-02-04 17:30

    The problem is in the producer. Instead of allowing the jobs to run in the background, it waits for each job to finish, thus serializing them. If you rewrite producer to look like this (and leave consumer unchanged), you get the expected 1s duration:

    async def producer():
        for i in range(num_jobs):
            # Schedule the job without awaiting it; the done-callback
            # enqueues each result as soon as it is ready.
            fut = loop.run_in_executor(executor, work)
            fut.add_done_callback(lambda f: queue.put_nowait(f.result()))
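Since the question's surrounding code isn't shown, here is one self-contained way this producer might slot into a full program (the definitions of `work`, `consumer`, and the wiring in `main` are assumptions):

```python
import asyncio
import time
from concurrent.futures import ProcessPoolExecutor

num_jobs = 4

def work():
    time.sleep(1)  # stand-in for real blocking work
    return "result"

async def producer(loop, executor, queue):
    for _ in range(num_jobs):
        # Schedule without awaiting: all jobs enter the pool immediately.
        fut = loop.run_in_executor(executor, work)
        fut.add_done_callback(lambda f: queue.put_nowait(f.result()))

async def consumer(queue):
    return [await queue.get() for _ in range(num_jobs)]

async def main():
    loop = asyncio.get_running_loop()
    queue = asyncio.Queue()
    with ProcessPoolExecutor(max_workers=num_jobs) as executor:
        _, results = await asyncio.gather(
            producer(loop, executor, queue), consumer(queue)
        )
    return results

if __name__ == "__main__":
    start = time.monotonic()
    results = asyncio.run(main())
    # With num_jobs workers the jobs overlap, so this finishes in ~1s.
    print(len(results), f"{time.monotonic() - start:.1f}s")
```

One caveat of the callback style: if `work` raises, `f.result()` raises inside the callback (the exception is logged by the loop) and the consumer never receives that item.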
    
  • 2021-02-04 17:32

    Awaiting loop.run_in_executor(executor, work) suspends the producer until work completes, so only one job runs at a time even though the pool has multiple workers.

    To run jobs concurrently, you could use asyncio.as_completed:

    async def producer():
        # Submit every job first, then consume them as they finish.
        tasks = [loop.run_in_executor(executor, work) for _ in range(num_jobs)]
        # Note: as_completed's loop argument was deprecated in Python 3.8
        # and removed in 3.10; passing just the awaitables works everywhere.
        for f in asyncio.as_completed(tasks):
            result = await f
            await queue.put(result)
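asyncio.as_completed yields results in completion order, not submission order, so each result reaches the queue as soon as its worker finishes. A small illustration with plain coroutines (no process pool), using assumed delays:

```python
import asyncio

async def job(delay):
    await asyncio.sleep(delay)
    return delay

async def main():
    # Submit in the order 0.3, 0.1, 0.2 seconds.
    tasks = [asyncio.ensure_future(job(d)) for d in (0.3, 0.1, 0.2)]
    # as_completed yields them in the order they finish.
    return [await f for f in asyncio.as_completed(tasks)]

print(asyncio.run(main()))  # [0.1, 0.2, 0.3]
```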
    