Is it possible to take a blocking function such as work
and have it run concurrently in a ProcessPoolExecutor
that has more than one worker?
await loop.run_in_executor(executor, work)
suspends the calling coroutine until work completes (it does not block the
event loop itself), so if you await each job before submitting the next,
only one function runs at a time.
To run jobs concurrently, submit all the futures first, then consume them
with asyncio.as_completed:
async def producer():
    # Submitting every job up front lets the pool run them in parallel.
    tasks = [loop.run_in_executor(executor, work) for _ in range(num_jobs)]
    # as_completed yields futures as they finish (note: its loop=
    # argument was removed in Python 3.10).
    for future in asyncio.as_completed(tasks):
        result = await future
        await queue.put(result)
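Pulling the pieces together, here is a minimal runnable sketch of the same pattern. The names work, producer, main, and num_jobs are stand-ins chosen for this example; the snippet above assumes loop, executor, work, num_jobs, and queue already exist in the surrounding scope.

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # Stand-in for a blocking, CPU-bound function; it must be defined at
    # module top level so worker processes can import (unpickle) it.
    return n * n

async def producer(executor, queue, num_jobs):
    loop = asyncio.get_running_loop()
    # Submit every job before awaiting any, so the pool runs them in parallel.
    tasks = [loop.run_in_executor(executor, work, n) for n in range(num_jobs)]
    # Consume results in completion order, not submission order.
    for future in asyncio.as_completed(tasks):
        await queue.put(await future)

async def main(executor, num_jobs=4):
    queue = asyncio.Queue()
    await producer(executor, queue, num_jobs)
    return [queue.get_nowait() for _ in range(num_jobs)]
```

Launch it with something like asyncio.run(main(ProcessPoolExecutor(max_workers=2))) inside an if __name__ == "__main__": guard, which multiprocessing needs on platforms that spawn workers; any concurrent.futures executor works, so a ThreadPoolExecutor can be swapped in for I/O-bound jobs.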