Question
I tried to combine blocking tasks and non-blocking (I/O-bound) tasks using ProcessPoolExecutor
and found its behavior pretty unexpected.
import asyncio
from concurrent.futures import ProcessPoolExecutor


class BlockingQueueListener(BaseBlockingListener):
    def run(self):
        # Continuously listen on a queue
        blocking_listen()


class NonBlockingListener(BaseNonBlocking):
    async def non_blocking_listen(self):
        while True:
            await self.get_message()


def run(blocking):
    blocking.run()


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    executor = ProcessPoolExecutor()
    blocking = BlockingQueueListener()
    non_blocking = NonBlockingListener()
    future = loop.run_in_executor(executor, run(blocking))
    loop.run_until_complete(
        asyncio.gather(
            non_blocking.main(),
            future
        )
    )
I was expecting that both tasks would run concurrently, but the blocking task started in the ProcessPoolExecutor
blocks and never returns control. How can that happen? What is the proper way to combine normal coroutines with futures started in a multiprocessing executor?
Answer 1:
This line:
future = loop.run_in_executor(executor, run(blocking))
actually calls run(blocking) immediately in the main process, so the blocking function starts running (and blocking) right there, before anything is ever submitted to the executor; only its return value would be handed to run_in_executor as the callable.
According to the documentation, you need to pass the callable itself, followed by its arguments:
future = loop.run_in_executor(executor, run, blocking)
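For completeness, here is a minimal, self-contained sketch of the corrected pattern. The functions blocking_listen and non_blocking_listen below are simple sleep-based stand-ins (not the asker's real listener classes), and it uses asyncio.run, which requires Python 3.7+. The key point is that the callable and its argument are passed to run_in_executor separately, so the blocking work runs in the process pool while the coroutine keeps the event loop free:

import asyncio
import time
from concurrent.futures import ProcessPoolExecutor


def blocking_listen(name):
    # Stand-in for a blocking queue listener; must be a module-level
    # function so it can be pickled and sent to the worker process.
    for _ in range(3):
        time.sleep(1)
        print(f"{name}: got a blocking message")
    return "blocking done"


async def non_blocking_listen():
    # Stand-in for the awaiting listener; yields to the event loop
    # between messages instead of blocking it.
    for _ in range(3):
        await asyncio.sleep(1)
        print("non-blocking: got a message")
    return "non-blocking done"


async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as executor:
        # The callable and its arguments are separate parameters;
        # run_in_executor schedules the call in the pool and returns
        # an awaitable future.
        blocking_future = loop.run_in_executor(executor, blocking_listen, "worker")
        results = await asyncio.gather(non_blocking_listen(), blocking_future)
    print(results)


if __name__ == "__main__":
    asyncio.run(main())

With this arrangement both listeners make progress at the same time: the non-blocking coroutine runs on the event loop, while the blocking one runs in a separate process and is awaited through the future returned by run_in_executor.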
Source: https://stackoverflow.com/questions/49978320/asyncio-run-in-executor-using-processpoolexecutor