Making async for loops in Python

耶瑟儿~ 2021-01-02 05:33

The following code produces this output:

1 sec delay, print "1",
1 sec delay, print "2",
1 sec delay, print "1",
1 sec delay, print "2"
3 Answers
  • 2021-01-02 06:03

    Looking at the desired output, it seems that the goal is to leave the individual iteration as it is - i.e. run first and second sequentially - but execute both loop iterations in parallel.

    Assuming you only want to modify main(), it could be achieved like this:

    async def main():
        async def one_iteration():
            result = await first()
            print(result)
            result2 = await second()
            print(result2)
        coros = [one_iteration() for _ in range(2)]
        await asyncio.gather(*coros)
    

    Instead of iterating in sequence, the above creates a coroutine for each iteration task, and uses asyncio.gather to execute all the iterations in parallel.

    Note that creating a coroutine object doesn't start executing it; nothing runs until the coroutines are awaited (here, via asyncio.gather), so building the list of coros up front is cheap.
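    The question doesn't show its original code, but a self-contained version of this approach, with hypothetical first() and second() coroutines that stand in for the question's one-second delays, might look like:

```python
import asyncio

# Hypothetical stand-ins for the question's coroutines:
# each waits one second, then returns its label.
async def first():
    await asyncio.sleep(1)
    return "1"

async def second():
    await asyncio.sleep(1)
    return "2"

async def main():
    async def one_iteration():
        # Within a single iteration, first and second still run sequentially.
        print(await first())
        print(await second())

    # Both iterations run concurrently, so the whole run takes
    # about 2 seconds instead of about 4.
    await asyncio.gather(*(one_iteration() for _ in range(2)))

asyncio.run(main())
# both "1"s print after ~1 s, both "2"s after ~2 s
```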

  • 2021-01-02 06:10

    To run the two functions concurrently you can use asyncio.gather. The results are returned in the order you pass the awaitables, not the order in which they complete. So for example if you do

    results = await asyncio.gather(first(), second())
    

    then you will get [the result of first(), the result of second()] back. If you want to react as soon as each one completes, create Tasks explicitly and attach done callbacks (Task.add_done_callback).
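    A small sketch of both points, using hypothetical first() and second() coroutines in which second() deliberately finishes sooner, to show that gather still returns results in argument order:

```python
import asyncio

# Hypothetical coroutines: second() finishes before first().
async def first():
    await asyncio.sleep(0.2)
    return "1"

async def second():
    await asyncio.sleep(0.1)
    return "2"

async def main():
    # Results come back in the order the awaitables were passed,
    # regardless of which one finished first.
    results = await asyncio.gather(first(), second())
    print(results)  # ['1', '2']

    # To react as soon as a coroutine completes, wrap it in a Task
    # and attach a done callback.
    task = asyncio.create_task(second())
    task.add_done_callback(lambda t: print("done:", t.result()))
    await task

asyncio.run(main())
```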

  • 2021-01-02 06:20

    With the asyncio library you can use asyncio.gather():

    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(
      first(),
      second()
    ))
    

    This can come in handy if you are also sending HTTP requests in parallel:

    loop.run_until_complete(asyncio.gather(
      request1(),
      request2()
    ))
    
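    Note that the snippets above assume a loop object from asyncio.get_event_loop(). On Python 3.7+, asyncio.run is the usual entry point and manages the loop for you. A sketch with hypothetical request coroutines (simulated here with asyncio.sleep rather than real HTTP calls):

```python
import asyncio

# Hypothetical request coroutines simulating slow I/O with sleep.
async def request1():
    await asyncio.sleep(0.1)
    return "response 1"

async def request2():
    await asyncio.sleep(0.1)
    return "response 2"

async def main():
    # Both "requests" run concurrently; results keep argument order.
    return await asyncio.gather(request1(), request2())

# asyncio.run creates and closes the event loop itself, replacing
# the explicit loop.run_until_complete(...) pattern.
results = asyncio.run(main())
print(results)  # ['response 1', 'response 2']
```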