Yielding asyncio generator data back from event loop possible?

Front-end · unresolved · 1 answer · 1815 views
無奈伤痛 asked 2021-01-20 23:35

I would like to read from multiple simultaneous HTTP streaming requests inside coroutines using httpx, and yield the data back to my non-async function running the event loop.

1 Answer
  • 2021-01-21 00:23

    Normally you should just make collect_data async, and use async code throughout - that's how asyncio was designed to be used. But if that's for some reason not feasible, you can iterate an async iterator manually by applying some glue code:

    def iter_over_async(ait, loop):
        """Iterate an async iterator from synchronous code, driving *loop*."""
        ait = ait.__aiter__()
        async def get_next():
            # Return a (done, value) pair instead of letting
            # StopAsyncIteration escape through run_until_complete().
            try:
                obj = await ait.__anext__()
                return False, obj
            except StopAsyncIteration:
                return True, None
        while True:
            done, obj = loop.run_until_complete(get_next())
            if done:
                break
            yield obj
    

    The above works by providing an async closure that keeps retrieving values from the async iterator using the __anext__ magic method and returning the objects as they arrive. This closure is invoked with run_until_complete() in a loop inside an ordinary sync generator. (The closure returns a (done, object) pair rather than the object alone in order to avoid propagating StopAsyncIteration through run_until_complete(), which might be unsupported.)

    With this in place, you can make your execute_tasks an async generator (async def with yield) and iterate over it using:

    for chunk in iter_over_async(execute_tasks(urls), loop):
        ...
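
    For a self-contained illustration, here is the whole pattern end to end. The async generator fake_stream and its produced values are made-up stand-ins for the httpx-based execute_tasks, just to make the example runnable:

```python
import asyncio

def iter_over_async(ait, loop):
    # Drive an async iterator from sync code, one item per
    # run_until_complete() call.
    ait = ait.__aiter__()
    async def get_next():
        try:
            return False, await ait.__anext__()
        except StopAsyncIteration:
            return True, None
    while True:
        done, obj = loop.run_until_complete(get_next())
        if done:
            break
        yield obj

async def fake_stream():
    # Stand-in for an httpx-based async generator like execute_tasks().
    for i in range(3):
        await asyncio.sleep(0)  # simulate waiting on the network
        yield f"chunk-{i}"

loop = asyncio.new_event_loop()
chunks = list(iter_over_async(fake_stream(), loop))
loop.close()
print(chunks)  # ['chunk-0', 'chunk-1', 'chunk-2']
```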
    

    Just note that this approach is incompatible with asyncio.run(), because each call to asyncio.run() creates a fresh event loop and closes it on return, so state tied to the loop cannot survive between calls; this might cause problems later down the line.
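
    A minimal sketch of what that means in practice: instead of asyncio.run(), create one long-lived loop yourself and close it only when the iteration is finished (the coroutine used here is just a placeholder):

```python
import asyncio

# asyncio.run() spins up a new loop and closes it when the coroutine
# returns, so a half-consumed async iterator bound to that loop cannot
# be resumed by a later call.  Keep one loop alive for the whole
# iteration instead:
loop = asyncio.new_event_loop()
try:
    result = loop.run_until_complete(asyncio.sleep(0, result="done"))
finally:
    loop.close()
print(result)  # done
```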
