Is it possible to limit the number of coroutines running concurrently in asyncio?

臣服心动 2021-02-12 15:15

I already wrote my script using asyncio, but found that the number of coroutines running simultaneously is too large and the script often ends up hanging.

So I would like to limit the number of coroutines running concurrently at any given time. How can I do that?

3 Answers
  • 2021-02-12 15:17

    You can wrap your gather call and enforce the limit with a Semaphore:

    import asyncio
    
    async def semaphore_gather(num, coros, return_exceptions=False):
        # Allow at most `num` of the coroutines to run at any one time.
        semaphore = asyncio.Semaphore(num)
    
        async def _wrap_coro(coro):
            async with semaphore:
                return await coro
    
        # gather() starts every wrapper immediately, but each wrapper
        # blocks on the semaphore before awaiting its real coroutine.
        return await asyncio.gather(
            *(_wrap_coro(coro) for coro in coros), return_exceptions=return_exceptions
        )
    
    # async def a():
    #     return 1
    
    # print(asyncio.run(semaphore_gather(10, [a() for _ in range(100)])))
    # [1, 1, 1, ..., 1]  (a list of 100 ones)
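
    For a quick sanity check of the limiting behavior, here is a minimal sketch (the `worker` coroutine and its timings are invented for illustration): ten one-second coroutines pushed through the wrapper three at a time should take about four seconds rather than one.

    import asyncio
    import time

    async def worker(i):
        await asyncio.sleep(1)  # stand-in for one second of real I/O
        return i

    async def demo():
        start = time.monotonic()
        results = await semaphore_gather(3, [worker(i) for i in range(10)])
        # ceil(10 / 3) waves of 1 second each -> roughly 4 seconds elapsed
        print(results, round(time.monotonic() - start))

    # asyncio.run(demo())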
    
  • 2021-02-12 15:28

    I can suggest using asyncio.BoundedSemaphore:

    import asyncio
    
    async def my_func(player, asyncio_semaphore):
        async with asyncio_semaphore:
            ...  # do stuff with player
    
    async def main():
        # At most 200 coroutines can hold the semaphore at any one time.
        asyncio_semaphore = asyncio.BoundedSemaphore(200)
        jobs = []
        # `players` is assumed to be a pre-built sequence of 12000 items.
        for i in range(12000):
            jobs.append(asyncio.ensure_future(my_func(players[i], asyncio_semaphore)))
        await asyncio.gather(*jobs)
    
    if __name__ == '__main__':
        # Equivalent to the get_event_loop()/set_debug(True)/run_until_complete()
        # pattern, using the modern entry point.
        asyncio.run(main(), debug=True)
    

    This way, at most 200 concurrent tasks can hold the semaphore and use system resources, even while all 12000 tasks are pending.
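
    Note that asyncio.BoundedSemaphore differs from a plain asyncio.Semaphore only in that calling release() more times than acquire() raises a ValueError instead of silently raising the limit, which helps catch accounting bugs. A minimal sketch of that behavior:

    import asyncio

    async def demo():
        sem = asyncio.BoundedSemaphore(1)
        async with sem:
            pass       # acquire and release in matched pairs: fine
        sem.release()  # one release too many: raises ValueError

    asyncio.run(demo())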

  • 2021-02-12 15:31

    You might want to consider using aiostream.stream.map with the task_limit argument:

    from aiostream import stream, pipe
    
    async def main():
        # `players` and `my_func` are assumed to be defined elsewhere.
        xs = stream.iterate(players)                  # source stream
        ys = stream.map(xs, my_func, task_limit=100)  # at most 100 concurrent calls
        zs = stream.list(ys)                          # aggregate results into a list
        results = await zs
    

    Same approach using pipes:

    async def main():
        results = await (
            stream.iterate(players)
            | pipe.map(my_func, task_limit=100)
            | pipe.list()
        )
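
    For reference, here is a self-contained, runnable version of the pipe variant; the `players` list and the `my_func` body below are placeholders invented for illustration:

    import asyncio
    from aiostream import stream, pipe

    players = list(range(1000))  # placeholder input

    async def my_func(player):
        await asyncio.sleep(0.01)  # stand-in for real per-player work
        return player * 2

    async def main():
        # At most 100 my_func coroutines are running at any one time.
        return await (
            stream.iterate(players)
            | pipe.map(my_func, task_limit=100)
            | pipe.list()
        )

    # print(asyncio.run(main()))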
    

    See the aiostream project page and the documentation for further information.

    Disclaimer: I am the project maintainer.
