Send simultaneous requests in Python (all at once)

Asked by 别跟我提以往 on 2021-01-04 12:34

I'm trying to create a script that sends over 1000 requests to one page at the same time, using the requests library with threading (1000 threads), but it only seems to be doing the first 5

3 Answers
  • 2021-01-04 13:01

    I have generally found that the best solution is to use an asynchronous library like Tornado (a minimal sketch of that approach follows the snippet below). The easiest solution that I found, however, is to use ThreadPoolExecutor.


    import requests
    from concurrent.futures import ThreadPoolExecutor

    # list_of_urls is assumed to be defined elsewhere, e.g.:
    # list_of_urls = ["http://test.net/"] * 1000

    def get_url(url):
        return requests.get(url)

    # a pool of 50 worker threads issues the requests concurrently
    with ThreadPoolExecutor(max_workers=50) as pool:
        print(list(pool.map(get_url, list_of_urls)))
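
    For completeness, here is a minimal sketch of the Tornado approach mentioned above, assuming the same list_of_urls; the fetch_all helper and the max_clients value are illustrative choices, not a fixed recipe:

    import tornado.gen
    import tornado.ioloop
    from tornado.httpclient import AsyncHTTPClient

    # raise the default cap of 10 concurrent requests per client
    AsyncHTTPClient.configure(None, max_clients=50)

    async def fetch_all(urls):
        client = AsyncHTTPClient()
        # start every fetch at once and wait for all responses
        return await tornado.gen.multi(
            [client.fetch(url, raise_error=False) for url in urls]
        )

    responses = tornado.ioloop.IOLoop.current().run_sync(lambda: fetch_all(list_of_urls))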
    
  • 2021-01-04 13:02

    Assuming that you know what you are doing, I first suggest you implement a backoff policy with jitter to prevent a "predictable thundering herd" against your server (a sketch of such a policy follows at the end of this answer). That said, you should consider doing some threading:

    import threading

    class FuncThread(threading.Thread):
        def __init__(self, target, *args):
            # initialize the base Thread first: Thread.__init__ sets its own
            # _target and _args attributes, so store ours under different
            # names afterwards to avoid clobbering
            threading.Thread.__init__(self)
            self._func = target
            self._func_args = args

        def run(self):
            self._func(*self._func_args)
    

    so that you would do something like

    t = FuncThread(doApiCall, url)
    t.start()
    

    where your function doApiCall is defined like this:

    def doApiCall(url):
        # make the actual request here, e.g. with the
        # backoff-and-jitter helper sketched below
        ...
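
    For reference, a minimal sketch of the backoff-with-jitter policy mentioned at the top of this answer; the get_with_backoff helper and its parameters are illustrative, not part of a standard API:

    import random
    import time

    import requests

    def get_with_backoff(url, max_retries=5, base=0.5, cap=30.0):
        # GET url, retrying failures with exponential backoff plus full jitter
        for attempt in range(max_retries):
            try:
                resp = requests.get(url, timeout=10)
                resp.raise_for_status()
                return resp
            except requests.RequestException:
                if attempt == max_retries - 1:
                    raise
                # sleep a random amount up to an exponentially growing cap,
                # so 1000 clients do not all retry at the same instant
                time.sleep(random.uniform(0, min(cap, base * 2 ** attempt)))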
    
  • 2021-01-04 13:24

    I know this is an old question, but you can now do this using asyncio and aiohttp.

    import asyncio
    from aiohttp import ClientSession

    async def fetch_html(url: str, session: ClientSession, **kwargs) -> str:
        resp = await session.request(method="GET", url=url, **kwargs)
        resp.raise_for_status()
        return await resp.text()

    async def make_requests(url: str, **kwargs) -> None:
        # a single ClientSession reuses connections across all requests
        async with ClientSession() as session:
            tasks = [
                fetch_html(url=url, session=session, **kwargs)
                for _ in range(1000)
            ]
            # schedule all 1000 coroutines concurrently and wait for them
            results = await asyncio.gather(*tasks)
            # do something with results

    if __name__ == "__main__":
        asyncio.run(make_requests(url='http://test.net/'))
    

    You can read more about it and see further examples in the aiohttp documentation.
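
    If the target server cannot handle 1000 connections at once, one common refinement (my own sketch, not part of the original answer; the limit of 100 is illustrative) is to bound concurrency with an asyncio.Semaphore:

    import asyncio
    from aiohttp import ClientSession

    async def fetch_limited(url: str, session: ClientSession, sem: asyncio.Semaphore) -> str:
        # the semaphore caps how many requests are in flight at once
        async with sem:
            async with session.get(url) as resp:
                resp.raise_for_status()
                return await resp.text()

    async def make_requests(url: str, limit: int = 100) -> None:
        sem = asyncio.Semaphore(limit)
        async with ClientSession() as session:
            tasks = [fetch_limited(url, session, sem) for _ in range(1000)]
            results = await asyncio.gather(*tasks)
            # do something with results

    if __name__ == "__main__":
        asyncio.run(make_requests('http://test.net/'))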
