How Can I Effectively 'Max Out' Concurrent HTTP Requests?

时光取名叫无心 · 2020-12-13 22:29

I'm currently trying a bit of an experiment with Go. Here's what I'm attempting to do:

I've got a REST API service running, and I'd like to query a specific URL

2 Answers
  •  醉梦人生
    2020-12-13 23:12

    You're almost certainly running into a file descriptor limit. The default limit is 2560 (the old limit was 256, but it looks like it was raised tenfold at some point). I'm fairly certain the highest you can set it is 10,000.
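    To see where you stand, you can read (and try to raise) the per-process descriptor limit from Go itself. Below is a minimal Unix-only sketch using the syscall package; the printed numbers depend on your system, and the soft limit can only be raised as far as the hard limit allows.

    ```go
    package main

    import (
        "fmt"
        "log"
        "syscall"
    )

    func main() {
        var rl syscall.Rlimit
        // Read the current soft/hard limits on open file descriptors.
        if err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rl); err != nil {
            log.Fatal(err)
        }
        fmt.Printf("soft=%d hard=%d\n", rl.Cur, rl.Max)

        // Try to raise the soft limit to the hard limit; anything beyond that
        // requires privileges or a change to system-wide settings.
        rl.Cur = rl.Max
        if err := syscall.Setrlimit(syscall.RLIMIT_NOFILE, &rl); err != nil {
            log.Fatal(err)
        }
    }
    ```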

    I don't know if you'll ever be able to get a million simultaneous connections out of one machine this way. You may want to try a hybrid of processes and goroutines: 10k processes at 1000 goroutines per process, but I would not be surprised if you run into the systemwide limits anyway.
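    If you do go the multi-process route, the driver can be as simple as spawning worker processes with os/exec. This is only a rough sketch: the ./loadworker binary and its -goroutines flag are hypothetical, and realistic process counts are far smaller than 10k.

    ```go
    package main

    import (
        "log"
        "os/exec"
        "strconv"
        "sync"
    )

    func main() {
        const (
            workers    = 8    // number of worker processes; illustrative only
            goroutines = 1000 // goroutines per worker process
        )
        var wg sync.WaitGroup
        for i := 0; i < workers; i++ {
            wg.Add(1)
            go func(id int) {
                defer wg.Done()
                // ./loadworker is a hypothetical binary that opens its own connections.
                cmd := exec.Command("./loadworker", "-goroutines", strconv.Itoa(goroutines))
                if err := cmd.Run(); err != nil {
                    log.Printf("worker %d exited with error: %v", id, err)
                }
            }(i)
        }
        wg.Wait()
    }
    ```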

    If the goal is simply to hit the API as hard as you can from one host (and one network card), I believe you're going to need to rate limit, with a buffered channel acting as a semaphore, so that you're not opening more than a few thousand connections at the same time.
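    Here is a minimal sketch of that buffered-channel semaphore pattern; the URL and the concurrency/request counts are placeholders you would tune against your own descriptor limit and API.

    ```go
    package main

    import (
        "fmt"
        "io"
        "net/http"
        "sync"
    )

    func main() {
        const (
            totalRequests = 100000
            maxInFlight   = 1000 // keep this well below the file descriptor limit
        )
        sem := make(chan struct{}, maxInFlight) // buffered channel used as a semaphore
        var wg sync.WaitGroup

        for i := 0; i < totalRequests; i++ {
            wg.Add(1)
            sem <- struct{}{} // blocks once maxInFlight requests are outstanding
            go func() {
                defer wg.Done()
                defer func() { <-sem }() // release the slot when this request finishes

                resp, err := http.Get("http://localhost:8080/ping") // placeholder URL
                if err != nil {
                    return
                }
                io.Copy(io.Discard, resp.Body) // drain the body so the connection can be reused
                resp.Body.Close()
            }()
        }
        wg.Wait()
        fmt.Println("done")
    }
    ```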
