I have a Node.js app that sends HTTP GET requests from various places in the code; some are even dependent (sending a request, waiting for the reply, processing it, and based on re…
The Async module has a number of control-flow helpers that could help you. queue
sounds like a good fit, since it lets you limit concurrency.
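To make the idea concrete without pulling in the library, here is a minimal, dependency-free sketch of a concurrency-limited queue in the spirit of `async.queue(worker, concurrency)`. The worker function, the task shape, and the fake delayed "fetch" are all illustrative assumptions, not the Async module's actual internals:

```javascript
// Minimal sketch of a concurrency-limited task queue, in the spirit of
// async.queue(worker, concurrency). Plain Node, no dependencies.
function makeQueue(worker, concurrency) {
  const tasks = [];
  let running = 0;

  function next() {
    // Start tasks until the concurrency limit is reached.
    while (running < concurrency && tasks.length > 0) {
      const { task, done } = tasks.shift();
      running += 1;
      worker(task).then(
        (result) => { running -= 1; done(null, result); next(); },
        (err)    => { running -= 1; done(err); next(); }
      );
    }
  }

  return {
    push(task, done) {
      tasks.push({ task, done });
      next();
    },
  };
}

// Example: a fake "HTTP" worker that resolves after a short delay,
// standing in for a real GET request.
const queue = makeQueue(
  (url) => new Promise((resolve) =>
    setTimeout(() => resolve('fetched ' + url), 10)),
  2 // at most two requests in flight at once
);

const results = [];
for (const url of ['a', 'b', 'c', 'd']) {
  queue.push(url, (err, result) => { if (!err) results.push(result); });
}
```

With the real library you would write `async.queue(worker, 2)` and `q.push(task, callback)`; the shape above mirrors that API so the switch is mechanical.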
I would use Deferreds and return one for every queued request. You can then add success/failure callbacks to the deferred promise after it has been queued.
var deferred = queue.add('http://example.com/something');
deferred.fail(function(error) { /* handle failure */ });
deferred.done(function(response) { /* handle response */ });
You can hold [ url, deferred ]
pairs in your queue, and each time you dequeue a URL you'll also have the Deferred that goes with it, which you can resolve or reject after you process the request.
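The same pattern can be sketched with native Promises instead of jQuery-style Deferreds: `add()` returns a promise whose `resolve`/`reject` handles are stored alongside the URL, so whoever dequeues the entry can settle it later. The `processOne` method and its `fetcher` argument are assumptions standing in for whatever actually performs the HTTP request:

```javascript
// Sketch of the [url, deferred] idea using native Promises: the queue keeps
// the resolve/reject handles next to the URL so it can settle the promise
// after processing the request.
function makeRequestQueue() {
  const pending = []; // entries of { url, resolve, reject }

  return {
    add(url) {
      return new Promise((resolve, reject) => {
        pending.push({ url, resolve, reject });
      });
    },
    // Dequeue one entry and settle its promise with the fetcher's outcome.
    // fetcher(url) is a stand-in for a real HTTP call; it must return a promise.
    processOne(fetcher) {
      const entry = pending.shift();
      if (!entry) return;
      fetcher(entry.url).then(entry.resolve, entry.reject);
    },
  };
}

// Usage mirrors the answer: attach handlers after queueing.
const queue = makeRequestQueue();
const promise = queue.add('http://example.com/something');
promise.then(
  (response) => { /* handle response */ },
  (error) => { /* handle failure */ }
);

// Later, the queue dequeues the URL and resolves the matching promise:
queue.processOne((url) => Promise.resolve('body of ' + url));
```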
I think you have answered your question already. A central queue that can throttle your requests is the way to go. The only problem is that the queue has to carry the full information for the request plus the callback(s) that should be invoked. I would abstract this into a QueueableRequest
object that could look something like this:
var QueueableRequest = function(url, params, httpMethod, success, failure){
    this.url = url;
    this.params = params;
    this.httpMethod = httpMethod;
    this.success = success;
    this.failure = failure;
}
//Then you can queue your request with
queue.add(new QueueableRequest(
    "api.test.com",
    {"test": 1},
    "GET",
    function(data){ console.log('success'); },
    function(err){ console.log('error'); }
));
Of course this is just sample code that could be much prettier, but I hope you get the picture.
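To round the picture out, here is a hedged sketch of the throttling queue itself, consuming QueueableRequest objects as above. The one-request-per-interval policy, the `makeThrottledQueue` name, and the `fakeHttp` stand-in for a real HTTP client are all illustrative assumptions:

```javascript
// A QueueableRequest bundles the request details with its callbacks.
function QueueableRequest(url, params, httpMethod, success, failure) {
  this.url = url;
  this.params = params;
  this.httpMethod = httpMethod;
  this.success = success;
  this.failure = failure;
}

// Sketch of a throttling queue: drains at most one request per intervalMs,
// dispatching each result to the request's own success/failure callback.
function makeThrottledQueue(sendFn, intervalMs) {
  const requests = [];
  let timer = null;

  function drain() {
    const req = requests.shift();
    if (!req) { clearInterval(timer); timer = null; return; }
    sendFn(req).then(req.success, req.failure);
  }

  return {
    add(request) {
      requests.push(request);
      if (!timer) timer = setInterval(drain, intervalMs);
    },
  };
}

// fakeHttp stands in for a real HTTP client (request, fetch, etc.).
const fakeHttp = (req) =>
  req.url.startsWith('bad')
    ? Promise.reject(new Error('unreachable'))
    : Promise.resolve({ url: req.url, status: 200 });

const log = [];
const queue = makeThrottledQueue(fakeHttp, 5); // one request every 5 ms
queue.add(new QueueableRequest('api.test.com', { test: 1 }, 'GET',
  (data) => log.push('success:' + data.url),
  (err) => log.push('error:' + err.message)));
queue.add(new QueueableRequest('bad.host', {}, 'GET',
  (data) => log.push('success:' + data.url),
  (err) => log.push('error:' + err.message)));
```

Because the request object carries its own callbacks, the queue needs no knowledge of who enqueued what; it just rations the send rate.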