node and socket.io multiple API calls hang after a while

生来不讨喜 2021-01-28 12:19

I have a node express/socket application where my express server makes several API calls with node-rest-client, looping through the elements in var jobs, and when each finishes, it s

2 Answers
  • 2021-01-28 13:20

    It turned out to be the client.get() request causing the error. Here is my code to fix this. It still errors, but at least the error is handled and won't cause the node server to crash. If there is a more elegant way of handling this, please let me know!

    setInterval(function(){
      var jobs = ['J1', 'J2', 'J3', 'J4'];
      var full_data = {};
      for(var i = 0; i < jobs.length; i++){
        // attach an error handler so a failed request no longer crashes the server
        client.get("MY URL", function (data, response) {
            io.sockets.emit('progressbar', data);
          }).on('error', function (err) {
            console.log('something went wrong on the request', err.request.options);
        });
      }
      // note: full_data is never populated above, so this always logs an empty object
      console.log(full_data);
    
    }, 5000)
    
  • 2021-01-28 13:22

    If your jobs array gets larger, then you may just have too many requests in flight at the same time. It could be:

    1. You run out of socket resources locally.
    2. You overload the host you're making requests to, and its way of dealing with too many requests from one source is to just hang up (this is the most likely explanation for the error you get).
    3. It takes longer than 5 seconds to finish all the requests, so you're starting the next iteration before the previous one is done (see the sketch after this list).
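
    To make point 3 concrete, here is a minimal sketch of an in-flight guard (runAllRequests() is a hypothetical helper that returns a promise for one full batch, not part of the code above) showing how overlapping setInterval() iterations can be skipped; the solution below avoids the problem differently, with a recurring setTimeout():

    let busy = false;
    setInterval(function () {
        if (busy) return;                        // previous batch still running, skip this tick
        busy = true;
        runAllRequests()                         // hypothetical: resolves when all requests finish
            .catch(function (err) { console.log('batch failed', err); })
            .then(function () { busy = false; });
    }, 5000);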

    I'd suggest the following solution to handle all those issues:

    const Promise = require('bluebird');
    const util = require('util');
    client.getAsync = util.promisify(client.get);
    
    function runJobs() {
        var jobs = ['J1', 'J2', 'J3', 'J4'];
        var full_data = {};
        Promise.map(jobs, function(job) {
            return client.getAsync("MY URL").then(data => {
                io.emit('progressbar', data);
            }).catch(err => {
                console.log('something went wrong on the request', err.request.options);
                // eat the error on purpose to keep going
            });
        }, {concurrency: 5}).then(() => {
            // All done, process all final data here            
            // Then, schedule the next iteration
            setTimeout(runJobs, 5000);
        });
    }
    
    runJobs();
    

    This runs a max of 5 requests at a time (you can adjust that number), which addresses items 1 and 2 above. And, instead of setInterval(), it uses a recurring setTimeout() so that the next iteration is never scheduled until the prior one has finished (even if the target server gets really slow), which addresses item 3.
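
    One caveat: util.promisify() assumes an error-first (err, result) callback, while node-rest-client calls its callback as (data, response) and reports failures via the 'error' event on the returned request object. If the promisified getAsync doesn't behave as expected for that reason, a hand-rolled wrapper along these lines (a sketch, not the library's own API) can be dropped in instead and used the same way with Promise.map():

    // wrap node-rest-client's (data, response) callback style in a promise
    function getAsync(url) {
        return new Promise(function (resolve, reject) {
            client.get(url, function (data, response) {
                resolve(data);                   // resolve with the parsed response body
            }).on('error', reject);              // reject on request-level errors
        });
    }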
