node.js how to handle fast producer and slow consumer with backpressure


Question


I'm very new to node.js and don't understand the documentation about streams. Hoping to get some tips.

I'm reading a very large file line by line, and then for each line I'm calling an async network API.

Obviously the local file is read much faster than the async calls are completed:

var lineReader = require('readline').createInterface({
  input: require('fs').createReadStream(program.input)
});

lineReader.on('line', function (line) {
    client.execute(query, [line], function(err, result) {
        // need to apply back pressure on the line reader here
        var myJSON = JSON.stringify(result);
        console.log("line=%s json=%s", line, myJSON);
    });
});

What is the way to add back pressure in the "execute" method?


Answer 1:


The solution is to wrap the async call in a writable stream and throttle the reader from inside the writer, by only calling next() once you are ready to accept more data:

var stream = require('stream');

var concurrent = 10;   // maximum number of in-flight async calls
var count = 0;

var writable = new stream.Writable({
    write: function (line, encoding, next) {
        count++;

        // Accept the next line immediately while under the limit;
        // otherwise hold next() until a call completes (back pressure).
        var released = count < concurrent;
        if (released) {
            next();
        }

        // asyncFunctionToCall stands for your async API call, e.g. client.execute
        asyncFunctionToCall(line, function (err, result) {
            // completion callback: reduce the count and release back pressure
            count--;
            if (!released) {
                next();
            }
        });
    }
});

var fs = require('fs');
var byline = require('byline');

var input = fs.createReadStream(program.input, {encoding: 'utf8'});
byline.createStream(input).pipe(writable);
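
As an aside (not part of the original answer): on recent Node.js versions you can get the same back pressure without a custom Writable, because readline exposes an async iterator and the underlying read stream is not consumed while the loop body is awaiting. This is a minimal sketch under that assumption; executeAsync is a hypothetical promisified wrapper around client.execute:

const fs = require('fs');
const readline = require('readline');

async function processFile(path) {
    const rl = readline.createInterface({
        input: fs.createReadStream(path),
        crlfDelay: Infinity
    });

    for await (const line of rl) {
        // The next line is not pulled until this call resolves,
        // so the slow consumer naturally throttles the fast producer.
        const result = await executeAsync(query, [line]);   // hypothetical promisified client.execute
        console.log("line=%s json=%s", line, JSON.stringify(result));
    }
}

This processes one line at a time; if you want a bounded level of concurrency instead, the Writable approach above (or a small concurrency-limiting helper) is still the way to go.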


Source: https://stackoverflow.com/questions/50114753/node-js-how-to-handle-fast-producer-and-slow-consumer-with-backpressure
