How to pipe multiple readable streams, from multiple api requests, to a single writeable stream?

情歌与酒 2021-02-07 11:01

- Desired Behaviour
- Actual Behaviour
- What I've Tried
- Steps To Reproduce
- Research


4 Answers
  •  遥遥无期
    2021-02-07 11:35

    The core problem to solve here is asynchrony. You almost had it: the code you posted pipes all source streams into the target stream in parallel and unordered, so data chunks from the different audio streams interleave randomly. On top of that, the first source's end event races the others and, without { end: false }, closes the target stream too early, which would explain why the file grows after you re-open it.

    What you want is to pipe them sequentially - you even posted the solution when you quoted

    You want to add the second read into an event listener for the first read to finish...

    or as code:

    a.pipe(c, { end: false });
    a.on('end', function() {
      b.pipe(c);
    });
    

    This will pipe the source streams in sequential order into the target stream.
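    To make this concrete, here is a minimal self-contained sketch. It uses in-memory streams (Readable.from, available since Node 12) in place of the audio sources, and the file name combined.txt is just for illustration:

    ```javascript
    const { Readable } = require('stream');
    const fs = require('fs');

    // Two in-memory sources standing in for the audio streams.
    const a = Readable.from(['first ']);
    const b = Readable.from(['second']);
    const c = fs.createWriteStream('combined.txt');

    // Pipe the first source, but keep the target open when it ends.
    a.pipe(c, { end: false });

    // Start the second source only after the first has fully drained;
    // this time the target is allowed to close normally.
    a.on('end', () => b.pipe(c));
    ```

    Because b is only piped from inside a's end handler, the two chunks can never interleave.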

    Taking your code, this means replacing the audio_files.forEach loop with:

    await Bluebird.mapSeries(audio_files, async (audio, index) => {
        // Only the last source is allowed to close the write stream.
        const isLastIndex = index === audio_files_length - 1;
        audio.pipe(write_stream, { end: isLastIndex });
        // Resolve once this source has fully drained, so mapSeries
        // only then moves on to the next one.
        return new Promise(resolve => audio.on('end', resolve));
    });
    

    Note the use of Bluebird's mapSeries here: it walks the array strictly one item at a time, awaiting each returned promise before starting the next pipe.
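    If you would rather avoid the Bluebird dependency, a plain async/await loop achieves the same sequencing. A runnable sketch, again using in-memory streams in place of the decoded audio and a hypothetical merged.txt output:

    ```javascript
    const { Readable } = require('stream');
    const fs = require('fs');

    // Pipe each source into writeStream one at a time; only the
    // last source is allowed to close the target.
    async function concatStreams(sources, writeStream) {
      for (const [index, source] of sources.entries()) {
        const isLast = index === sources.length - 1;
        source.pipe(writeStream, { end: isLast });
        // Wait until this source has drained before starting the next.
        await new Promise((resolve, reject) => {
          source.on('end', resolve);
          source.on('error', reject);
        });
      }
    }

    // Demo with in-memory streams standing in for the audio files.
    const sources = [Readable.from(['one ']), Readable.from(['two ']), Readable.from(['three'])];
    const out = fs.createWriteStream('merged.txt');
    const finished = new Promise(resolve => out.on('finish', resolve));
    concatStreams(sources, out);
    ```

    The sources stay paused until they are piped, so nothing flows out of order even though they are all created up front.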

    Further advice regarding your code:

    • you should consider using lodash.js
    • you should use const & let instead of var and consider using camelCase
    • when you notice "it works with one event, but fails with multiple" always think: asynchronicity, permutations, race conditions.

    Further reading, limitations of combining native node streams: https://github.com/nodejs/node/issues/93
