- Desired Behaviour
- Actual Behaviour
- What I've Tried
- Steps To Reproduce
- Research
The core problem to solve here is asynchronicity. You almost had it: the problem with the code you posted is that you pipe all source streams in parallel and unordered into the target stream. This means data chunks will flow randomly from the different audio streams, and even your `end` event will outrace the still-running pipes, closing the target stream too early - which might explain why it increases after you re-open it.
What you want is to pipe them sequentially - you even posted the solution yourself when you quoted:

> You want to add the second read into an eventlistener for the first read to finish...
or as code:
a.pipe(c, { end: false });
a.on('end', function() {
  b.pipe(c);
});
This will pipe the source streams in sequential order into the target stream.
Applied to your code, this means replacing the `audio_files.forEach` loop with:
await Bluebird.mapSeries(audio_files, async (audio, index) => {
  // Only the last source stream may close the target stream.
  const isLastIndex = index === audio_files_length - 1;
  audio.pipe(write_stream, { end: isLastIndex });
  // Resolve once this source has drained, so the next pipe starts in order.
  return new Promise(resolve => audio.on('end', resolve));
});
Note the usage of bluebird.js mapSeries here.
Further advice regarding your code:
- Use `const` & `let` instead of `var`
- Consider using camelCase for variable names
Further reading on the limitations of combining native Node streams: https://github.com/nodejs/node/issues/93