Concatenate two (or n) streams

Submitted by 时光毁灭记忆、已成空白 on 2019-12-03 05:04:16

Question


  • 2 streams:

    Given readable streams stream1 and stream2, what's an idiomatic (concise) way to get a stream containing stream1 and stream2 concatenated?

    I cannot do stream1.pipe(outStream); stream2.pipe(outStream), because then the stream contents are jumbled together.

  • n streams:

    Given an EventEmitter that emits an indeterminate number of streams, e.g.

    eventEmitter.emit('stream', stream1)
    eventEmitter.emit('stream', stream2)
    eventEmitter.emit('stream', stream3)
    ...
    eventEmitter.emit('end')
    

    what's an idiomatic (concise) way to get a stream with all streams concatenated together?


Answer 1:


The combined-stream package concatenates streams. Example from the README:

var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));

I believe you have to append all streams at once. If the queue runs empty, the combinedStream automatically ends. See issue #5.
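
As far as I recall from the README (treat the exact callback signature as something to verify, not gospel), append can also take a function, so a stream only needs to be created once the combined stream actually reaches it:

var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();

// lazy form: the readable stream is only created when combined-stream asks for it
combinedStream.append(function(next) {
  next(fs.createReadStream('file1.txt'));
});
combinedStream.append(function(next) {
  next(fs.createReadStream('file2.txt'));
});

combinedStream.pipe(fs.createWriteStream('combined.txt'));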

The stream-stream library is an alternative that has an explicit .end, but it's much less popular and presumably not as well-tested. It uses the streams2 API of Node 0.10 (see this discussion).
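
For reference, a rough sketch of what stream-stream usage might look like; the write/end calls below are an assumption based on the answer's description of an explicit .end, so check the package README before relying on the exact method names:

var concat = require('stream-stream');
var fs = require('fs');

var stream = concat();
// assumption: whole readable streams are written into the concatenator...
stream.write(fs.createReadStream('file1.txt'));
stream.write(fs.createReadStream('file2.txt'));
// ...and it is ended explicitly once no more streams are coming
stream.end();

stream.pipe(fs.createWriteStream('combined.txt'));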




Answer 2:


A simple reduce operation should be fine in Node.js!

const { PassThrough } = require('stream')

let joined = [s0, s1, s2, ...sN].reduce((pt, s, i, a) => {
  // keep the shared PassThrough open while each source is piped into it
  s.pipe(pt, { end: false })
  // once every source has ended, end the combined stream
  s.once('end', () => a.every(s => s.readableEnded) && pt.emit('end'))
  return pt
}, new PassThrough())

Cheers ;)




Answer 3:


This can be done with vanilla Node.js:

import { PassThrough } from 'stream'

const merge = (...streams) => {
    let pass = new PassThrough()
    let waiting = streams.length
    for (let stream of streams) {
        // pipe every source into the shared PassThrough without ending it
        pass = stream.pipe(pass, {end: false})
        // emit 'end' only once the last source has finished
        stream.once('end', () => --waiting === 0 && pass.emit('end'))
    }
    return pass
}



Answer 4:


You might be able to make it more concise, but here's one that works:

var util = require('util');
var EventEmitter = require('events').EventEmitter;

function ConcatStream(streamStream) {
  EventEmitter.call(this);
  var isStreaming = false,
    streamsEnded = false,
    that = this;

  var streams = [];
  streamStream.on('stream', function(stream){
    stream.pause();
    streams.push(stream);
    ensureState();
  });

  streamStream.on('end', function() {
    streamsEnded = true;
    ensureState();
  });

  var ensureState = function() {
    if(isStreaming) return;
    if(streams.length == 0) {
      if(streamsEnded)
        that.emit('end');
      return;
    }
    isStreaming = true;
    streams[0].on('data', onData);
    streams[0].on('end', onEnd);
    streams[0].resume();
  };

  var onData = function(data) {
    that.emit('data', data);
  };

  var onEnd = function() {
    isStreaming = false;
    streams[0].removeAllListeners('data');
    streams[0].removeAllListeners('end');
    streams.shift();
    ensureState();
  };
}

util.inherits(ConcatStream, EventEmitter);

We keep track of state with streams (the queue of streams; push to the back and shift from the front), isStreaming, and streamsEnded. When we get a new stream, we push it, and when a stream ends, we stop listening and shift it. When the stream of streams ends, we set streamsEnded.

On each of these events, we check the state we're in. If we're already streaming (piping a stream), we do nothing. If the queue is empty and streamsEnded is set, we emit the end event. If there is something in the queue, we resume it and listen to its events.

Note that pause and resume are advisory, so some streams may not behave correctly and would require buffering. This exercise is left to the reader.

Having done all of this, I would do the n=2 case by constructing an EventEmitter, creating a ConcatStream with it, and emitting two stream events followed by an end event. I'm sure it could be done more concisely, but we may as well use what we've got.
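
For instance, a sketch of that n = 2 usage (ConcatStream as written is a plain EventEmitter, so it is consumed through its 'data' and 'end' events rather than piped):

var fs = require('fs');

var emitter = new EventEmitter();
var concatenated = new ConcatStream(emitter);

var out = fs.createWriteStream('combined.txt');
concatenated.on('data', function(chunk) { out.write(chunk); });
concatenated.on('end', function() { out.end(); });

// feed it two streams, then signal that no more streams are coming
emitter.emit('stream', fs.createReadStream('file1.txt'));
emitter.emit('stream', fs.createReadStream('file2.txt'));
emitter.emit('end');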




Answer 5:


https://github.com/joepie91/node-combined-stream2 is a drop-in Streams2-compatible replacement for the combined-stream module (which is described above). It automatically wraps Streams1 streams.

Example code for combined-stream2:

var CombinedStream = require('combined-stream2');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));



Answer 6:


streamee.js is a set of stream transformers and composers based on node 1.0+ streams and includes a concatenate method:

var stream1ThenStream2 = streamee.concatenate([stream1, stream2]);



Answer 7:


In vanilla Node.js, using ECMAScript 2015+ and combining the good answers of Ivo and Feng.

The PassThrough class is a trivial Transform stream which does not modify the stream in any way.

const { PassThrough } = require('stream');

const concatStreams = (streamArray, streamCounter = streamArray.length) => streamArray
  .reduce((mergedStream, stream) => {
    // pipe each stream of the array into the merged stream
    // prevent the automated 'end' event from firing
    mergedStream = stream.pipe(mergedStream, { end: false });
    // rewrite the 'end' event handler
    // Every time one of the streams ends, the counter is decremented.
    // Once the counter reaches 0, the merged stream can emit its 'end' event.
    stream.once('end', () => --streamCounter === 0 && mergedStream.emit('end'));
    return mergedStream;
  }, new PassThrough());

Can be used like this:

const mergedStreams = concatStreams([stream1, stream2, stream3]);
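
The returned PassThrough is an ordinary readable stream, so it can then be piped onwards, e.g. (the file names are placeholders, and note that all sources are piped at once, so ordering relies on them delivering their data in sequence):

const fs = require('fs');

const mergedStreams = concatStreams([
  fs.createReadStream('file1.txt'),
  fs.createReadStream('file2.txt'),
  fs.createReadStream('file3.txt'),
]);

// write the combined output to a single file
mergedStreams.pipe(fs.createWriteStream('combined.txt'));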


Source: https://stackoverflow.com/questions/16431163/concatenate-two-or-n-streams
