Concatenate two (or n) streams

2020-12-24 02:34
  • 2 streams:

    Given readable streams stream1 and stream2, what's an idiomatic (concise) way to get a stream containing the two concatenated?

  • 2020-12-24 03:02

    This can now be easily done using async iterators

    async function* concatStreams(readables) {
      for (const readable of readables) {
        for await (const chunk of readable) { yield chunk }
      }
    } 
    

    And you can use it like this

    const fs = require('fs')
    const stream = require('stream')

    const files = ['file1.txt', 'file2.txt', 'file3.txt']
    // calling the async generator returns an async iterable right away; no await is needed here
    const iterable = concatStreams(files.map(f => fs.createReadStream(f)))

    // convert the async iterable into a readable stream
    const mergedStream = stream.Readable.from(iterable)
    

    More info regarding async iterators: https://2ality.com/2019/11/nodejs-streams-async-iteration.html
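
    For completeness, a quick sketch of consuming the merged stream; the output path combined.txt is an assumption for illustration, not part of the original answer:

    const fs = require('fs')
    const stream = require('stream')

    const sources = ['file1.txt', 'file2.txt'].map(f => fs.createReadStream(f))
    const merged = stream.Readable.from(concatStreams(sources))

    // write the concatenated contents to a single (hypothetical) output file
    merged.pipe(fs.createWriteStream('combined.txt'))
      .on('finish', () => console.log('done'))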

  • 2020-12-24 03:04

    In vanilla Node.js, using ECMAScript 2015+ and combining the good answers of Ivo and Feng.

    The PassThrough class is a trivial Transform stream that passes data through without modifying it in any way, as the short sketch below illustrates.
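
    A minimal sketch of that identity behaviour, with made-up string literals purely for illustration:

    const { PassThrough } = require('stream');

    const pt = new PassThrough();
    pt.on('data', chunk => console.log(chunk.toString())); // logs exactly what was written
    pt.write('hello ');
    pt.write('world');
    pt.end();

    Combining several streams into a single PassThrough then looks like this: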

    const { PassThrough } = require('stream');
    
    const concatStreams = (streamArray, streamCounter = streamArray.length) => streamArray
      .reduce((mergedStream, stream) => {
        // pipe each stream of the array into the merged stream
        // prevent the automated 'end' event from firing
        mergedStream = stream.pipe(mergedStream, { end: false });
        // rewrite the 'end' event handler
        // Every time one of the stream ends, the counter is decremented.
        // Once the counter reaches 0, the mergedstream can emit its 'end' event.
        stream.once('end', () => --streamCounter === 0 && mergedStream.emit('end'));
        return mergedStream;
      }, new PassThrough());
    

    Can be used like this:

    const mergedStreams = concatStreams([stream1, stream2, stream3]);
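
    And a hedged sketch of consuming the result; the file-backed sources and the listeners are assumptions for illustration:

    const fs = require('fs');

    const merged = concatStreams([
      fs.createReadStream('part1.txt'),
      fs.createReadStream('part2.txt'),
      fs.createReadStream('part3.txt'),
    ]);

    merged.on('data', chunk => process.stdout.write(chunk));
    // fires once the counter above reaches 0 and 'end' is re-emitted manually
    merged.on('end', () => console.error('all inputs finished'));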
    
  • 2020-12-24 03:10

    The code below worked for me :). I've taken inputs from all the answers given earlier.

    const { PassThrough } = require('stream')

    const pipeStreams = (streams) => {
      const out = new PassThrough()
      // Pipe the first stream into the out stream,
      // preventing the automatic 'end' event of the out stream from firing
      streams[0].pipe(out, { end: false })
      for (let i = 0; i < streams.length - 2; i++) {
        // When each stream before the second-last ends, pipe the following stream into the out stream,
        // again preventing the automatic 'end' event of the out stream from firing
        streams[i].on('end', () => {
          streams[i + 1].pipe(out, { end: false })
        })
      }
      // When the second-last stream ends, pipe the last stream into the out stream.
      // This time, don't prevent the 'end' event from firing
      streams[streams.length - 2].on('end', () => {
        streams[streams.length - 1].pipe(out)
      })
      return out
    }
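
    A usage sketch under the assumption of file-backed inputs; note that, as written, the helper expects at least two streams because it indexes streams[streams.length - 2]:

    const fs = require('fs')

    const out = pipeStreams([
      fs.createReadStream('a.txt'),
      fs.createReadStream('b.txt'),
      fs.createReadStream('c.txt'),
    ])

    out.pipe(fs.createWriteStream('abc.txt'))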
    
  • 2020-12-24 03:13

    Both of the most upvoted answers here don't work with asynchronous streams, because they pipe every source right away regardless of whether it is ready to produce data. I had to combine in-memory string streams with data fed from a database, and the database content always ended up at the end of the resulting stream because it takes a moment to get a response. Here's what I ended up writing for my purposes.

    import { PassThrough, Readable } from 'stream';

    export function joinedStream(...streams: Readable[]): Readable {
      function pipeNext(): void {
        const nextStream = streams.shift();
        if (nextStream) {
          nextStream.pipe(out, { end: false });
          nextStream.on('end', function() {
            pipeNext();
          });
        } else {
          out.end();
        }
      }
      const out = new PassThrough();
      pipeNext();
      return out;
    }
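
    A sketch of how this preserves order even when one source is slow; the delayed stream below is a stand-in for the database feed and is purely illustrative:

    // a source that only produces data after a delay, standing in for the db stream
    const slow = new Readable({ read() {} });
    setTimeout(() => { slow.push('from the database\n'); slow.push(null); }, 1000);

    const fast = Readable.from(['in-memory data\n']);

    // even though 'fast' is ready immediately, its data only appears after the
    // slow stream has finished, because each source is piped one after another
    joinedStream(slow, fast).pipe(process.stdout);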
    