node.js-stream

Creating a Node.js stream from two piped streams

久未见 submitted on 2019-11-27 01:21:07

Question: I'd like to combine two Node.js streams into one by piping them, if possible. I'm using Transform streams. In other words, I'd like my library to return a single myStream for people to use. For example, they could write:

    process.stdin.pipe(myStream).pipe(process.stdout);

Internally I'm using a third-party vendorStream that does some work, plugged into my own logic contained in myInternalStream. So the above would translate to:

    process.stdin.pipe(vendorStream).pipe(myInternalStream).pipe(process.stdout);

How to wrap a buffer as a stream2 Readable stream?

情到浓时终转凉″ submitted on 2019-11-27 00:39:43

Question: How can I turn a Node.js Buffer into a Readable stream using the streams2 interface? I already found this answer and the stream-buffers module, but that module is based on the streams1 interface.

Answer: With streamifier you can convert strings and buffers to readable streams with the new stream API.

Answer (zjonsson): The easiest way is probably to create a new PassThrough stream instance and simply push your data into it. When you pipe it to other streams, the data will be pulled out of the first stream:

    var stream = require('stream');

    // Initiate the source
    var bufferStream = new stream.PassThrough();

How to pipe one readable stream into two writable streams at once in Node.js?

二次信任 submitted on 2019-11-26 20:20:52

Question: The goal is to:

1. Create a file read stream.
2. Pipe it to gzip (zlib.createGzip()).
3. Then pipe the zlib output to:
   1) the HTTP response object, and
   2) a writable file stream, to save the gzipped output.

So far I can get as far as 3.1:

    var gzip = zlib.createGzip(),
        sourceFileStream = fs.createReadStream(sourceFilePath),
        targetFileStream = fs.createWriteStream(targetFilePath);

    response.setHeader('Content-Encoding', 'gzip');
    sourceFileStream.pipe(gzip).pipe(response);

… which works fine, but I […]

Node.js: splitting stream content into n parts

前提是你 submitted on 2019-11-26 17:24:14

Question: I'm trying to understand Node streams and their life cycle, so I want to split the content of a stream into n parts. The code below is just to explain my intentions and to show that I have already tried something myself; I omitted some details. I have a stream that just generates some data (a sequence of numbers):

    class Stream extends Readable {
      constructor() {
        super({objectMode: true, highWaterMark: 1})
        this.counter = 0
      }
      _read(size) {
        if(this.counter === 30) {
          this.push(null)
        } else {
          this […]

Node.js Piping the same readable stream into multiple (writable) targets

扶醉桌前 submitted on 2019-11-26 15:19:04

I need to run two commands in series that both need to read data from the same stream. After piping a stream into another, the buffer is emptied, so I can't read data from that stream again. This doesn't work:

    var spawn = require('child_process').spawn;
    var fs = require('fs');
    var request = require('request');

    var inputStream = request('http://placehold.it/640x360');
    var identify = spawn('identify', ['-']);

    inputStream.pipe(identify.stdin);

    var chunks = [];
    identify.stdout.on('data', function(chunk) {
      chunks.push(chunk);
    });
    identify.stdout.on('end', function() {
      var size = getSize(Buffer.concat […]