I have a bunch of child processes in Node.js that need to transfer large amounts of data. When I read the manual, it says the stdio and IPC interfaces between parent and child aren't suited for transferring large amounts of data.
I found a solution that seems to work: when spawning the child process you can pass options for stdio and set up a pipe to stream data. The trick is to add an additional element to the stdio array and set it to 'pipe'. In the parent process, stream to and from child.stdio[3]:
var child_process = require('child_process');

var opts = {
  stdio: [process.stdin, process.stdout, process.stderr, 'pipe']
};
var child = child_process.spawn('node', ['./child.js'], opts);

// send data to the child
mySource.pipe(child.stdio[3]);
// read data from the child
child.stdio[3].pipe(myHandler);
In the child, open a stream for file descriptor 3:
var fs = require('fs');

// read from it
var readable = fs.createReadStream(null, {fd: 3});
// write to it
var writable = fs.createWriteStream(null, {fd: 3});
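Putting it together, here's a minimal end-to-end sketch (parent.js and child.js are just example file names):

// parent.js: spawn the child with an extra 'pipe' entry and write to fd 3
var child_process = require('child_process');

var child = child_process.spawn('node', ['./child.js'], {
  stdio: ['inherit', 'inherit', 'inherit', 'pipe']
});

child.stdio[3].write('hello over fd 3\n');
child.stdio[3].end(); // close our end so the child sees end-of-stream

// child.js: read whatever arrives on fd 3 and echo it to stdout
var fs = require('fs');

var readable = fs.createReadStream(null, {fd: 3});
readable.pipe(process.stdout);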
Note that not every stream you get from npm works correctly over this channel: JSONStream.stringify() gave me errors when piped directly, but it worked once I piped it through through2 first (no idea why that is).
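For example, something like this worked for me (a sketch; through2() with no arguments is just a simple pass-through transform, and mySource is the placeholder object stream from above):

var JSONStream = require('JSONStream');
var through2 = require('through2');

// piping via a pass-through transform first avoided the errors
mySource
  .pipe(JSONStream.stringify())
  .pipe(through2())
  .pipe(child.stdio[3]);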
Edit: some observations: it seems the pipe is not always a Duplex stream, so you might need two pipes, one per direction. And there is something weird going on where, in one case, it only works if I also have an IPC channel, so six stdio entries in total: [stdin, stdout, stderr, pipe, pipe, ipc].
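A sketch of that six-entry layout, using fd 3 for parent-to-child data and fd 4 for child-to-parent data (again with the placeholder streams from above):

// parent: one pipe per direction, plus an IPC channel
var child_process = require('child_process');

var child = child_process.spawn('node', ['./child.js'], {
  stdio: ['inherit', 'inherit', 'inherit', 'pipe', 'pipe', 'ipc']
});

mySource.pipe(child.stdio[3]);   // parent -> child over fd 3
child.stdio[4].pipe(myHandler);  // child -> parent over fd 4

In the child, fd 3 then becomes the readable end and fd 4 the writable end:

var fs = require('fs');
var input  = fs.createReadStream(null, {fd: 3});
var output = fs.createWriteStream(null, {fd: 4});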