How to transfer/stream big data from/to child processes in node.js without using the blocking stdio?

臣服心动 asked 2020-12-14 22:57

I have a bunch of (child) processes in node.js that need to transfer large amounts of data.

When I read the manual, it says that the stdio and IPC interfaces between parent and child are blocking, so they don't look suitable for streaming that data.

1 Answer
  • 2020-12-14 23:42

    I found a solution that seems to work: when spawning the child process you can pass options for stdio and set up a pipe to stream data.

    The trick is to add an additional element to the stdio array and set it to 'pipe'.

    In the parent process, stream to and from child.stdio[3].

    var child_process = require('child_process');
    
    var opts = {
        // keep the standard streams and add a fourth entry as an extra pipe (fd 3)
        stdio: [process.stdin, process.stdout, process.stderr, 'pipe']
    };
    var child = child_process.spawn('node', ['./child.js'], opts);
    
    // send data (mySource is whatever readable stream holds your data)
    mySource.pipe(child.stdio[3]);
    
    // read data (myHandler is whatever writable stream consumes it)
    child.stdio[3].pipe(myHandler);
    

    In the child process, open streams on file descriptor 3.

    var fs = require('fs');
    
    // read from it
    var readable = fs.createReadStream(null, {fd: 3});
    
    // write to it
    var writable = fs.createWriteStream(null, {fd: 3});
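
    Put together, a minimal child.js sketch under the same assumptions (the echo at the end is just a placeholder for whatever the child actually does with the data):

    // child.js - assumes the parent passed 'pipe' as the fourth stdio entry
    var fs = require('fs');
    
    // data the parent pipes to us arrives on fd 3
    var readable = fs.createReadStream(null, {fd: 3});
    
    // data we pipe back to the parent also goes out on fd 3
    var writable = fs.createWriteStream(null, {fd: 3});
    
    // placeholder processing: echo everything straight back to the parent
    readable.pipe(writable);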
    

    Note that not every stream you get from npm works correctly with this. I tried JSONStream.stringify() directly and it produced errors, but it worked once I piped it through through2 first (no idea why that is).
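
    A minimal sketch of that workaround, assuming myObjectSource is some object-mode stream of the data to send and child is the process spawned above:

    var JSONStream = require('JSONStream');
    var through2 = require('through2');
    
    myObjectSource
        .pipe(JSONStream.stringify())   // objects -> JSON text
        .pipe(through2())               // plain pass-through, works around the errors
        .pipe(child.stdio[3]);          // into the extra pipe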

    Edit: some observations: the pipe does not always seem to be a Duplex stream, so you may need two separate pipes, one for each direction. And there is something odd where, in one case, it only worked when I also added an IPC channel, giving six stdio entries in total: [stdin, stdout, stderr, pipe, pipe, ipc].
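
    For reference, a sketch of that six-entry setup, using fd 3 for parent-to-child data and fd 4 for child-to-parent (mySource and myHandler are placeholders as above):

    var opts = {
        stdio: [process.stdin, process.stdout, process.stderr, 'pipe', 'pipe', 'ipc']
    };
    var child = child_process.spawn('node', ['./child.js'], opts);
    
    mySource.pipe(child.stdio[3]);    // child reads this via fs.createReadStream(null, {fd: 3})
    child.stdio[4].pipe(myHandler);   // child writes this via fs.createWriteStream(null, {fd: 4})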
