NodeJs: slow req.pipe

Submitted by 时间秒杀一切 on 2019-12-24 13:50:26

Question


I've discovered that the server implementation of tus (https://tus.io) for Node.js (https://github.com/tus/tus-node-server) is really slow compared with the Go implementation (https://github.com/tus/tusd).

Here is a comparison between the two implementations (running locally, same machine, same input):

nodejs:

[2019-01-31 16:22:45,578] INFO Uploading 52428800 bytes chunk from offset: 104857600
[2019-01-31 16:22:47,329] INFO Total bytes sent: 157286400 (kb/s: 29930)

go:

[2019-01-31 16:26:31,894] INFO Uploading 52428800 bytes chunk from offset: 104857600
[2019-01-31 16:26:32,348] INFO Total bytes sent: 209715200 (kb/s: 115639)

I've explored the tus-node-server codebase and then built a really simplified implementation of the server (I tried to reduce the possible overhead).

This is the code:

const fs = require('fs');
const express = require('express');
const app = express();

let offset = 0;
let len = Math.pow(2,30);

app.post('/files',(req,res) => {
    console.log("post received");
    res.set({
        'Location': 'http://localhost:8888/files/test',
        'Tus-Resumable': '1.0.0',
    });
    res.status(201).end();
});

app.options('/files',(req,res) => {
    console.log("options received");
    res.set({
        'Location': 'http://localhost:8888/files/test',
        'Tus-Resumable': '1.0.0',
        'Tus-Version': '1.0.0,0.2.2,0.2.1'
    });
    res.status(200).end();
});

app.head('/files/test',(req,res) => {
    console.log("head received");
    res.set({
        'Upload-Offset': offset,
        'Upload-Length': len
    });
    res.status(200).end();
});

app.patch('/files/test',(req, res) => {
    let localOffset = parseInt(req.get('Upload-Offset'), 10);
    // the file is pre-created
    const path = `./file.tmp`;
    const options = {
        flags: 'r+',
        start: localOffset
    };

    const stream = fs.createWriteStream(path, options);

    let new_offset = 0;
    req.on('data', (buffer) => {
        new_offset += buffer.length;
    });


    return req.pipe(stream).on('finish', () => {

        localOffset += new_offset;

        offset = localOffset;

        res.set({
            'Upload-Offset': offset,
            'Upload-Length': len
        });
        res.status(204).end();
    });


});

const host = 'localhost';
const port = 8888;
app.listen(port, host, (err, resp) => {
    if(err) {
        console.error(err);
        return
    }
    console.log('listening')
});

I think that the poor performance is due to the following code block:

const stream = fs.createWriteStream(path, options);
req.pipe(stream)

I've also checked file copying using a pipe and I got good performance (similar to the Go implementation):

const fs = require('fs');
const path = require('path');
const from = path.normalize(process.argv[2]);
const to = path.normalize(process.argv[3]);

const readOpts = {}; // {highWaterMark: Math.pow(2,16)};
const writeOpts ={}; // {highWaterMark: Math.pow(2,16)};

const startTs = Date.now();
const source = fs.createReadStream(from, readOpts);
const dest = fs.createWriteStream(to, writeOpts);
let offset = 0;

source.on('data', (buffer) => {
    offset += buffer.length;
});

dest.on('error', (e) => {
    console.log('[FileStore] write: Error', e);
});

source.pipe(dest).on('finish',() => {
    const endTs = Date.now();
    const kbs = (offset / (endTs - startTs)) / 1000;
    console.log("SPEED: ", kbs, offset);
});

So the bottleneck seems to be the processing of the request and the piping.

Could you please help me understand what happens and why it is so slow compared with the Go version?


Answer 1:


I think you have a highWaterMark problem here.

The difference between your tests is due to:

  • req has a highWaterMark of 16 KiB
  • createReadStream has a default highWaterMark of 64 KiB

You can see the value by adding:

console.log('readableHighWaterMark', req.readableHighWaterMark);

Instead, assuming your network latency is negligible (because you are on localhost), you can try to create the write stream with a bigger highWaterMark:

const options = {
    flags: 'r+', // 'r+' so the pre-created file is not truncated
    start: localOffset,
    highWaterMark: 1048576
};
const stream = fs.createWriteStream(path, options);

This should speed up the writes but will cost more RAM.



Source: https://stackoverflow.com/questions/54464491/nodejs-slow-req-pipe
