Perl's Catalyst framework permits you to send a progressively flushed response over an open connection; for instance, you can use write_fh() on Catalyst::Response. I've started using Node.js, and I can't find how to do the equivalent.
If I want to send a big CSV file, on the order of 200 MB, is there a way to do that without buffering the whole file in memory? Granted, the client will time out if no data is sent within a certain amount of time, so a promise would be nice, but is there any way to do this?
When I try to do a res.send(text) in a callback, I get:

Express
500 Error: This socket has been ended by the other party

And it doesn't seem that Express.js supports an explicit socket.close() or anything of that ilk.
Here is an example:
exports.foo = function (req, res) {
  // client is assumed to be an already-connected node-postgres client
  var query = client.query("SELECT * FROM naics.codes");
  query.on('row', function (row) {
    // write a chunk to the response as each row arrives
    res.write("GOT A ROW");
  });
  query.on('end', function () {
    res.end();
    client.end();
  });
};
I would expect that to send "GOT A ROW" for each row, until the call to client.end(), signifying completion.
Express is built on the native HTTP module, which means res is an instance of http.ServerResponse, which inherits from the writable stream interface. That means you can do this:
var fs = require('fs');

app.get('/', function (req, res) {
  var stream = fs.createReadStream('./file.csv');
  stream.pipe(res);

  // or, instead of pipe(), use event handlers:
  // stream.on('data', function (data) {
  //   res.write(data);
  // });
  // stream.on('end', function () {
  //   res.end();
  // });
});
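Applied to the question's Postgres case, the same writable-stream interface lets you flush each row as it arrives instead of buffering the whole result set. A minimal sketch, assuming the pg-query-stream package, an already-connected node-postgres client named client, and an illustrative route path and CSV serialization (none of these come from the original answer):

var QueryStream = require('pg-query-stream');

app.get('/codes.csv', function (req, res) {
  res.setHeader('Content-Type', 'text/csv');
  // pg-query-stream turns the query into a readable stream of row objects
  var stream = client.query(new QueryStream('SELECT * FROM naics.codes'));
  stream.on('data', function (row) {
    // deliberately naive serialization; real CSV output needs quoting/escaping
    res.write(Object.keys(row).map(function (k) { return row[k]; }).join(',') + '\n');
  });
  stream.on('end', function () {
    res.end();
  });
  stream.on('error', function (err) {
    res.end();
  });
});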
The reason you can't use the res.send() method in Express for streaming is that it calls res.end() for you automatically, finishing the response after a single payload.
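For contrast, here is a minimal sketch of the progressive flushing the question asks about, using only res.write() and an explicit res.end(); the route name and timing are illustrative:

var express = require('express');
var app = express();

app.get('/progressive', function (req, res) {
  res.setHeader('Content-Type', 'text/plain');
  var i = 0;
  var timer = setInterval(function () {
    // each write() is flushed to the client as a chunk
    res.write('chunk ' + i + '\n');
    if (++i === 5) {
      clearInterval(timer);
      // end the response explicitly; res.send() would have ended it after one payload
      res.end();
    }
  }, 1000);
});

app.listen(3000);

Fetching the route with curl -N http://localhost:3000/progressive shows the chunks arriving one per second rather than all at once.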
Source: https://stackoverflow.com/questions/18857693/does-express-js-support-sending-unbuffered-progressively-flushed-responses