node-streams

Node.js Streaming/Piping Error Handling (Change Response Status on Error)

Submitted by 巧了我就是萌 on 2021-02-08 19:57:46
Question: I have millions of rows in my Cassandra DB that I want to stream to the client in a zip file (I don't want a potentially huge zip file in memory). I am using the stream() function from the Cassandra Node driver, piping to a Transform stream which extracts the one field from each row that I care about and appends a newline, which pipes to the archive, which pipes to the Express Response object. This seems to work fine, but I can't figure out how to properly handle errors during streaming. I have to set the…
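One commonly suggested pattern for this situation (a sketch, not the asker's exact code) is to attach an error handler to every stage and branch on res.headersSent: while headers are unsent the status can still be changed; afterwards the only honest option is to destroy the connection so the client sees a truncated download rather than a "successful" corrupt zip. The client, query, and column name below are placeholders.

```javascript
// Sketch only: error handling for a Cassandra -> Transform -> archiver -> Express chain.
// `client`, `query`, and the column `my_field` are placeholders, not the asker's code.
const { Transform } = require('stream');
const archiver = require('archiver');

function streamZip(res, client, query) {
  const rows = client.stream(query, [], { prepare: true }); // cassandra-driver readable stream
  const toLines = new Transform({
    objectMode: true,
    transform(row, _enc, cb) { cb(null, row.my_field + '\n'); } // keep one field per line
  });

  const fail = (err) => {
    console.error('stream failed:', err);
    if (!res.headersSent) {
      res.status(500).end();   // headers not sent yet: the status can still change
    } else {
      res.destroy(err);        // headers already sent: abort so the client sees a broken download
    }
  };

  const archive = archiver('zip');
  rows.on('error', fail);
  toLines.on('error', fail);
  archive.on('error', fail);

  res.attachment('rows.zip');
  archive.append(rows.pipe(toLines), { name: 'rows.txt' });
  archive.pipe(res);
  archive.finalize();
}
```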

Response streaming in Express does not work in Azure App Service

Submitted by 白昼怎懂夜的黑 on 2021-01-03 22:36:39
Question: I am trying to stream responses to my client using a Node.js Express server hosted on Azure App Service. However, I noticed that it is not really streaming but tries to send the response as a whole. When the response size is huge (>50MB), the client gets an Internal Server Error, but the server does not throw an error. Further, when I run the server inside Docker (Node image: 10.22.0-alpine3.9), I see that the client gets the response as a stream even for huge responses. (This is the…
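Buffering at the Azure front end is outside the app's control, but a sketch like the following helps confirm the Node side really is streaming: send headers explicitly, write bounded chunks on a timer, and rely on chunked transfer encoding. The headers shown are common anti-buffering hints; whether the App Service proxy honours them is an assumption to verify.

```javascript
// Sketch only: a route that demonstrably streams ~100MB in 64KB chunks.
// The headers are common "don't buffer" hints for proxies; honouring them is up to the front end.
const express = require('express');
const app = express();

app.get('/big', (req, res) => {
  res.set({
    'Content-Type': 'application/octet-stream',
    'Cache-Control': 'no-cache',
    'X-Accel-Buffering': 'no'
  });
  res.flushHeaders();                       // commit headers now; transfer becomes chunked

  const chunk = Buffer.alloc(64 * 1024, 'a');
  let sent = 0;
  const timer = setInterval(() => {
    res.write(chunk);
    sent += chunk.length;
    if (sent >= 100 * 1024 * 1024) {        // stop after ~100MB
      clearInterval(timer);
      res.end();
    }
  }, 10);

  req.on('close', () => clearInterval(timer)); // client went away: stop writing
});

app.listen(process.env.PORT || 3000);
```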

Node.js: how to handle a fast producer and a slow consumer with backpressure

Submitted by 戏子无情 on 2020-02-02 14:55:11
Question: I'm a novice in Node.js and don't understand the documentation about streams; hoping to get some tips. I'm reading a very large file line by line, and then for each line I'm calling an async network API. Obviously the local file is read much faster than the async calls are completed: var lineReader = require('readline').createInterface({ input: require('fs').createReadStream(program.input) }); lineReader.on('line', function (line) { client.execute(query, [line], function(err, result) { // needs to…
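A common way to get backpressure here (assuming Node 11+, and reusing the question's client.execute and query names) is to consume the readline interface with for await...of and await each network call, so the file is only read as fast as the slow consumer can keep up:

```javascript
// Sketch only (Node 11+): let for await...of apply backpressure so the file
// is consumed no faster than each awaited network call completes.
const fs = require('fs');
const readline = require('readline');

async function processFile(inputPath, client, query) {
  const lineReader = readline.createInterface({
    input: fs.createReadStream(inputPath),
    crlfDelay: Infinity
  });

  for await (const line of lineReader) {
    // Awaiting here pauses the iteration, and therefore the underlying read stream.
    await new Promise((resolve, reject) => {
      client.execute(query, [line], (err, result) => (err ? reject(err) : resolve(result)));
    });
  }
}
```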

How to add additional data with data read from a node stream in read callback handler?

Submitted by 穿精又带淫゛_ on 2020-01-25 07:21:07
Question: I'm creating an array of Readable streams (from files containing JSON docs) and am trying to pipe them to another stream. The data in the files is coming through… but for every object I receive in the piped-to stream, I would like to know which file the data originated from: var fs = require('fs'); var path = require('path'); var JSONStream = require('JSONStream'); var tmp1 = path.join(__dirname, 'data', 'tmp1.json'); var tmp2 = path.join(__dirname, 'data', 'tmp2.json'); var jsonStream…
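One way to achieve this, sketched below under the question's file layout, is to give every file its own small Transform that stamps each parsed object with its source path before the streams are merged into a shared object-mode PassThrough:

```javascript
// Sketch only: tag every parsed object with the file it came from before merging.
// File paths follow the question; `out` is a placeholder merge target.
const fs = require('fs');
const path = require('path');
const { Transform, PassThrough } = require('stream');
const JSONStream = require('JSONStream');

const files = [
  path.join(__dirname, 'data', 'tmp1.json'),
  path.join(__dirname, 'data', 'tmp2.json')
];

const out = new PassThrough({ objectMode: true });

for (const file of files) {
  fs.createReadStream(file)
    .pipe(JSONStream.parse('*'))
    .pipe(new Transform({
      objectMode: true,
      transform(obj, _enc, cb) {
        cb(null, { ...obj, sourceFile: file });  // remember which file the doc came from
      }
    }))
    .pipe(out, { end: false });                   // don't let the first file end the merged stream
}

out.on('data', (doc) => console.log(doc.sourceFile, doc));
```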

How to pipe multiple ReadableStreams to a single WriteStream?

Submitted by 强颜欢笑 on 2019-12-25 00:19:33
Question: I'm dealing with a firewall limit where I can only POST 10MB at a time. In order to handle larger uploads, I'd like to use something like http://www.resumablejs.com, write multiple chunks to disk, and recombine them at the end. I'm just writing tests now, but something in my implementation is wrong. First, I split the file like this: const splitFile = async () => { const chunkSize = 1024 * 1024; const photo = fs.createReadStream(path.resolve(FIXTURES, 'hello-tron.jpg')); // Write to 2 files…
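A minimal sketch of the recombination step (chunk file names are placeholders): pipe each chunk into the same write stream with { end: false }, wait for it to finish, and only end the destination after the last chunk:

```javascript
// Sketch only: concatenate chunk files into one output, strictly in order.
// `part-0.bin`/`part-1.bin` and `recombined.jpg` are placeholder names.
const fs = require('fs');

function appendStream(source, dest) {
  return new Promise((resolve, reject) => {
    source.pipe(dest, { end: false });  // keep the destination open for the next chunk
    source.on('end', resolve);
    source.on('error', reject);
  });
}

async function recombine(chunkPaths, outPath) {
  const out = fs.createWriteStream(outPath);
  for (const p of chunkPaths) {
    await appendStream(fs.createReadStream(p), out); // one chunk at a time
  }
  out.end();                                         // close only after the last chunk
}

recombine(['part-0.bin', 'part-1.bin'], 'recombined.jpg').catch(console.error);
```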

Why does attempting to write a large file cause the JS heap to run out of memory?

Submitted by 风流意气都作罢 on 2019-12-17 19:36:28
Question: This code const file = require("fs").createWriteStream("./test.dat"); for(var i = 0; i < 1e7; i++){ file.write("a"); } gives this error message after running for about 30 seconds: <--- Last few GCs ---> [47234:0x103001400] 27539 ms: Mark-sweep 1406.1 (1458.4) -> 1406.1 (1458.4) MB, 2641.4 / 0.0 ms allocation failure GC in old space requested [47234:0x103001400] 29526 ms: Mark-sweep 1406.1 (1458.4) -> 1406.1 (1438.9) MB, 1986.8 / 0.0 ms last resort GC in old space requested [47234:0x103001400]…
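The loop never yields and never checks write()'s return value, so every "a" is buffered in memory. A minimal sketch of the same loop with backpressure respected: stop when write() returns false and continue on 'drain':

```javascript
// Sketch only: the same 10 million writes, but paused whenever the internal
// buffer is full and resumed on 'drain', so memory stays bounded.
const file = require('fs').createWriteStream('./test.dat');

let i = 0;
const total = 1e7;

function writeSome() {
  while (i < total) {
    i++;
    if (!file.write('a')) {            // false: the buffer is above the high-water mark
      file.once('drain', writeSome);   // resume once it has flushed to disk
      return;
    }
  }
  file.end();
}

writeSome();
```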

Pipe a stream to s3.upload()

Submitted by 蓝咒 on 2019-12-17 08:05:44
Question: I'm currently making use of a Node.js plugin called s3-upload-stream to stream very large files to Amazon S3. It uses the multipart API and for the most part it works very well. However, this module is showing its age and I've already had to make modifications to it (the author has deprecated it as well). Today I ran into another issue with Amazon, and I would really like to take the author's recommendation and start using the official aws-sdk to accomplish my uploads. BUT. The official SDK…
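The pattern most often recommended for the official SDK (v2) is to hand s3.upload() a PassThrough stream as the Body and pipe into it; a sketch, with bucket, key, and file names as placeholders:

```javascript
// Sketch only: the PassThrough pattern with the official aws-sdk (v2).
// Bucket, key, and source file names are placeholders.
const fs = require('fs');
const { PassThrough } = require('stream');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

function uploadStream(Bucket, Key) {
  const pass = new PassThrough();
  const promise = s3.upload({ Bucket, Key, Body: pass }).promise();
  return { writeStream: pass, promise };   // pipe into writeStream, await promise
}

const { writeStream, promise } = uploadStream('my-bucket', 'big-file.dat');
fs.createReadStream('./big-file.dat').pipe(writeStream);

promise
  .then((data) => console.log('Uploaded to', data.Location))
  .catch(console.error);
```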

AWS Lambda & Node: Write data while streaming - ends prematurely and data is missing

Submitted by 狂风中的少年 on 2019-12-11 07:49:04
Question: I've got a Lambda function that is triggered by a write to an S3 bucket. It reads the JSON file that is written to the bucket, parses out the individual records, and writes them to a database. The problem is, I'm not sure what I'm doing wrong, because the stream ends and the Lambda exits before all the data is written. I'm in "flowing mode" on my readable stream, and I'm pausing/resuming during the db write. According to the docs, this should do the trick, but it's not working as expected. Lambda…
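One way to rule out the early exit (a sketch assuming the aws-sdk v2 S3 client and, for illustration, a newline-delimited JSON file) is to drive the read stream with for await...of inside an async handler and resolve only after every database write has settled; writeToDb is a placeholder for the real insert:

```javascript
// Sketch only: assumes aws-sdk v2 and, for illustration, one JSON record per line.
// `writeToDb` is a placeholder for the real database insert.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

const writeToDb = async (record) => { /* replace with the real insert */ };

exports.handler = async (event) => {
  const { bucket, object } = event.Records[0].s3;
  const body = s3
    .getObject({ Bucket: bucket.name, Key: object.key })
    .createReadStream();

  let tail = '';
  const pending = [];

  for await (const chunk of body) {                    // async iteration handles pause/resume
    const lines = (tail + chunk.toString('utf8')).split('\n');
    tail = lines.pop();                                 // keep any incomplete trailing line
    for (const line of lines.filter(Boolean)) {
      pending.push(writeToDb(JSON.parse(line)));
    }
  }
  if (tail.trim()) pending.push(writeToDb(JSON.parse(tail)));

  await Promise.all(pending);   // the handler's promise resolves only after every write settles
};
```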