node-streams

wait for all streams to finish - stream a directory of files

只愿长相守 posted on 2019-12-04 02:23:52
I'm using client.upload in pkgcloud to upload a directory of files. What's the best way to execute a callback after all the streams have finished? Is there a built-in way to register each stream's "finish" event and execute a callback after they have all fired?

```js
var filesToUpload = fs.readdirSync("./local_path"); // will make this async

for (let file of filesToUpload) {
    var writeStream = client.upload({
        container: "mycontainer",
        remote: file
    });
    // seems like i should register finish events with something
    writeStream.on("finish", registerThisWithSomething);
    fs.createReadStream("./local_path/" + file).pipe(writeStream);
}
```
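
One straightforward approach (a sketch, assuming the `client` object and `./local_path` directory from the question; pkgcloud itself has no built-in "all finished" hook that I know of) is to wrap each upload in a promise that settles on the stream's "finish" or "error" event, then wait on all of them with Promise.all:

```js
var fs = require("fs");

// Assumes `client` is an already-configured pkgcloud storage client.
var filesToUpload = fs.readdirSync("./local_path");

// Wrap each upload stream in a promise that settles on "finish"/"error".
var uploads = filesToUpload.map(function (file) {
    return new Promise(function (resolve, reject) {
        var writeStream = client.upload({
            container: "mycontainer",
            remote: file
        });
        writeStream.on("finish", resolve);
        writeStream.on("error", reject);
        fs.createReadStream("./local_path/" + file).pipe(writeStream);
    });
});

// Runs once every stream has fired "finish" (or rejects on the first error).
Promise.all(uploads)
    .then(function () { console.log("all uploads finished"); })
    .catch(function (err) { console.error("an upload failed:", err); });
```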

How to pipe multiple readable streams, from multiple api requests, to a single writeable stream?

流过昼夜 posted on 2019-12-04 01:56:59
Contents: Desired Behaviour · Actual Behaviour · What I've Tried · Steps To Reproduce · Research

Desired Behaviour:

Pipe multiple readable streams, received from multiple API requests, to a single writeable stream. The API responses come from ibm-watson's textToSpeech.synthesize() method. Multiple requests are required because the service has a 5KB limit on text input, so a string of 18KB, for example, takes four requests to complete.

Actual Behaviour:

The writeable stream file is incomplete and garbled, and the application seems to 'hang'. When I try to open the incomplete .mp3
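
A common pattern for this (a sketch, not the asker's code: synthesizePart() below is a hypothetical helper standing in for one textToSpeech.synthesize() call per ≤5KB chunk of text, and the real ibm-watson response shape may differ) is to pipe each source into the destination with { end: false } so the writable stays open between parts, and end it only after the last source finishes:

```js
const fs = require("fs");

// Pipe several readable streams into one writable, strictly in order.
async function concatToFile(textChunks, outPath) {
    const output = fs.createWriteStream(outPath);
    for (const text of textChunks) {
        // Hypothetical helper: one API request resolving to a readable stream.
        const source = await synthesizePart(text);
        await new Promise((resolve, reject) => {
            source.pipe(output, { end: false }); // keep the destination open
            source.on("end", resolve);
            source.on("error", reject);
        });
    }
    output.end(); // close the .mp3 only after the final part has been written
}
```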

ffmpeg mp3 streaming via node js

偶尔善良 posted on 2019-12-02 14:32:45
Question:

```js
var fs = require('fs');
var child = require('child_process');
var http = require('http');

var input_file = fs.createReadStream('./remo.mp3');

http.createServer(function (req, res) {
    var args = [
        '-ss', '120',
        '-i', 'remo.mp3',
        '-f', 'mp3',
        'pipe:1' // Output on stdout
    ];
    var trans_proc = child.spawn('ffmpeg', args);
    res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
    trans_proc.stdout.pipe(res);
    trans_proc.stderr.on('data', function (err) {
        console.log(err.toString());
    });
}).listen(2000);

I am trying
```
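
One detail worth handling in a sketch like this (an addition, not part of the question): if the listener disconnects mid-stream, the spawned ffmpeg process keeps running unless it is killed, so each abandoned request leaks a process:

```js
var child = require('child_process');
var http = require('http');

http.createServer(function (req, res) {
    var trans_proc = child.spawn('ffmpeg', [
        '-ss', '120', '-i', 'remo.mp3', '-f', 'mp3', 'pipe:1'
    ]);
    res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
    trans_proc.stdout.pipe(res);

    // If the client goes away, stop transcoding instead of leaking ffmpeg.
    res.on('close', function () {
        trans_proc.kill('SIGKILL');
    });
}).listen(2000);
```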

Does Node.js Stream Transform maintain the order of the chunks?

微笑、不失礼 posted on 2019-12-01 07:42:52
Question: I see that Transform streams in the Node.js Stream API use an asynchronous function to transform chunks as they arrive: https://nodejs.org/api/stream.html#stream_transform_transform_chunk_encoding_callback

Does a Transform stream send the chunks out in the same order they arrive? With an asynchronous transform function, that is not obviously the case.

Answer 1: Short answer: yes, a Transform stream guarantees that chunks are sent in the same order as they arrive. _transform() is never called with the next chunk until the callback for the current chunk has been invoked, so results cannot complete out of order. (Because streams might be used
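
A quick way to convince yourself (a sketch; the random setTimeout delay is artificial and only simulates slow async work):

```js
const { Transform } = require("stream");

// Each chunk completes after a random delay, yet output order matches
// input order: the stream machinery does not call _transform for the
// next chunk until callback() has fired for the current one.
const upper = new Transform({
    transform(chunk, encoding, callback) {
        setTimeout(() => {
            callback(null, chunk.toString().toUpperCase());
        }, Math.random() * 100);
    }
});

upper.on("data", (chunk) => process.stdout.write(chunk));

upper.write("first ");
upper.write("second ");
upper.end("third\n"); // always prints: FIRST SECOND THIRD
```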

why does attempting to write a large file cause js heap to run out of memory

独自空忆成欢 posted on 2019-11-28 10:24:17
This code:

```js
const file = require("fs").createWriteStream("./test.dat");
for (var i = 0; i < 1e7; i++) {
    file.write("a");
}
```

gives this error message after running for about 30 seconds:

```
<--- Last few GCs --->
[47234:0x103001400]  27539 ms: Mark-sweep 1406.1 (1458.4) -> 1406.1 (1458.4) MB, 2641.4 / 0.0 ms  allocation failure GC in old space requested
[47234:0x103001400]  29526 ms: Mark-sweep 1406.1 (1458.4) -> 1406.1 (1438.9) MB, 1986.8 / 0.0 ms  last resort GC in old space requested
[47234:0x103001400]  32154 ms: Mark-sweep 1406.1 (1438.9) -> 1406.1 (1438.9) MB, 2628.3 / 0.0 ms  last resort GC in old space requested
```
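
The excerpt stops before the usual explanation: file.write() returns false once the stream's internal buffer is full, and every subsequent write is queued in memory until the buffer drains, so a tight synchronous loop that ignores the return value accumulates millions of queued writes on the heap. A minimal sketch of respecting that backpressure (an addition, not from the question):

```js
const file = require("fs").createWriteStream("./test.dat");

// Write `count` characters, pausing whenever write() reports
// backpressure and resuming on the "drain" event.
function writeMany(count, done) {
    let i = 0;
    (function writeChunk() {
        while (i < count) {
            i++;
            const ok = file.write("a");
            if (!ok) {
                // Buffer is full: wait for it to flush before continuing.
                file.once("drain", writeChunk);
                return;
            }
        }
        file.end(done);
    })();
}

writeMany(1e7, () => console.log("done"));
```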

Pipe a stream to s3.upload()

核能气质少年 posted on 2019-11-27 10:33:46
I'm currently making use of a node.js plugin called s3-upload-stream to stream very large files to Amazon S3. It uses the multipart API, and for the most part it works very well. However, this module is showing its age and I've already had to make modifications to it (the author has deprecated it as well). Today I ran into another issue with Amazon, and I would really like to take the author's recommendation and start using the official aws-sdk to accomplish my uploads. But the official SDK does not seem to support piping to s3.upload(). The nature of s3.upload is that you have to pass the
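
A widely used workaround (a sketch against the aws-sdk v2 API; the bucket, key, and file path below are placeholders) is to hand s3.upload() a PassThrough stream as the Body and pipe your source into it. Unlike putObject, upload() accepts a stream of unknown length:

```js
const AWS = require("aws-sdk");
const stream = require("stream");
const fs = require("fs");

const s3 = new AWS.S3();

// Create a PassThrough, give it to s3.upload() as the Body, and return
// both the writable end and the upload's promise. Names are placeholders.
function uploadStreamTo(bucket, key) {
    const pass = new stream.PassThrough();
    const promise = s3
        .upload({ Bucket: bucket, Key: key, Body: pass })
        .promise();
    return { writeStream: pass, promise };
}

const { writeStream, promise } = uploadStreamTo("my-bucket", "my-key");
fs.createReadStream("./large-file.dat").pipe(writeStream);

promise
    .then((result) => console.log("uploaded to", result.Location))
    .catch((err) => console.error("upload failed:", err));
```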