Question
I need to copy a large data file to another destination with some modifications. fs.readFile and fs.writeFile are very slow. I need to read it line by line, modify the lines, and write them to a new file. I found something like this:
fs.stat(sourceFile, function(err, stat){
    var filesize = stat.size;
    var readStream = fs.createReadStream(sourceFile);
    // HERE I want to do some modifications to the bytes
    readStream.pipe(fs.createWriteStream(destFile));
})
But how do I make the modifications? I tried to get the data with the data event:
readStream.on('data', function(chunk){
    var str = chunk.toString();       // convert the buffer to a string
    str = str.replace('hello', '');   // replace() returns a new string
    // How to write ???
});
but I don't understand how to write it to the new file.
Answer 1:
You should use a Transform stream and pipes, like this:
fs.createReadStream('input/file.txt')
    .pipe(new YourTransformStream())
    .pipe(fs.createWriteStream('output/file.txt'))
Then it's just a matter of implementing the Transform stream as described in the Node.js stream documentation.
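For illustration, here is a minimal sketch of what such a Transform stream could look like. The class name YourTransformStream is just the placeholder from the pipe example above, and removing "hello" stands in for whatever modification you actually need:

const { Transform } = require('stream');

class YourTransformStream extends Transform {
    _transform(chunk, encoding, callback) {
        // chunk is a Buffer: convert it to a string, modify it,
        // and pass the result downstream
        var modified = chunk.toString().replace(/hello/g, '');
        callback(null, modified);
    }
}

Keep in mind that _transform (like the data event) sees arbitrary chunks, so a pattern split across a chunk boundary would be missed; splitting on newlines first, as the scramjet example below does, avoids that problem.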
You can also make this easier using scramjet, like this:
const { StringStream } = require('scramjet');

fs.createReadStream('input/file.txt')
    .pipe(new StringStream('utf-8'))
    .split('\n')                                          // split every line
    .map(async (line) => await makeYourChangesTo(line))   // update the lines
    .join('\n')                                           // join again
    .pipe(fs.createWriteStream('output/file.txt'))
Which I suppose is easier than doing that manually.
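For completeness, makeYourChangesTo in the snippet above is a placeholder for your own per-line logic; a hypothetical implementation might be as simple as:

async function makeYourChangesTo(line) {
    // hypothetical example: strip every occurrence of "hello" from the line
    return line.replace(/hello/g, '');
}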
Source: https://stackoverflow.com/questions/45775480/node-js-modify-file-data-stream