I have a pretty strange problem working with read streams in Node.js. I'm using SSH2 to create an SFTP connection between me and an SFTP server. I then try to create a read stream…
If the byte size (or chunk size) is not mandatory and you just need to get the file, there is a much lighter and faster way (yeah... the Node.js way!). This is how I usually copy a file:
var fs = require('fs');
// `conn` is an ssh2 Client instance; `m_ssh2Credentials` holds your connection settings.

function getFile(remoteFile, localFile) {
  conn.on('ready', function () {
    conn.sftp(function (err, sftp) {
      if (err) throw err;
      var rstream = sftp.createReadStream(remoteFile);
      var wstream = fs.createWriteStream(localFile);
      rstream.pipe(wstream);
      rstream.on('error', function (err) { // Handle remote-file issues
        console.log(err.message);
        conn.end();
        rstream.destroy();
        wstream.destroy();
      });
      rstream.on('end', function () {
        conn.end();
      });
      wstream.on('finish', function () {
        console.log(`${remoteFile} was successfully downloaded to ${localFile}!`);
      });
    });
  }).connect(m_ssh2Credentials);
}
As an alternative, you can also try sftp.fastGet(), which uses parallel reads to fetch the file quickly. fastGet() also gives you a way to show the progress of the download (if wanted), as well as a way to configure the number of parallel reads and the chunk size. To learn more, open the SFTPStream docs and search for fastGet.
Here is a quick example:
sftp.fastGet(remoteFile, localFile, function (err) {
  if (err) throw err;
  console.log(`${remoteFile} was successfully downloaded to ${localFile}!`);
});
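To show progress and tune the parallel reads, fastGet() accepts an options object with `concurrency`, `chunkSize`, and a `step` callback. The sketch below assumes `sftp` comes from conn.sftp() as above; `formatProgress` and `fastGetWithProgress` are just illustrative helper names.

```javascript
// Progress helper: a pure function, easy to reuse and test.
function formatProgress(transferred, total) {
  var pct = total > 0 ? Math.round((transferred / total) * 100) : 0;
  return transferred + '/' + total + ' bytes (' + pct + '%)';
}

// Sketch: `sftp` is assumed to come from conn.sftp() as shown earlier.
function fastGetWithProgress(sftp, remoteFile, localFile) {
  sftp.fastGet(remoteFile, localFile, {
    concurrency: 64,   // number of parallel read requests
    chunkSize: 32768,  // bytes per read request
    step: function (transferred, chunk, total) {
      console.log(formatProgress(transferred, total));
    }
  }, function (err) {
    if (err) throw err;
    console.log(remoteFile + ' was successfully downloaded to ' + localFile + '!');
  });
}
```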
Hope it helps!