I'm using client.upload in pkgcloud to upload a directory of files. What's the best way to execute a callback after all the streams have finished? Is there a built-in way to register each stream's "finish" event and execute a callback after they have all fired?
var filesToUpload = fs.readdirSync("./local_path"); // will make this async
for (let file of filesToUpload) {
  var writeStream = client.upload({
    container: "mycontainer",
    remote: file
  });
  // seems like i should register finish events with something
  writeStream.on("finish", registerThisWithSomething);
  fs.createReadStream("./local_path/" + file).pipe(writeStream);
}
One way to do it is to create a Promise for each upload, then wait on all of them with Promise.all(). Assuming you are using ES6, the code would look something like this:
const uploadTasks = filesToUpload.map((file) => new Promise((resolve, reject) => {
  const writeStream = client.upload({
    container: "mycontainer",
    remote: file
  });
  // settle the promise when this upload's stream finishes or fails
  writeStream.on("finish", resolve);
  writeStream.on("error", reject);
  fs.createReadStream("./local_path/" + file).pipe(writeStream);
}));
Promise.all(uploadTasks)
.then(() => { console.log('All uploads completed.'); });
Alternatively, if you have access to async / await, you can use that. For example:
const uploadFile = (file) => new Promise((resolve, reject) => {
  const writeStream = client.upload({
    container: "mycontainer",
    remote: file
  });
  writeStream.on("finish", resolve);
  writeStream.on("error", reject);
  fs.createReadStream("./local_path/" + file).pipe(writeStream);
});
const uploadFiles = async (files) => {
  for (let file of files) {
    await uploadFile(file);
  }
};
await uploadFiles(filesToUpload);
console.log('All uploads completed.');
Take a look at NodeDir, which has methods like readFilesStream / promiseFiles, etc.
Source: https://stackoverflow.com/questions/37229725/wait-for-all-streams-to-finish-stream-a-directory-of-files