nodejs express fs iterating files into array or object failing

Submitted anonymously (unverified) on 2019-12-03 02:59:02

Question:

So I'm trying to use the Node.js fs module in my Express app to iterate over a directory, store each filename in an array, and pass that array to my Express view to loop over, but I'm struggling to do so. When I do a console.log inside the files.forEach loop, it prints each filename just fine, but as soon as I try to do anything such as:

var myfiles = [];
var fs = require('fs');

fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;
  files.forEach(function (file) {
    myfiles.push(file);
  });
});

console.log(myfiles);

it fails and just logs an empty array. I'm not sure exactly what is going on; I think it has to do with callback functions, but if someone could walk me through what I'm doing wrong, why it's not working, and how to make it work, it would be much appreciated.

Answer 1:

The myfiles array is empty because the readdir callback hasn't run yet by the time you call console.log().

You'll need to do something like:

var fs = require('fs');

fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;
  files.forEach(function (file) {
    // do something with each file HERE!
  });
});
// because trying to do something with files here won't work because
// the callback hasn't fired yet.

Remember, Node is asynchronous: unless you do your processing inside the callback, you cannot guarantee that an asynchronous function has completed by the time the rest of your code runs.
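To make the ordering concrete, here is a minimal sketch (using the './myfiles/' path from the question) that shows which lines actually run first:

var fs = require('fs');

console.log('1: before readdir');

fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;
  console.log('3: inside the callback, got ' + files.length + ' files');
});

console.log('2: after the readdir call'); // this runs before the callback does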

One way around this problem for you would be to use an EventEmitter:

var fs = require('fs'),
    EventEmitter = require('events').EventEmitter,
    filesEE = new EventEmitter(),
    myfiles = [];

// this event will be called when all files have been added to myfiles
filesEE.on('files_ready', function () {
  console.dir(myfiles);
});

// read all files from current directory
fs.readdir('.', function (err, files) {
  if (err) throw err;
  files.forEach(function (file) {
    myfiles.push(file);
  });
  filesEE.emit('files_ready'); // trigger files_ready event
});


Answer 2:

fs.readdir is asynchronous (as are many operations in Node.js). This means that the console.log line is going to run before readdir has had a chance to call the function passed to it.

You need to either:

Put the console.log line within the callback function given to readdir, i.e.:

var fs = require('fs');
var myfiles = [];

fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;
  files.forEach(function (file) {
    myfiles.push(file);
  });
  console.log(myfiles);
});

Or simply perform some action with each file inside the forEach.
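Since the original goal was to hand the list to an Express view, a minimal sketch of that approach could look like the following. The route path '/' and the view name 'files' are hypothetical, and a view engine (e.g. EJS or Pug) is assumed to already be configured:

var express = require('express');
var fs = require('fs');
var app = express();

app.get('/', function (req, res) {
  fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    // files is complete here, so it is safe to hand it to the view
    res.render('files', { myfiles: files }); // 'files' is a hypothetical view name
  });
});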



Answer 3:

As several have mentioned, you are using an async method, so you have a nondeterministic execution path.

However, there is an easy way around this. Simply use the Sync version of the method:

var myfiles = [];
var fs = require('fs');

var arrayOfFiles = fs.readdirSync('./myfiles/');

// Yes, the following is not super-smart, but you might want to process the files. This is how:
arrayOfFiles.forEach(function (file) {
  myfiles.push(file);
});

console.log(myfiles);

That should work as you want. However, synchronous calls block the event loop, so you should avoid them unless it is vitally important that the operation be synchronous.

Read more here: fs.readdirSync



Answer 4:

I think it has to do with callback functions,

Exactly.

fs.readdir makes an asynchronous request to the file system for that information, and calls the callback at some later time with the results.

So function (err, files) { ... } doesn't run immediately, but console.log(myfiles) does.

At some later point in time, myfiles will contain the desired information.

You should note, by the way, that files is already an array, so there is really no point in manually appending each element to another empty array. If the idea is to combine the results of several calls, use .concat; if you just want to get the data once, you can assign myfiles = files directly, as sketched below.
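Here is a rough sketch of those two options, reusing the variable names from the question:

var fs = require('fs');
var myfiles = [];

fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;

  // getting the data once: just use the array you were handed
  myfiles = files;

  // or, when accumulating results across several calls:
  // myfiles = myfiles.concat(files);

  console.log(myfiles);
});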

Overall, you really ought to read up on "Continuation-passing style".



Answer 5:

I faced the same problem, and based on the answers given in this post I solved it with Promises, which seem to be a perfect fit for this situation:

var express = require('express');
var fs = require('fs');
var router = express.Router();

router.get('/', (req, res) => {
  var viewBag = {}; // It's just my little habit from .NET MVC ;)

  var readFiles = new Promise((resolve, reject) => {
    fs.readdir('./myfiles/', (err, files) => {
      if (err) {
        reject(err);
      } else {
        resolve(files);
      }
    });
  });

  // showcase, in case you need more async operations before the route responds
  // (doAsyncStuff is a placeholder for your own async function)
  var anotherPromise = new Promise((resolve, reject) => {
    doAsyncStuff((err, anotherResult) => {
      if (err) {
        reject(err);
      } else {
        resolve(anotherResult);
      }
    });
  });

  Promise.all([readFiles, anotherPromise]).then((values) => {
    viewBag.files = values[0];
    viewBag.otherStuff = values[1];
    console.log(viewBag.files); // logs e.g. [ 'file.txt' ]
    res.render('your_view', viewBag);
  }).catch((errors) => {
    res.render('your_view', { errors: errors }); // use the 'errors' property to render errors in the view, or implement a different error handling scheme
  });
});

Note: you don't have to push the found files into a new array because you already get an array from fs.readdir()'s callback. According to the Node docs:

The callback gets two arguments (err, files) where files is an array of the names of the files in the directory excluding '.' and '..'.

I believe this is a very elegant and handy solution, and most of all, it doesn't require you to bring in and handle any new modules in your script.
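As a side note, and assuming your Node version ships util.promisify (Node 8 and later, which is an assumption about your environment), the same Promise wrapper around fs.readdir can be built from the standard library, which keeps with the spirit of not adding external modules:

var util = require('util');
var fs = require('fs');

// promisified version of fs.readdir; resolves with the files array
var readdirAsync = util.promisify(fs.readdir);

readdirAsync('./myfiles/')
  .then(function (files) {
    console.log(files); // e.g. [ 'file.txt' ]
  })
  .catch(function (err) {
    console.error(err);
  });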


