node and Error: EMFILE, too many open files

终归单人心 2020-11-28 17:58

For some days I have been searching for a working solution to the error

Error: EMFILE, too many open files

It seems that many people have the same problem.
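
For reference, a minimal script of the kind that triggers this (the directory name is hypothetical): it starts a read for every file in a large directory at once, so thousands of descriptors end up open simultaneously.

var fs = require('fs');

// Hypothetical directory containing more files than the per-process
// descriptor limit (often 1024 on Linux, 256 on older macOS).
var dir = '/tmp/many-files';

fs.readdir(dir, function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        // Every read starts immediately; nothing limits how many
        // file descriptors are open at the same time.
        fs.readFile(dir + '/' + file, function (err, data) {
            if (err) throw err;
        });
    });
});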

18 Answers
  • 2020-11-28 18:20

    I am not sure whether this will help anyone, but when I started working on a big project with a lot of dependencies, it threw the same error. My colleague suggested installing watchman using brew, and that fixed the problem for me.

    brew update
    brew install watchman
    

    Edit on 26 June 2019: GitHub link to watchman

  • 2020-11-28 18:21

    Using the graceful-fs module by Isaac Schlueter (Node.js maintainer) is probably the most appropriate solution. It backs off incrementally when EMFILE is encountered, and it can be used as a drop-in replacement for the built-in fs module.
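
    A minimal sketch of that drop-in usage (the filename here is just an example):

    var fs = require('graceful-fs');

    // Same API as the core fs module; operations that would fail with
    // EMFILE are queued and retried with back-off instead of throwing.
    fs.readFile('some-file.txt', function (err, data) {
        if (err) throw err;
        // ...
    });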

  • 2020-11-28 18:21

    I tried everything mentioned above for the same problem, but nothing worked. Then I tried the following, and it worked 100%. Simple config changes.

    Option 1 Set the limit (it won't work most of the time)

    user@ubuntu:~$ ulimit -n 65535
    

    Check the available limit:

    user@ubuntu:~$ ulimit -n
    1024
    

    Option 2 To increase the available limit to, say, 65535:

    user@ubuntu:~$ sudo nano /etc/sysctl.conf
    

    Add the following line to it:

    fs.file-max = 65535
    

    Run this to refresh with the new config:

    user@ubuntu:~$ sudo sysctl -p
    

    Edit the following file:

    user@ubuntu:~$ sudo vim /etc/security/limits.conf
    

    Add the following lines to it:

    root soft     nproc          65535    
    root hard     nproc          65535   
    root soft     nofile         65535   
    root hard     nofile         65535
    

    Edit the following file:

    user@ubuntu:~$ sudo vim /etc/pam.d/common-session
    

    Add this line to it:

    session required pam_limits.so
    

    Log out, log back in, and try the following command:

    user@ubuntu:~$ ulimit -n
    65535
    

    Option 3 Add the line below to both /etc/systemd/system.conf and /etc/systemd/user.conf

    DefaultLimitNOFILE=65535
    

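    These files are only read when systemd starts, so the new default is not applied immediately. A reboot is the safest way to pick it up; re-executing systemd and logging in again should also work:

    user@ubuntu:~$ sudo systemctl daemon-reexec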

  • 2020-11-28 18:23

    cwait is a general solution for limiting concurrent execution of any function that returns a promise.

    In your case the code could be something like:

    var Promise = require('bluebird');
    var cwait = require('cwait');
    
    // Allow max. 10 concurrent file reads.
    var queue = new cwait.TaskQueue(Promise, 10);
    
    // batchingReadFile and files are assumed to come from your own code;
    // any node-style async function (e.g. fs.readFile) can be wrapped the same way.
    var read = queue.wrap(Promise.promisify(batchingReadFile));
    
    Promise.map(files, function(filename) {
        console.log(filename);
        return read(filename);
    });
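
    Note that Promise.map starts every iteration at once, but because read is wrapped by the TaskQueue, each call waits until fewer than 10 reads are in flight, so at most 10 file descriptors are open at any time.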
    
  • 2020-11-28 18:25

    I just finished writing a little snippet of code to solve this problem myself; all of the other solutions appeared way too heavyweight and required changing the program's structure.

    This solution just stalls any fs.readFile or fs.writeFile calls so that there are no more than a set number in flight at any given time.

    // Queuing reads and writes, so your Node.js script doesn't overwhelm system limits catastrophically
    var fs = require('fs');
    
    global.maxFilesInFlight = 100; // Set this value to some number that is safe-ish for your system
    var origRead = fs.readFile;
    var origWrite = fs.writeFile;
    
    var activeCount = 0;
    var pending = [];
    
    var wrapCallback = function(cb){
        return function(){
            activeCount--;
            cb.apply(this,Array.prototype.slice.call(arguments));
            if (activeCount < global.maxFilesInFlight && pending.length){
                console.log("Processing Pending read/write");
                pending.shift()();
            }
        };
    };
    fs.readFile = function(){
        var args = Array.prototype.slice.call(arguments);
        if (activeCount < global.maxFilesInFlight){
            if (args[1] instanceof Function){
                args[1] = wrapCallback(args[1]);
            } else if (args[2] instanceof Function) {
                args[2] = wrapCallback(args[2]);
            }
            activeCount++;
            origRead.apply(fs,args);
        } else {
            console.log("Delaying read:",args[0]);
            pending.push(function(){
                fs.readFile.apply(fs,args);
            });
        }
    };
    
    fs.writeFile = function(){
        var args = Array.prototype.slice.call(arguments);
        if (activeCount < global.maxFilesInFlight){
            if (args[1] instanceof Function){
                args[1] = wrapCallback(args[1]);
            } else if (args[2] instanceof Function) {
                args[2] = wrapCallback(args[2]);
            }
            activeCount++;
            origWrite.apply(fs,args);
        } else {
            console.log("Delaying write:",args[0]);
            pending.push(function(){
                fs.writeFile.apply(fs,args);
            });
        }
    };
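
    One caveat (not from the original snippet): because this patches the shared fs module in place, it must run before any other module grabs a direct reference to fs.readFile or fs.writeFile. For example, with the snippet saved as a hypothetical fs-throttle.js:

    // Load the patch first, then everything else.
    require('./fs-throttle'); // hypothetical filename for the snippet above
    var app = require('./app');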
    
  • 2020-11-28 18:27

    I ran into this problem today, and finding no good solutions for it, I created a module to address it. I was inspired by @fbartho's snippet, but wanted to avoid overwriting the fs module.

    The module I wrote is Filequeue, and you use it just like fs:

    var Filequeue = require('filequeue');
    var fq = new Filequeue(200); // max number of files to open at once
    
    fq.readdir('/Users/xaver/Downloads/xaver/xxx/xxx/', function(err, files) {
        if(err) {
            throw err;
        }
        files.forEach(function(file) {
            fq.readFile('/Users/xaver/Downloads/xaver/xxx/xxx/' + file, function(err, data) {
                // do something here
            });
        });
    });
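
    Assuming the module is published under the same name on npm, it installs the usual way:

    npm install filequeue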
    