Question
I am trying to run PHPUnit tests on my new machine, and I get this error:
PHP Fatal error: Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct(/usr/lib/php/pear/File/Iterator): failed to open dir: Too many open files' in /usr/lib/php/pear/File/Iterator/Factory.php:114
The same code runs fine on the old machine...
New machine: PHP 5.3.21 (cli). Old machine: PHP 5.3.14.
PHPUnit outputs this every time:
................EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE 65 / 66 ( 98%)
E
Time: 34 seconds, Memory: 438.50Mb
There were 50 errors:
1) XXXXXXXXXXX
PHP Fatal error: Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct(/usr/lib/php/pear/File/Iterator): failed to open dir: Too many open files' in /usr/lib/php/pear/File/Iterator/Factory.php:114
Answer 1:
This can be a limitation of the server the code is running on. Every operating system allows only a certain number of open files/handles/sockets, and this limit is usually reduced further when the server is virtualized. On a Linux server you can check the current limit with ulimit -n; if you have root access, you can increase it with the same command. There should be an equivalent for Windows Server as well. Otherwise, there is not much you can do about it (except asking your hoster or administrator to increase the limit).
More configurable limits:

In /etc/security/limits.conf (the leading * applies the limit to all users):
* soft nofile 1024
* hard nofile 65535

Raise the limit for the current shell: ulimit -n 65535

Raise the system-wide limit: echo 65535 > /proc/sys/fs/file-max

Or make it persistent in /etc/sysctl.conf:
fs.file-max=65535
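The checks above can be run as one sequence (a sketch for Linux; a non-root user can raise the soft limit only up to the hard limit):

```shell
# Show the current soft and hard open-file limits for this shell:
ulimit -Sn
ulimit -Hn
# Raise the soft limit as far as the hard limit allows
# (going beyond the hard limit requires root):
ulimit -n "$(ulimit -Hn)"
ulimit -Sn
# Show the system-wide ceiling on open file handles:
cat /proc/sys/fs/file-max
```

Note that ulimit changes apply only to the current shell and its children, which is why phpunit must be started from the same session.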
Answer 2:
How to raise the open-file limit (Linux or Mac OS):
ulimit -n 10000
This solves the problem with phpunit and/or phpdbg, and with warnings like:
Warning: Uncaught ErrorException: require([..file]): failed to open stream: Too many open files in [...]
Answer 3:
In PHP, before execution, try this:
exec('ulimit -S -n 2048');
Answer 4:
Don't store DirectoryIterator objects for later; you will get an error saying "too many open files" when you store more than the operating system limit (usually 256 or 1024).
For example, this will yield an error if the directory has too many files:
<?php
// Anti-pattern: every stored element keeps a directory handle open,
// so this fails once the per-process open-file limit is reached.
$files = array();
foreach (new DirectoryIterator('myDir') as $file) {
    $files[] = $file;
}
// Store $file->getFilename() (a plain string) instead.
?>
Presumably, this approach is memory intensive as well.
source: http://php.net/manual/pt_BR/directoryiterator.construct.php#87425
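To see how close a process is to the limit, you can count its open descriptors via /proc (a Linux-only sketch; $$ below is the current shell's PID, standing in for the PID of your PHP process):

```shell
# Count the file descriptors the process currently has open:
ls /proc/$$/fd | wc -l
# Show the per-process ceiling that triggers "Too many open files":
grep 'Max open files' /proc/$$/limits
```

Running the count repeatedly while the script executes shows whether descriptors are leaking rather than being released.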
Answer 5:
After 'waking' my computer from sleep mode, I ran into this problem.
Restarting php-fpm fixed it; the classic turn-it-off-and-on-again solution:
sudo /etc/init.d/php-fpm restart
I think this may be related to Xdebug, which I recently added to PHP.
Answer 6:
Maybe you have some error with the file /etc/init.d/phpx.x-fpm. Let's restart it:
sudo /etc/init.d/php7.2-fpm restart
Answer 7:
On a Debian server you can also edit /etc/php/php7.xx/fpm/pool.d/www.conf and raise the per-worker limit:
rlimit_files = 10000
Then restart PHP-FPM:
/etc/init.d/php7.xx-fpm restart
Source: https://stackoverflow.com/questions/14748499/fatal-error-too-many-open-files