My PHP script, running on CentOS 5.6 with PHP 5.2.12 and using ZipArchive, successfully creates .zip files over 1.6 GB, but not archives of 2 GB or larger - PHP aborts with no apparent error. Nothing appears in the PHP error log or on stderr. The script is executed at the command line, not interactively.
The script runs for about 8 minutes while the temp archive grows. The last time I checked the file size, the tmp file was 2120011776 bytes; then the tmp file disappears, and the PHP script falls through the logic and executes the code after the archive creation.
For some reason top shows the CPU still at 95% and a new tmp archive file being created - this goes on for another 5+ minutes, then it silently stops and leaves the uncompleted tmp archive file. In this test there were fewer than 4000 expected files.
As noted, the script works just fine creating smaller archive files.
I have tested several different sets of large source data - same result for large files.
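For context, this is not the actual script, but a minimal sketch of the kind of ZipArchive usage involved (paths and the $sourceFiles list are hypothetical), with every return value checked so a failure would not be silent:

```php
<?php
// Hypothetical sketch, not the real script: build an archive from a list of
// files and check each return value so a silent failure becomes visible.
$zip = new ZipArchive();
$result = $zip->open('/tmp/backup.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);
if ($result !== true) {
    die("open() failed with code $result\n");
}

foreach ($sourceFiles as $path) {          // $sourceFiles: assumed array of absolute paths
    if (!$zip->addFile($path, basename($path))) {
        echo "addFile() failed for $path\n";
    }
}

// close() is where the .tmp file is finalized; on very large archives this is
// the point where the script would appear to "fall through" if it fails.
if (!$zip->close()) {
    die("close() failed - archive was not written\n");
}
echo "Archive written successfully\n";
```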
This issue sounds similar to this question: Size limit on PHP's zipArchive class?
I thought maybe the ls -l command was returning a count of 2K blocks, which would put 2120011776 close to 4 GB, but that size is in bytes - the size of the xxxx.zip.tmpxx file.
Thanks!
It could be many things. I'm assuming that you have enough free disk space to handle the process. As others have mentioned, some of the likely problems can be fixed either by editing your php.ini file or by using the ini_set() function in the code itself.
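For example (a sketch only; the values are arbitrary rather than taken from the question), the relevant settings can be raised at the top of the CLI script, and error output turned on so the abort is less silent:

```php
<?php
// Sketch: raise the limits commonly involved and surface any errors.
error_reporting(E_ALL);
ini_set('display_errors', '1');
ini_set('memory_limit', '-1');   // -1 = no memory limit (use with care)
set_time_limit(0);               // no execution time limit
```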
How much memory does your machine have? If the process exhausts your actual memory, then it makes sense that it would consistently abort after a certain size. So check the free memory before running the script and monitor it as the script executes.
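A rough way to monitor this from inside the script (a sketch; the target path and the $sourceFiles list are assumed, not taken from the question):

```php
<?php
// Sketch: log memory usage periodically while the archive is being built.
$zip = new ZipArchive();
$zip->open('/tmp/backup.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

$count = 0;
foreach ($sourceFiles as $path) {   // $sourceFiles: assumed array of absolute paths
    $zip->addFile($path, basename($path));
    if (++$count % 500 === 0) {
        printf("%d files, current: %d MB, peak: %d MB\n",
            $count,
            memory_get_usage(true) / 1048576,
            memory_get_peak_usage(true) / 1048576);
    }
}
$zip->close();
```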
A third option could be the file system itself. I don't have much experience with CentOS, but some file systems do not allow files over 2 GB, although from the product page it seems that most file systems available on CentOS can handle larger files.
A fourth option, which seems to be the most promising: if you look at the product page linked above, another possible culprit is the "Maximum x86 per-process virtual address space," which is approximately 3 GB. On x86_64 it is about 2 TB, so check which type of processor you are running on.
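You can check this from PHP itself; on a 32-bit build PHP_INT_SIZE is 4, and php_uname('m') reports the machine type (a quick sketch):

```php
<?php
// Sketch: detect whether the PHP build and the kernel are 32-bit or 64-bit.
echo 'PHP build:    ', (PHP_INT_SIZE === 8 ? '64-bit' : '32-bit'), "\n";
echo 'Machine type: ', php_uname('m'), "\n";   // e.g. i686 vs x86_64
```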
Again, it seems like the fourth option is the culprit.
Have you set the limit variables in PHP?
You can do this either in .htaccess or within the PHP script. Inside the script: set_time_limit(0); In .htaccess: php_value memory_limit 214572800
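Note that .htaccess directives only apply when PHP runs under Apache; since the question's script runs from the command line, the equivalents would need to go in the script itself (or in php.ini). A sketch of the in-script form, reusing the value above:

```php
<?php
// Sketch: in-script equivalents of the .htaccess directive mentioned above.
set_time_limit(0);                      // remove the execution time limit
ini_set('memory_limit', '214572800');   // same value as the .htaccess example, in bytes
```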
When your files are large it will take time to build the ZIP archive, but PHP has a maximum execution time (in php.ini), so you should try increasing that value.
There is a max_execution_time setting in php.ini;
perhaps that limit is being hit!
Try increasing the value!
There are also different file size limits depending on the OS, so check those too!
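To see which limits are actually in effect (a sketch; ulimit -f reports the shell's file-size limit, usually in 1024-byte blocks or "unlimited"):

```php
<?php
// Sketch: print the PHP limits and the OS file-size limit for the current user.
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";
echo 'memory_limit:       ', ini_get('memory_limit'), "\n";
echo 'ulimit -f:          ', trim(shell_exec('ulimit -f')), "\n";
```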
Source: https://stackoverflow.com/questions/5745255/php-aborting-when-creating-large-zip-file