fwrite() more than 2 GiB? [duplicate]

Submitted by 巧了我就是萌 on 2020-01-01 03:21:10

Question


I have a set of files that I want to concatenate (each represents a part from a multi-part download).

Each split file is about 250 MiB in size, and I have a variable number of them.

My concatenation logic is straightforward:

if (is_resource($handle = fopen($output, 'xb')) === true)
{
    foreach ($parts as $part)
    {
        // Open each part read-only in binary mode.
        if (is_resource($part = fopen($part, 'rb')) === true)
        {
            // Copy the part into the output file in 4 KiB chunks.
            while (feof($part) !== true)
            {
                fwrite($handle, fread($part, 4096));
            }

            fclose($part);
        }
    }

    fclose($handle);
}

It took me a while to track down but, apparently, whenever I have more than 8 individual parts (totaling more than 2 GiB), my output file gets truncated to 2147483647 bytes (as reported by sprintf('%u', filesize($output))).

I suppose this is due to some kind of 32-bit internal counter used by fopen() or fwrite().
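One way to check whether a 32-bit build is the culprit is to inspect the built-in constants PHP_INT_MAX and PHP_INT_SIZE (a small diagnostic sketch):

```php
<?php
// On a 32-bit PHP build, signed integers top out at 2147483647 (2 GiB - 1),
// which is exactly the size the output file gets truncated to.
var_dump(PHP_INT_MAX);  // 32-bit: int(2147483647); 64-bit: int(9223372036854775807)
var_dump(PHP_INT_SIZE); // 4 on a 32-bit build, 8 on a 64-bit build

if (PHP_INT_SIZE < 8) {
    echo "32-bit build: file offsets past 2 GiB cannot be represented\n";
}
```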

How can I work around this problem (preferably using only PHP)?


Answer 1:


As a workaround, you could use the shell. If the code must be portable, only about two variants are needed: one for Windows and one for Linux (which also covers macOS).

Linux

cat file1.txt file2.txt > file.txt

Windows

copy /b file1.txt+file2.txt file.txt

Note that when building the command line, escaping the variable arguments is essential. Wrap each filename with escapeshellarg() (see http://de1.php.net/escapeshellarg).

To detect whether you are on Windows or Linux, check the PHP_OS constant (explained at http://www.php.net/manual/en/function.php-uname.php).
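Putting the pieces together, a minimal sketch of this workaround might look as follows (concat_via_shell, $parts, and $output are illustrative names, not from any real API):

```php
<?php
// Sketch: concatenate file parts by delegating to the platform's shell,
// which avoids PHP's 32-bit integer limits entirely.
function concat_via_shell(array $parts, string $output): void
{
    // Escape every filename before it reaches the shell.
    $escapedParts  = array_map('escapeshellarg', $parts);
    $escapedOutput = escapeshellarg($output);

    if (stripos(PHP_OS, 'WIN') === 0) {
        // Windows: /b forces binary mode so copy does not stop at EOF markers.
        $cmd = 'copy /b ' . implode('+', $escapedParts) . ' ' . $escapedOutput;
    } else {
        // Linux / macOS
        $cmd = 'cat ' . implode(' ', $escapedParts) . ' > ' . $escapedOutput;
    }

    shell_exec($cmd);
}
```

Note that the prefix check stripos(PHP_OS, 'WIN') === 0 matches 'WINNT' and 'Windows' but not 'Darwin' (which contains 'win' at position 3), so macOS correctly falls through to the cat branch.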



Source: https://stackoverflow.com/questions/19438203/fwrite-more-than-2-gib
