Downloading big files and writing them locally

长发绾君心 2021-02-11 07:43

What is the best way to download large files with PHP without consuming all of the server's memory?

I could do this (bad code):

    $url = 'http://server/bigfile';
    file_put_contents('bigfile', file_get_contents($url)); // reads the entire file into memory first

2 Answers
  • 2021-02-11 08:18

    Here is a function I use when downloading large files. It avoids loading the entire file into memory; instead, it writes to the destination as the bytes arrive.

    function download($file_source, $file_target)
    {
        $rh = fopen($file_source, 'rb');
        $wh = fopen($file_target, 'wb');
        if (!$rh || !$wh) {
            // close whichever handle did open before bailing out
            if ($rh) fclose($rh);
            if ($wh) fclose($wh);
            return false;
        }

        while (!feof($rh)) {
            // copy in 8 KB chunks so memory use stays constant
            if (fwrite($wh, fread($rh, 8192)) === false) {
                fclose($rh);
                fclose($wh);
                return false;
            }
        }

        fclose($rh);
        fclose($wh);

        return true;
    }
    
  • 2021-02-11 08:20

    You can use cURL with the CURLOPT_FILE option to write the downloaded content directly to a file as it arrives, instead of buffering it in memory.

    set_time_limit(0); // allow the script to run as long as the download takes
    $fp = fopen('file', 'w+b');
    $ch = curl_init('http://remote_url/file');
    curl_setopt($ch, CURLOPT_TIMEOUT, 75);
    curl_setopt($ch, CURLOPT_FILE, $fp); // stream the response body straight to $fp
    $ok = curl_exec($ch);                // false on failure; check before trusting the file
    curl_close($ch);
    fclose($fp);
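    A third streaming option in plain PHP is `stream_copy_to_stream()`, which copies between two open streams in chunks so the whole payload never sits in memory at once. The sketch below uses a local in-memory stream as a stand-in for the remote source; assuming `allow_url_fopen` is enabled, `fopen('http://...', 'rb')` would yield a readable stream the same way.

```php
// Stand-in source: an in-memory stream holding 1 MB of sample data.
$src = fopen('php://temp', 'w+b');
fwrite($src, str_repeat('x', 1024 * 1024));
rewind($src);

// Destination stream (a real file handle works identically).
$dst = fopen('php://temp', 'w+b');

// Chunked copy; returns the number of bytes written.
$copied = stream_copy_to_stream($src, $dst);

fclose($src);
fclose($dst);
```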
    