Serving large files with PHP

北海茫月 2020-12-13 07:51

So I am trying to serve large files via a PHP script. They are not in a web-accessible directory, so this is the best way I can figure to provide access to them.


9 Answers
  • 2020-12-13 08:08

    You don't need to read the whole thing into memory - just loop, reading it in, say, 32 KB chunks, and sending each chunk as output. Better yet, use fpassthru(), which does much the same thing for you.
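    A minimal sketch of the chunked loop (32 KB is an arbitrary size):

    $name = 'mybigfile.zip';
    $fp = fopen($name, 'rb');

    // send the right headers
    header("Content-Type: application/zip");
    header("Content-Length: " . filesize($name));

    // stream the file in 32 KB chunks instead of reading it all at once
    while (!feof($fp)) {
        echo fread($fp, 32 * 1024);
        flush(); // push each chunk out to the client
    }
    fclose($fp);
    exit;

    And the fpassthru() version: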

    $name = 'mybigfile.zip';
    $fp = fopen($name, 'rb');
    
    // send the right headers
    header("Content-Type: application/zip");
    header("Content-Length: " . filesize($name));
    
    // dump the file and stop the script
    fpassthru($fp);
    exit;
    

    Even fewer lines if you use readfile(), which doesn't need the fopen() call:

    $name = 'mybigfile.zip';
    
    // send the right headers
    header("Content-Type: application/zip");
    header("Content-Length: " . filesize($name));
    
    // dump the file and stop the script
    readfile($name);
    exit;
    

    If you want to get even cuter, you can support the Range request header, which lets clients ask for a particular byte range of your file (the server answers with 206 Partial Content and a Content-Range header). This is particularly useful for serving PDF files to Adobe Acrobat, which requests just the chunks of the file it needs to render the current page. It's a bit involved, but a minimal sketch follows.
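    The sketch assumes a well-formed single-range request ("Range: bytes=start-end"); production code would need to validate more (multiple ranges, out-of-bounds offsets):

    $name = 'mybigfile.zip';
    $size = filesize($name);
    $start = 0;
    $end = $size - 1;

    // honour a simple "Range: bytes=start-end" request header
    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        $start = (int)$m[1];
        if ($m[2] !== '') {
            $end = (int)$m[2];
        }
        header('HTTP/1.1 206 Partial Content');
        header("Content-Range: bytes $start-$end/$size");
    }

    header('Accept-Ranges: bytes');
    header('Content-Type: application/zip');
    header('Content-Length: ' . ($end - $start + 1));

    // seek to the requested offset and stream just that slice
    $fp = fopen($name, 'rb');
    fseek($fp, $start);
    $remaining = $end - $start + 1;
    while ($remaining > 0 && !feof($fp)) {
        $chunk = fread($fp, min(32 * 1024, $remaining));
        echo $chunk;
        $remaining -= strlen($chunk);
    }
    fclose($fp);
    exit;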

  • 2020-12-13 08:11

    While fpassthru() has been my first choice in the past, the PHP manual actually recommends* using readfile() instead, if you are just dumping the file as-is to the client.

    * "If you just want to dump the contents of a file to the output buffer, without first modifying it or seeking to a particular offset, you may want to use the readfile(), which saves you the fopen() call." —PHP manual

  • 2020-12-13 08:11

    The PHP answers are all good. But is there any reason you can't make a web-accessible directory containing symbolic links to the actual files? It may take some extra server configuration, but it ought to work.
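    For instance, a sketch using PHP's symlink() (the paths here are hypothetical, and the server must be configured to follow symlinks, e.g. Options +FollowSymLinks in Apache):

    // hypothetical paths: expose a private file under the web root
    // without copying it; the web server must follow symlinks
    symlink('/var/private/files/mybigfile.zip',
            '/var/www/html/downloads/mybigfile.zip');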

  • 2020-12-13 08:13

    Have a look at fpassthru(). In more recent versions of PHP this should serve the files without keeping them in memory, as this comment states.

  • 2020-12-13 08:15

    One benefit of fpassthru() is that it works not only with files but with any valid handle, a socket for example.

    readfile() should also be a little faster, since it can take advantage of the OS caching mechanism where possible (as file_get_contents() does).

    One more tip: fpassthru() holds the handle open until the client has received the content (which may take quite a long time on a slow connection), so you must use some locking mechanism if parallel writes to the file are possible; a sketch follows.
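    One such approach uses PHP's advisory flock() (it only helps if writers also take an exclusive lock on the same file):

    $name = 'mybigfile.zip';
    $fp = fopen($name, 'rb');

    // take a shared lock while streaming, so a writer holding an
    // exclusive (LOCK_EX) lock cannot modify the file mid-transfer
    if (flock($fp, LOCK_SH)) {
        header('Content-Type: application/zip');
        header('Content-Length: ' . filesize($name));
        fpassthru($fp);
        flock($fp, LOCK_UN);
    }
    fclose($fp);
    exit;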

  • 2020-12-13 08:16

    The best way to send big files with PHP is the X-Sendfile header. It lets the web server serve the file directly, much faster, through zero-copy mechanisms like sendfile(2). It is supported by lighttpd, and by Apache with a plugin (mod_xsendfile).

    Example:

    $file = "/absolute/path/to/file"; // can be protected by .htaccess
    header('X-Sendfile: '.$file);
    header('Content-type: application/octet-stream');
    header('Content-Disposition: attachment; filename="'.basename($file).'"');
    // other headers ...
    exit;
    

    The web server reads the X-Sendfile header and sends out the file itself.
