How to force download of big files without using too much memory?

清歌不尽 2021-02-15 06:33

I'm trying to serve large zip files to users. When there are 2 concurrent connections, the server runs out of memory (RAM). I increased the amount of memory from 300MB to 4GB (

6 Answers
  • 2021-02-15 07:00

    There are some ideas over in this thread. I don't know if the readfile() method will save memory, but it sounds promising.
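    A minimal sketch of that approach, assuming a hypothetical path /var/files/archive.zip; readfile() streams the file to the client in internal chunks rather than building one large string in PHP:

        <?php
        // Hypothetical path, used for illustration only.
        $path = '/var/files/archive.zip';

        header('Content-Type: application/zip');
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
        header('Content-Length: ' . filesize($path));

        // Drop any output buffers so the data is not accumulated in memory.
        while (ob_get_level() > 0) {
            ob_end_clean();
        }

        readfile($path);
        exit;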

  • 2021-02-15 07:00

    You're sending the contents ($data) of this file via PHP?

    If so, each Apache/PHP process handling a download will grow to roughly the size of that file, because the whole payload is held in memory.

    Your only real solution is to stop sending file contents through PHP and instead redirect the user to a URL that the web server serves directly from the filesystem.

    Use a generated and unique symlink, or a hidden location.
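    A rough sketch of the symlink-and-redirect idea; the paths are assumptions for illustration. The real file lives outside the web root, a randomly named symlink is created in a public directory, and the client is redirected there so Apache transfers the bytes (a cleanup job would remove stale links later):

        <?php
        $source  = '/data/private/archive.zip';   // real file, outside the web root (assumed path)
        $token   = bin2hex(random_bytes(16));     // unguessable link name
        $linkDir = __DIR__ . '/downloads';        // publicly reachable directory (assumed)
        $link    = $linkDir . '/' . $token . '.zip';

        if (!is_dir($linkDir)) {
            mkdir($linkDir, 0755, true);
        }
        symlink($source, $link);

        // Let the web server handle the actual transfer (including Range requests).
        header('Location: /downloads/' . $token . '.zip');
        exit;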

  • 2021-02-15 07:01

    You can't pass the whole file's contents in $data. Pass the function the file's path instead of its contents. Send all of the headers once, then read a chunk of the file with fread(), echo that chunk, call flush(), and repeat until the end of the file. If any other header is sent in the meantime, the transfer will end up corrupted.
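    A minimal sketch of that loop; the path and chunk size are illustrative assumptions:

        <?php
        // Stream a file to the client in small chunks so memory use stays flat.
        function streamFile(string $path): void
        {
            header('Content-Type: application/octet-stream');
            header('Content-Disposition: attachment; filename="' . basename($path) . '"');
            header('Content-Length: ' . filesize($path));

            $handle = fopen($path, 'rb');
            while (!feof($handle)) {
                echo fread($handle, 8192); // one chunk
                flush();                   // push it to the client before reading the next one
            }
            fclose($handle);
        }

        streamFile('/var/files/archive.zip'); // hypothetical path
        exit;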

  • 2021-02-15 07:03

    Add your ini_set() call before session_start();
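    For example (the limit value is only an illustration):

        <?php
        ini_set('memory_limit', '512M'); // must run before session_start() and before any output
        session_start();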

  • 2021-02-15 07:11

    I tried many scripts and suggestions and nothing worked for my 400MB PDF file. I ended up using mod_rewrite, and this solution works great: https://unix.stackexchange.com/questions/88874/control-apache-referrer-to-restrict-downloads-in-htaccess-file. The rules only allow downloads from a referrer you specify and forbid direct downloads:

        RewriteEngine On
        RewriteCond %{HTTP_REFERER} !^http://yourdomain.com/.* [NC]
        RewriteRule .* - [F]
    
  • 2021-02-15 07:16

    Symlink the big file into your document root (assuming it's not a file that requires authorization), then let Apache handle it. That way you get byte-range support as well.
