I'm trying to serve large zip files to users. When there are 2 concurrent connections, the server runs out of memory (RAM). I increased the amount of memory from 300MB to 4GB.
There are some ideas over in this thread. I don't know if the readfile() method will save memory, but it sounds promising.
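For reference, here is a minimal sketch of the readfile() approach, assuming a plain Apache/PHP setup and an assumed archive path; the key point is clearing any output buffers so PHP streams the file instead of holding it in memory:

<?php
$path = '/path/to/archive.zip';  // assumed location of the archive

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));

// Drop any output buffers so data is passed straight through, not accumulated.
while (ob_get_level() > 0) {
    ob_end_clean();
}

readfile($path);
exit;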
You're sending the contents ($data) of this file via PHP?
If so, each Apache process handling this will end up growing to the size of this file, as that data will be cached.
Your ONLY solution is to not send the file contents through PHP at all, and instead redirect the user to a download URL that maps directly to the file on the filesystem.
Use a generated and unique symlink, or a hidden location.
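A rough sketch of the symlink-and-redirect idea; the paths and the random token are assumptions for illustration only:

<?php
$source   = '/data/private/archive.zip';   // real file outside the web root (assumed path)
$linkDir  = '/var/www/html/dl';            // publicly served directory (assumed path)
$token    = bin2hex(random_bytes(16));     // hard-to-guess link name
$linkPath = $linkDir . '/' . $token . '.zip';

if (!symlink($source, $linkPath)) {
    http_response_code(500);
    exit('Could not create download link');
}

// Apache now serves the bytes (and Range requests); clean up stale links later, e.g. via cron.
header('Location: /dl/' . $token . '.zip');
exit;

Note that Apache needs Options FollowSymLinks enabled for the download directory for this to work.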
You can't use $data with the whole file's contents inside it. Pass the function the file's path, not its contents. Send all headers once, then read part of the file with fread(), echo that chunk, call flush(), and repeat. If any other header is sent in the meantime, the transfer will end up corrupted.
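Something along these lines, with the chunk size and file path as assumptions:

<?php
function streamFile($path)
{
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    header('Content-Length: ' . filesize($path));

    $fp = fopen($path, 'rb');
    if ($fp === false) {
        http_response_code(500);
        return;
    }

    // Read and emit 8 KB at a time so memory use stays flat regardless of file size.
    while (!feof($fp)) {
        echo fread($fp, 8192);
        flush();   // push the chunk out to the client
    }
    fclose($fp);
}

streamFile('/path/to/archive.zip');
exit;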
Add your ini_set() call before session_start();
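For example (the 512M value is just an assumption; adjust it to your needs):

<?php
ini_set('memory_limit', '512M');  // must come before the session is started
session_start();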
I searched through too many scripts and pieces of advice, and nothing worked for my 400MB PDF file. I ended up using mod_rewrite, and here's the solution; it works great: https://unix.stackexchange.com/questions/88874/control-apache-referrer-to-restrict-downloads-in-htaccess-file The code only allows downloads from a referrer you specify and forbids direct downloads:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://yourdomain.com/.* [NC]
RewriteRule .* - [F]
Symlink the big file into your document root (assuming it's not a file that requires authorization), then let Apache handle it. (That way byte ranges are supported as well.)