So I am trying to serve large files via a PHP script. They are not in a web-accessible directory, so this is the best way I can figure to provide access to them.
If you want to do it right, PHP alone can't do it. You would want to serve the file using Nginx's X-Accel-Redirect (recommended) or Apache's X-Sendfile, both of which are built exactly for this purpose.
I will include in this answer some text found in this article.
NGINX handles all of these things properly. So let's handle the permission checks in the application and let NGINX serve the actual file. This is where internal redirects come in. The idea is simple: you configure a location entry as you would for serving regular files, but mark it internal so clients cannot request it directly.
Add this to your nginx server block:
location /protected_files/ {
    internal;
    alias /var/www/my_folder_with_protected_files/;
}
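If you are on Apache instead, the rough equivalent is mod_xsendfile (a sketch, assuming the module is installed and enabled; both directives below are standard mod_xsendfile ones):

XSendFile on
XSendFilePath /var/www/my_folder_with_protected_files

With Apache, the PHP side then sets an X-Sendfile header containing the absolute file path rather than an internal URI.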
In your project, require the HTTP Foundation package:
composer require symfony/http-foundation
Serve the file from PHP via Nginx:
use Symfony\Component\HttpFoundation\BinaryFileResponse;
$real_path = '/var/www/my_folder_with_protected_files/foo.pdf';
$x_accel_redirect_path = '/protected_files/foo.pdf';
BinaryFileResponse::trustXSendfileTypeHeader();
$response = new BinaryFileResponse( $real_path );
$response->headers->set( 'X-Accel-Redirect', $x_accel_redirect_path );
$response->sendHeaders();
exit;
This should be the basics you need to get started.
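For what it's worth, the underlying mechanism is just a response header, so a Symfony-free version can be this small (a minimal sketch, assuming the same /protected_files/ location as above):

// nginx intercepts this header, performs an internal redirect,
// and serves the file from the aliased directory itself
header( 'X-Accel-Redirect: /protected_files/foo.pdf' );
header( 'Content-Type: application/pdf' );
exit;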
Here's a more complete example serving an Inline PDF:
use Symfony\Component\HttpFoundation\BinaryFileResponse;
use Symfony\Component\HttpFoundation\File\File;
use Symfony\Component\HttpFoundation\ResponseHeaderBag;
$real_path = '/var/www/my_folder_with_protected_files/foo.pdf';
$x_accel_redirect_path = '/protected_files/foo.pdf';
$file = new File( $real_path );
BinaryFileResponse::trustXSendfileTypeHeader();
$response = new BinaryFileResponse( $real_path );
$response->setImmutable( true );
$response->setPublic();
$response->setAutoEtag();
$response->setAutoLastModified();
$response->headers->set( 'Content-Type', 'application/pdf' );
$response->headers->set( 'Content-Length', $file->getSize() );
$response->headers->set( 'X-Sendfile-Type', 'X-Accel-Redirect' );
$response->headers->set( 'X-Accel-Redirect', $x_accel_redirect_path );
$response->headers->set( 'X-Accel-Expires', 60 * 60 * 24 * 90 ); // 90 days
$response->headers->set( 'X-Accel-Limit-Rate', 10485760 ); // 10 MB/s
$response->headers->set( 'X-Accel-Buffering', 'yes' );
$response->setContentDisposition( ResponseHeaderBag::DISPOSITION_INLINE, basename( $real_path ) ); // view in browser; change to DISPOSITION_ATTACHMENT to force a download
$response->sendHeaders();
exit;
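One gotcha worth mentioning: as far as I know, nginx treats the X-Accel-Redirect value as a URI and decodes it, so filenames containing spaces or non-ASCII characters should be URL-encoded. A sketch, encoding only the filename segment (rawurlencode() on the full path would escape the slashes too):

// encode just the basename so the /protected_files/ prefix stays intact
$x_accel_redirect_path = '/protected_files/' . rawurlencode( basename( $real_path ) );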
Strange, neither fpassthru() nor readfile() did it for me; I always got a memory error. I resorted to using passthru() without the 'f':
$name = 'mybigfile.zip';
// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));
// dump the file and stop the script
passthru('/bin/cat ' . escapeshellarg($name));
exit;
This executes the Unix 'cat' command and sends its output to the browser.
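If shelling out to cat feels wrong, a chunked fread() loop usually avoids the memory problem as well (a sketch; memory errors from readfile() are often caused by PHP output buffering holding the whole file):

$name = 'mybigfile.zip';
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));
// read and flush in small chunks so memory use stays flat
$fp = fopen($name, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);
    flush();
}
fclose($fp);
exit;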
Comment for slim: the reason you don't just put a symlink somewhere in webspace is SECURITY.
If your files are not accessible by the web server because the path is not in your web serving directory (htdocs), then you can make a symbolic link (symlink) to that folder in your web serving directory, to avoid passing all the traffic through PHP.
You can do something like this:
ln -s /home/files/big_files_folder /home/www/htdocs
Using PHP for serving static files is a lot slower; if you have high traffic, memory consumption will be very large, and it may not handle a big number of requests.