Streaming Amazon S3 Objects From a Web Server Using Laravel


From the discussion in the comments, I have arrived at some key points that I would like to share.

Pre-Signed URLs

As @ceejayoz pointed out, pre-signed URLs are not a bad idea because:

  1. I can keep the expiry time as low as 10 seconds, which is enough for any redirects and to start the download, but not enough for the link to be shared.
  2. My previous understanding was that the download has to finish within the given time, so if the link expires in 10 seconds, the download has to complete before that. But @ceejayoz pointed out that is not the case: a download that has already started is allowed to finish.
  3. With CloudFront, I can also restrict by IP address to add more security.
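
For reference, a sketch of generating such a short-lived pre-signed URL with the same AWS SDK (the bucket name and `$remoteFile` are placeholders, not values from my app):

```php
// Sketch only: generate a short-lived pre-signed URL with the AWS SDK
// for PHP v3. 'mybucket' and $remoteFile are placeholders.
$client = AWS::createClient('s3');

$command = $client->getCommand('GetObject', array(
    'Bucket' => 'mybucket',
    'Key'    => $remoteFile
));

// The URL expires 10 seconds after signing; a download that has
// already started is still allowed to finish after expiry.
$request = $client->createPresignedRequest($command, '+10 seconds');
$url = (string) $request->getUri();
```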


IAM Roles

He also pointed out another, not-so-great method: creating temporary IAM users. This is a maintenance nightmare if not done correctly, so only do it if you know what you are doing.
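
If you do go this route anyway, note that AWS STS can issue self-expiring temporary credentials, which avoids most of the cleanup problem of real IAM users. A sketch (the role ARN below is a made-up placeholder):

```php
// Sketch only: short-lived credentials via STS AssumeRole instead of
// creating permanent IAM users. The role ARN is a made-up placeholder.
$sts = AWS::createClient('sts');

$result = $sts->assumeRole(array(
    'RoleArn'         => 'arn:aws:iam::123456789012:role/s3-download',
    'RoleSessionName' => 'file-delivery',
    'DurationSeconds' => 900 // the minimum STS allows
));

// These credentials expire by themselves -- nothing to revoke manually.
$credentials = $result['Credentials'];
```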


Stream From S3

This is the method that I have chosen for now. Maybe later I will move to the first method.

Warning: If you stream, your server is still the middleman and all the data goes through it. So if your server fails or is slow, the download will be slow.

My first question was how to register stream wrapper:

Since I am using Laravel, and Laravel uses Flysystem for S3 management, there was no easy way for me to get the S3Client. Hence I added the additional package AWS SDK for Laravel to my composer.json:

"aws/aws-sdk-php-laravel" : "~3.0"

Then I wrote my code as follows:

class FileDelivery extends Command implements SelfHandling
{
    private $client;
    private $remoteFile;
    private $bucket;

    public function __construct($remoteFile)
    {
        $this->client = AWS::createClient('s3');
        $this->client->registerStreamWrapper();
        $this->bucket = 'mybucket';
        $this->remoteFile = $remoteFile;
    }

    public function handle()
    {
        try
        {
            // First get the meta-data of the object.
            $headers = $this->client->headObject(array(
                'Bucket' => $this->bucket,
                'Key' => $this->remoteFile
            ));

            $headers = $headers['@metadata'];
            if($headers['statusCode'] !== 200)
            {
                // S3Exception requires a command object to construct, so
                // throw a plain exception here; headObject() itself already
                // throws S3Exception when the key does not exist.
                throw new \RuntimeException('Unexpected status code');
            }
        }
        catch(\Exception $e)
        {
            // Covers S3Exception (missing key, access denied) as well.
            return 404;
        }

        // return appropriate headers before the stream starts.
        http_response_code($headers['statusCode']);
        header("Last-Modified: {$headers['headers']['last-modified']}");
        header("ETag: {$headers['headers']['etag']}");
        header("Content-Type: {$headers['headers']['content-type']}");
        header("Content-Length: {$headers['headers']['content-length']}");
        header("Content-Disposition: attachment; filename=\"" . basename($this->remoteFile) . "\"");

        // File sizes can be large, and buffering the whole response
        // would consume that much memory on the server.
        // Thus we flush and close every output buffer before the stream
        // starts; several buffers can be nested, hence the loop.
        while(ob_get_level())
        {
            ob_end_flush();
        }
        flush();

        // Start the stream.
        readfile("s3://{$this->bucket}/{$this->remoteFile}");
    }
}

My second question was: do I need to disable output buffering in Laravel?

The answer IMHO is yes. Disabling buffering lets the data be flushed to the client immediately, resulting in lower memory consumption. Since we are not using any Laravel function to offload the data to the client, Laravel does not do this for us, so it needs to be done by us.
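
The buffer handling from the handle() method above can also be written as a small standalone helper (plain PHP, no AWS dependency; the function name is my own):

```php
// Flush and close every active output buffer so that readfile()
// streams straight to the client instead of piling up in memory.
function disableOutputBuffering()
{
    // Several buffers can be nested (Laravel's, PHP's output_buffering
    // ini setting, etc.), so loop until none remain.
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    flush();
}
```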
