Laravel 5: How do you copy a local file to Amazon S3?

Backend · open · 6 answers · 2099
心在旅途
asked 2021-02-08 11:53

I'm writing code in Laravel 5 to periodically back up a MySQL database. My code thus far looks like this:

    $filename = 'database_backup_'.date('G_a_m_d_y'
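For context, a complete version of this backup flow might look like the sketch below. Everything beyond the $filename line is an assumption (paths, env keys, the mysqldump call), not something stated in the question:

    // Sketch only: dump the database to a local file, then hand S3 a stream.
    $filename = 'database_backup_'.date('G_a_m_d_y').'.sql';
    $local = storage_path('backups/'.$filename);

    // Assumes mysqldump is on PATH and DB credentials live in the environment.
    exec(sprintf(
        'mysqldump -u%s -p%s %s > %s',
        escapeshellarg(env('DB_USERNAME')),
        escapeshellarg(env('DB_PASSWORD')),
        escapeshellarg(env('DB_DATABASE')),
        escapeshellarg($local)
    ));

    // Pass a stream rather than the file contents to avoid loading the dump into memory.
    Storage::disk('s3')->put('backups/'.$filename, fopen($local, 'r'));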
6 Answers
  • 2021-02-08 12:32

    You can always use a file resource to stream the file (advisable for large files) by doing something like this:

    Storage::disk('s3')->put('my/bucket/' . $filename, fopen('path/to/local/file', 'r+'));
    

    An alternative suggestion is proposed here. It uses Laravel's Storage facade to read the stream. The basic idea is something like this:

        $inputStream = Storage::disk('local')->getDriver()->readStream('path/to/file');
        Storage::disk('s3')->getDriver()->putStream('my/bucket/'.$filename, $inputStream);
    
  • 2021-02-08 12:38

    Laravel now has putFile and putFileAs methods that stream the file for you.

    Automatic Streaming

    If you would like Laravel to automatically manage streaming a given file to your storage location, you may use the putFile or putFileAs method. These methods accept either an Illuminate\Http\File or Illuminate\Http\UploadedFile instance and will automatically stream the file to your desired location:

    use Illuminate\Http\File;
    use Illuminate\Support\Facades\Storage;
    
    // Automatically generate a unique ID for file name...
    Storage::putFile('photos', new File('/path/to/photo'));
    
    // Manually specify a file name...
    Storage::putFileAs('photos', new File('/path/to/photo'), 'photo.jpg');
    

    Link to doc: https://laravel.com/docs/5.8/filesystem (Automatic Streaming)

    Hope it helps

  • 2021-02-08 12:42

    There is a way to copy files without needing to load the file contents into memory.

    You will also need to import the following:

    use League\Flysystem\MountManager;
    

    Now you can copy the file like so:

    $mountManager = new MountManager([
        's3' => \Storage::disk('s3')->getDriver(),
        'local' => \Storage::disk('local')->getDriver(),
    ]);
    $mountManager->copy('s3://path/to/file.txt', 'local://path/to/output/file.txt');

    Note that this example copies from S3 down to the local disk; to go the other way (local to S3, as in the question), swap the prefixes: $mountManager->copy('local://path/to/file.txt', 's3://path/to/file.txt');
  • 2021-02-08 12:44

    You can try this code

    $contents = Storage::get($file);
    Storage::disk('s3')->put($newfile,$contents);
    

    According to the Laravel documentation, this is the easiest way I found to copy data between two disks. Keep in mind that Storage::get() loads the whole file into memory, so for large files a stream is preferable.

  • 2021-02-08 12:53

    Looking at the documentation, the only way is the put method, which needs the file contents; there is no built-in method to copy a file between two filesystems, so the solution you gave is probably the only one at the moment.

    If you think about it, when copying a file from the local filesystem to S3 you need the file contents in order to put them in S3, so it's not really that wasteful in my opinion.

  • 2021-02-08 12:53

    I solved it in the following way:

    $contents = \File::get($destination);
    \Storage::disk('s3')
        ->put($s3Destination,$contents);
    

    Sometimes $contents = Storage::get($file); returns nothing because the Storage facade resolves paths relative to the disk root; in that case, read the file with Laravel's File facade, which takes an absolute filesystem path instead of a disk-relative one.
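    The difference is the base path each call resolves against; a quick sketch (disk names and paths are assumed for illustration):

        // Storage::get() resolves relative to the disk root
        // (storage/app for the default 'local' disk):
        $contents = Storage::disk('local')->get('backups/db.sql');

        // File::get() takes an absolute filesystem path to the same file:
        $contents = \File::get(storage_path('app/backups/db.sql'));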
