How can I migrate CarrierWave files to a new storage mechanism?

Asked by 南旧, 2021-02-06 02:31

I have a Ruby on Rails site with models using CarrierWave for file handling, currently using local storage. I want to start using cloud storage, and I need to migrate the existing local files over to the new storage mechanism.

4 Answers
  • 2021-02-06 03:08

    When deploying on Heroku, most people suggest Cloudinary: it is free and simple to set up. My case was the opposite: we were using Cloudinary and needed to move to AWS S3 for various reasons.

    This is what I did with the uploader:

    class AvatarUploader < CarrierWave::Uploader::Base
      # Pick the storage backend from an environment variable, so the same
      # uploader class can point at either provider.
      def self.set_storage
        ENV['UPLOADER_SERVICE'] == 'aws' ? :fog : nil
      end

      if ENV['UPLOADER_SERVICE'] == 'aws'
        include CarrierWave::MiniMagick
      else
        include Cloudinary::CarrierWave # Cloudinary manages storage itself
      end

      storage set_storage
    end
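
    The ENV-based switch in `set_storage` boils down to this. A standalone illustration of the logic, not the uploader itself: return `:fog` for AWS, otherwise `nil` so Cloudinary's mixin keeps control of storage.

    ```ruby
    # Standalone sketch of the ENV-driven storage choice made in the
    # uploader above; no CarrierWave required to see the behavior.
    def set_storage(service)
      service == 'aws' ? :fog : nil
    end

    puts set_storage('aws').inspect        # → :fog
    puts set_storage('cloudinary').inspect # → nil
    ```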
    

    Also, set up the rake task:

    task migrate_cloudinary_to_aws: :environment do
      # Snapshot the profiles while the uploader still points at Cloudinary,
      # so their old URLs stay readable.
      old_profiles = Profile.where("picture IS NOT NULL").to_a

      # Switch the provider and reload the uploader class definition.
      ENV['UPLOADER_SERVICE'] = 'aws'
      load("#{Rails.root}/app/uploaders/avatar_uploader.rb")

      Profile.where("picture IS NOT NULL OR cover IS NOT NULL").each do |profile|
        old_profile = old_profiles.detect { |p| p.id == profile.id }
        next if old_profile.nil?
        # CarrierWave downloads from the old URL and re-uploads to S3.
        profile.remote_picture_url = old_profile.picture.url
        profile.save
      end
    end
    

    The trick is switching the uploader's storage provider with an environment variable. Good luck!
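
    For completeness, a hypothetical invocation of the task above. The task sets `UPLOADER_SERVICE` itself for the migration run, but the app also needs the variable set for normal operation afterwards:

    ```shell
    # One-off migration run; afterwards, export UPLOADER_SERVICE=aws in the
    # app's environment so the uploader keeps using :fog storage.
    UPLOADER_SERVICE=aws bundle exec rake migrate_cloudinary_to_aws
    ```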

  • 2021-02-06 03:15

    I migrated my CarrierWave files to Amazon S3 with s3cmd, and it worked.

    Here are the steps to follow:

    1. Change the storage kind of the uploader to `:fog`.
    2. Create a bucket on Amazon S3 if you don't already have one.
    3. Install s3cmd on the remote server: sudo apt-get install s3cmd
    4. Configure s3cmd: s3cmd --configure. You will need to enter the access key and secret key provided by Amazon here.
    5. Sync the files with this command: s3cmd sync /path_to_your_files s3://bucket_name/
    6. Set the --acl-public flag to upload the files as public and avoid permission issues.
    7. Restart your server.

    Note:

    sync will not duplicate your files. It first checks whether each file is already present on the remote server.
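
    Step 1 also implies configuring fog credentials. A minimal sketch of a `config/initializers/carrierwave.rb`, assuming AWS credentials live in environment variables; the bucket name and region are placeholders:

    ```ruby
    # Sketch of a CarrierWave fog configuration (requires the fog-aws gem).
    CarrierWave.configure do |config|
      config.fog_provider = 'fog/aws'
      config.fog_credentials = {
        provider:              'AWS',
        aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
        aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
        region:                'us-east-1'       # placeholder region
      }
      config.fog_directory = 'your-bucket-name'  # placeholder bucket
      config.fog_public    = true                # matches the --acl-public step
    end
    ```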

  • 2021-02-06 03:16

    I'd try the following steps:

    1. Change the storage in the uploaders to :fog or whatever you want to use.
    2. Write a migration like rails g migration MigrateFiles to let CarrierWave fetch the current files, process them, and upload them to the cloud.

    If your model looks like this:

    class Video
      mount_uploader :attachment, VideoUploader
    end
    

    The migration would look like this:

    @videos = Video.all
    @videos.each do |video|
      video.remote_attachment_url = video.attachment_url
      video.save
    end
    

    If you execute this migration, the following should happen:

    CarrierWave downloads each image because you specified a remote URL for the attachment (its current location, e.g. http://test.com/images/1.jpg) and saves it to the cloud, because you changed the storage in the uploader.

    Edit:

    As San pointed out, this will not work directly. Instead, first add an extra column and run a migration that copies the current attachment URLs of all videos into it. Then change the uploader and run the migration above using the copied URLs from the new column. Finally, delete the column with another migration. Not the cleanest approach, but done in a few minutes.
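
    A sketch of that extra-column approach, with hypothetical names (`legacy_attachment_url` is not part of the original answer; adapt to your schema):

    ```ruby
    # Step 1: add a column to hold the old URLs (hypothetical column name).
    class AddLegacyAttachmentUrlToVideos < ActiveRecord::Migration[6.1]
      def change
        add_column :videos, :legacy_attachment_url, :string
      end
    end

    # Step 2: copy the current URLs while the uploader still uses :file storage.
    Video.find_each do |video|
      video.update_column(:legacy_attachment_url, video.attachment_url)
    end

    # Step 3: after switching the uploader to :fog, re-upload from the copies.
    Video.find_each do |video|
      video.remote_attachment_url = video.legacy_attachment_url
      video.save
    end
    ```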

  • 2021-02-06 03:20

    Minimal to Possibly Zero Downtime Procedure

    In my opinion, the easiest and fastest way to accomplish what you want with almost no downtime is this (I will assume AWS, but a similar procedure applies to any cloud service):

    1. Figure out and set up your assets bucket, bucket policies, etc., to make the assets publicly accessible.
    2. Using s3cmd (a command-line tool for interacting with S3) or a GUI app, copy the entire assets folder from the file system to the appropriate folder in S3.
    3. In your app, set up CarrierWave and update your models/uploaders for :fog storage.
    4. Do not restart your application yet. Instead, open a Rails console and, for each of your models, check that the new asset URL is correct and accessible as planned. For example, for a Video model with a picture asset, you can check this way:

      Video.first.picture.url
      

      This will give you the full cloud URL based on the updated settings. Paste the URL into a browser to make sure you can reach it.

    5. If this works for at least one instance of each model that has assets, you are good to restart your application.

    6. Upon restart, all your assets are served from the cloud, and you didn't need any migrations or multiple uploaders in your models.

    7. (Based on a comment by @Frederick Cheung): Using s3cmd (or something similar), sync the assets folder from the filesystem to S3 again, to pick up any assets that were uploaded between steps 2 and 5.
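
    The copy in step 2 and the catch-up sync in step 7 can both use the same command. A sketch, with a placeholder local path and bucket name:

    ```shell
    # Sync local uploads to S3; on a re-run, only new or changed files are
    # transferred. --acl-public keeps the objects publicly readable.
    s3cmd sync --acl-public /var/www/app/public/uploads/ s3://your-bucket-name/uploads/
    ```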

    PS: If you need help setting up carrierwave for cloud storage, let me know.
