How to delete files recursively from an S3 bucket

不知归路 2021-01-29 23:54

I have a nested folder structure in an S3 bucket. Is there a way to recursively remove all files under a certain folder (say foo/bar1, or foo, or foo/bar2/1, ...)?

12 answers
  • 2021-01-30 00:20

    I needed to do the following...

    # Uses the legacy aws-sdk v1 Ruby gem (AWS:: namespace) from a Rails app.
    def delete_bucket
      s3 = init_amazon_s3
      # Iterate over every object in the bucket and delete each one.
      s3.buckets['BUCKET-NAME'].objects.each do |obj|
        obj.delete
      end
    end

    def init_amazon_s3
      # Read the S3 credentials from config/s3.yml.
      config = YAML.load_file("#{Rails.root}/config/s3.yml")
      AWS.config(:access_key_id => config['access_key_id'],
                 :secret_access_key => config['secret_access_key'])
      AWS::S3.new
    end
    
  • 2021-01-30 00:23

    With the latest aws-cli Python command line tools, recursively deleting all the files under a folder in a bucket is just:

    aws s3 rm --recursive s3://your_bucket_name/foo/
    

    Or delete everything under the bucket:

    aws s3 rm --recursive s3://your_bucket_name
    

    If what you want is to actually delete the bucket itself, there is a one-step shortcut:

    aws s3 rb --force s3://your_bucket_name
    

    which removes the contents of the bucket recursively and then deletes the bucket.

    Note: the s3:// protocol prefix is required for these commands to work
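
    If you want to preview exactly which keys would be removed before deleting anything, the same command also accepts a --dryrun flag (bucket name is a placeholder, as above):

        # Prints the delete operations that would be performed, without executing them
        aws s3 rm --recursive --dryrun s3://your_bucket_name/foo/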

  • 2021-01-30 00:27

    In case you are using the AWS SDK for Ruby v2:

    # s3 is an Aws::S3::Client. Note that list_objects returns at most
    # 1,000 keys per call, so very large prefixes need pagination.
    s3.list_objects(bucket: bucket_name, prefix: "foo/").contents.each do |obj|
      next if obj.key == "foo/"   # skip the folder placeholder object itself
      s3.delete_object(
        bucket: bucket_name,
        key: obj.key,
      )
    end
    

    Please note: this deletes everything under "foo/" in the bucket.

  • 2021-01-30 00:28

    You might also consider using Amazon S3 Lifecycle to create an expiration for files with the prefix foo/bar1.

    Open the S3 console in your browser and click a bucket. Then click Properties and then Lifecycle.

    Create an expiration rule for all files with the prefix foo/bar1 and set the expiration to 1 day after the file was created.

    Save and all matching files will be gone within 24 hours.

    Just don't forget to remove the rule after you're done!

    No API calls, no third party libraries, apps or scripts.

    I just deleted several million files this way.

    [Screenshot: the Lifecycle Rule window. Note that in this shot the Prefix has been left blank, which affects all keys in the bucket.]
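
    The same kind of expiration rule can also be set up from the command line instead of the console; here is a minimal sketch using the aws s3api subcommand (the bucket name and rule ID are placeholders for this example):

        # Expire everything under the foo/bar1/ prefix one day after creation
        # (your_bucket_name and expire-foo-bar1 are placeholders)
        aws s3api put-bucket-lifecycle-configuration \
            --bucket your_bucket_name \
            --lifecycle-configuration '{
              "Rules": [
                {
                  "ID": "expire-foo-bar1",
                  "Filter": { "Prefix": "foo/bar1/" },
                  "Status": "Enabled",
                  "Expiration": { "Days": 1 }
                }
              ]
            }'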

  • 2021-01-30 00:28

    The top-voted answer is missing a step.

    Per aws s3 help:

    Currently, there is no support for the use of UNIX style wildcards in a command's path arguments. However, most commands have --exclude "<value>" and --include "<value>" parameters that can achieve the desired result. [...] When there are multiple filters, the rule is the filters that appear later in the command take precedence over filters that appear earlier in the command. For example, if the filter parameters passed to the command were --exclude "*" --include "*.txt", all files will be excluded from the command except for files ending with .txt.

    aws s3 rm --recursive s3://bucket/ --exclude="*" --include="/folder_path/*" 
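
    Applied to the folder from the question, and with --dryrun added so the filters can be checked before anything is deleted (bucket name is a placeholder):

        # Remove --dryrun once the listed keys look right
        aws s3 rm s3://your_bucket_name --recursive --exclude "*" --include "foo/bar1/*" --dryrun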
    
  • 2021-01-30 00:31

    With the s3cmd package installed on a Linux machine, you can do this:

    s3cmd rm s3://foo/bar --recursive
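
    Note that in this URI, foo is the bucket name. If, as in the question, foo is a folder inside a bucket, the call would look more like this (bucket name is a placeholder):

        # Recursively delete every key under the foo/bar1/ prefix
        s3cmd rm --recursive s3://your_bucket_name/foo/bar1/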
