How do I delete/count objects in an S3 bucket?

2021-02-01 10:15

So I know this is a common question but there just doesn't seem to be any good answers for it.

I have a bucket with gobs (I have no clue how many) of files in it.

1) How do I figure out how many of these files I have, without retrieving them all?
2) How do I delete the bucket? Do I have to delete every object in it first?
3) Is S3 well suited for storing a large number of files like this?

6 Answers
  • 2021-02-01 10:44

    "List" won't retrieve the data. I use s3cmd (a python script) and I would have done something like this:

    s3cmd ls s3://foo | awk '{print $4}' | split -a 5 -l 10000 bucketfiles_
    for i in bucketfiles_*; do xargs -n 1 s3cmd rm < $i & done
    

    But first check how many bucketfiles_ files you get. There will be one s3cmd running per file.

    It will take a while, but not days.
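
    For what it's worth, these days a single recursive delete does the same job; a minimal sketch, assuming the AWS CLI is installed and my-bucket stands in for your bucket name:

        # pages through every key in the bucket and deletes them
        aws s3 rm s3://my-bucket --recursive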

  • 2021-02-01 10:49

    Old thread, but still relevant: I was looking for the answer myself until I figured this out. I wanted a file count using a GUI-based tool (i.e. no code). I already use a tool called 3Hub for drag & drop transfers to and from S3, and I wanted to know how many files I had in a particular bucket (I don't think billing breaks it down by bucket).

    So, using 3Hub:
    - list the contents of the bucket (it looks basically like a Finder or Explorer window)
    - go to the bottom of the list and click 'show all'
    - select all (ctrl+a)
    - choose 'copy URLs' from the right-click menu
    - paste the list into a text file (I use TextWrangler for Mac)
    - look at the line count
    

    I had 20521 files in the bucket and did the file count in less than a minute.

    I'd like to know if anyone's found a better way since this would take some time on hundreds of thousands of files.
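
    For hundreds of thousands of files, the command-line equivalent scales better; a sketch assuming the AWS CLI is installed (my-bucket is a placeholder):

        # recursively lists the bucket and prints the object count and total size
        aws s3 ls s3://my-bucket --recursive --summarize | tail -2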

  • 2021-02-01 10:52

    To count objects in an S3 bucket:

    Go to AWS Billing, then Reports, then AWS Usage Reports. Select Amazon Simple Storage Service, then Operation StandardStorage. Download the CSV; rows with a UsageType of StorageObjectCount list the item count for each bucket.
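
    A scriptable route to the same number is the S3 storage metrics in CloudWatch; a sketch assuming the AWS CLI is installed (the bucket name and dates are placeholders, and the metric is only reported about once a day):

        aws cloudwatch get-metric-statistics \
          --namespace AWS/S3 \
          --metric-name NumberOfObjects \
          --dimensions Name=BucketName,Value=my-bucket Name=StorageType,Value=AllStorageTypes \
          --start-time 2021-01-30T00:00:00Z --end-time 2021-02-01T00:00:00Z \
          --period 86400 \
          --statistics Average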

  • 2021-02-01 10:59

    I've had the same problem with deleting hundreds of thousands of files from a bucket. It may be worthwhile to fire up an EC2 instance and run the deletes in parallel from there, because the latency from EC2 to S3 is low. I think there's some money to be made hosting a bunch of EC2 servers and charging people to delete buckets quickly (at least until Amazon gets around to changing the API). A sketch of the parallel delete follows below.
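
    In the same spirit as the s3cmd pipeline above, a rough sketch of such a parallel delete (assuming s3cmd is configured; my-bucket is a placeholder and -P 16 is an arbitrary degree of parallelism):

        # list every key's S3 URI and delete them 16 at a time in parallel
        s3cmd ls s3://my-bucket | awk '{print $4}' | xargs -n 1 -P 16 s3cmd del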

  • 2021-02-01 11:02

    I am most certainly not one of those guys who have boasted about hosting millions of images/txts, as I only have a few thousand, and this may not be the answer you are looking for, but I did look into this a while back.

    From what I remember, there is an API command called HEAD which gets information about an object rather than retrieving the complete object (which is what GET does), and that may help in counting the objects.
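
    For example, a sketch of a HEAD request via the AWS CLI (the bucket and key are placeholders):

        # returns metadata such as ContentLength and ETag, without the object body
        aws s3api head-object --bucket my-bucket --key path/to/file.txt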

    As far as deleting buckets goes, at the time I was looking the API definitely stated that the bucket had to be empty, so you need to delete all the objects first.

    But I never used either of these commands, because I was using S3 as a backup. In the end I wrote a few routines that uploaded the files I wanted to S3 (so that part was automated), but I never bothered with the restore/delete/file-management side of the equation. For that I use Bucket Explorer, which does all I need; it wasn't worth spending the time when for $50 I can get a program that covers it. There are probably others that do the same (e.g. CloudBerry).

    In your case, with Bucket Explorer you can right-click on a bucket and select delete, or right-click and select properties and it will count the number of objects and the size they take up. It certainly does not download the whole objects: the last bucket I looked at was 12 GB with around 500 files, and it would take hours to download 12 GB, whereas the size and count come back in a second or two. And if there is a limit, it certainly isn't 1000.

    Hope this helps.

  • 2021-02-01 11:03

    1) Regarding your first question, you can list the items in a bucket without actually retrieving them. You can do that with both the SOAP and the REST APIs, and in both you can set the maximum number of items to list and the position to start the listing from (the marker).

    I do not know of a ready-made implementation of the paging, but especially for the REST interface it would be very easy to implement in any language; see the sketch below.
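
    A minimal paging sketch using the newer list-objects-v2 flavour of the same idea, assuming the AWS CLI and jq are installed (my-bucket is a placeholder); it walks the bucket 1,000 keys at a time and totals the count:

        token=""
        total=0
        while :; do
          # fetch one page of up to 1,000 keys, passing the continuation token when we have one
          if [ -z "$token" ]; then
            page=$(aws s3api list-objects-v2 --bucket my-bucket --max-keys 1000)
          else
            page=$(aws s3api list-objects-v2 --bucket my-bucket --max-keys 1000 \
                     --continuation-token "$token")
          fi
          total=$(( total + $(echo "$page" | jq '.KeyCount') ))
          # NextContinuationToken is absent on the last page
          token=$(echo "$page" | jq -r '.NextContinuationToken // empty')
          [ -z "$token" ] && break
        done
        echo "Total objects: $total"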

    2) I believe the only way to delete a bucket is to first empty it of all items. See also this question.
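
    These days the AWS CLI will do the empty-then-delete in one step; a sketch (my-bucket is a placeholder):

        # deletes every object in the bucket, then removes the bucket itself
        aws s3 rb s3://my-bucket --force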

    3) I would say that S3 is very well suited for storing a large number of files. It depends, however, on what you want to do. Do you plan to also store binary files? Do you need to perform any queries, or is just listing the files enough?
