Unless I'm missing something, it seems that none of the APIs I've looked at will tell you how many objects are in an S3 bucket. Is there a way to get a count?
From the command line in the AWS CLI, use ls plus --summarize. It will give you the list of all of your items and the total number of documents in a particular bucket. I have not tried this with buckets containing sub-folders:
aws s3 ls "s3://MyBucket" --summarize
It may take a while (listing my 16K+ documents took about 4 minutes), but it's faster than counting 1,000 keys at a time.
None of the APIs will give you a count because there really isn't any Amazon-specific API to do that. You just have to list the bucket's contents and count the number of results that are returned.
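As a minimal sketch of that counting loop with boto3 (assuming default credentials; "my-bucket" is a placeholder), you can page through list_objects_v2 with a paginator, since each call returns at most 1,000 keys, and sum the per-page counts:

import boto3

# Sketch: page through the bucket listing and count keys.
# "my-bucket" is a placeholder bucket name.
s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

count = 0
for page in paginator.paginate(Bucket="my-bucket"):
    # KeyCount is the number of keys returned in this page (0 for an empty bucket)
    count += page.get("KeyCount", 0)

print(count)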
You can use the AWS CloudWatch metrics for S3 to see the exact count for each bucket.
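The relevant metric is NumberOfObjects in the AWS/S3 namespace; it is a daily storage metric, so it can lag behind the live count. A rough boto3 sketch for reading the latest value (the bucket name is a placeholder):

import boto3
from datetime import datetime, timedelta

# Sketch: read the daily NumberOfObjects storage metric that S3 publishes
# to CloudWatch. "my-bucket" is a placeholder; the metric is reported
# roughly once per day, so it is not a real-time count.
cloudwatch = boto3.client("cloudwatch")
response = cloudwatch.get_metric_statistics(
    Namespace="AWS/S3",
    MetricName="NumberOfObjects",
    Dimensions=[
        {"Name": "BucketName", "Value": "my-bucket"},
        {"Name": "StorageType", "Value": "AllStorageTypes"},
    ],
    StartTime=datetime.utcnow() - timedelta(days=2),
    EndTime=datetime.utcnow(),
    Period=86400,
    Statistics=["Average"],
)

datapoints = response["Datapoints"]
if datapoints:
    latest = max(datapoints, key=lambda d: d["Timestamp"])
    print(int(latest["Average"]))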
One of the simplest ways to count the number of objects in S3 is:
Step 1: Select root folder
Step 2: Click on Actions -> Delete (obviously, be careful - don't delete it)
Step 3: Wait a few minutes; AWS will show you the number of objects and their total size.
I used the Python script from scalablelogic.com (adding in the count logging). Worked great.
#!/usr/local/bin/python
import sys
from boto.s3.connection import S3Connection

# Open the bucket named on the command line using the default credentials
s3bucket = S3Connection().get_bucket(sys.argv[1])

size = 0
totalCount = 0

# bucket.list() pages through every key, so it is not capped at 1,000 objects
for key in s3bucket.list():
    totalCount += 1
    size += key.size

print("total size: %.3f GB" % (size * 1.0 / 1024 / 1024 / 1024))
print("total count: %d" % totalCount)
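boto (version 2) is no longer maintained, so if you are on boto3 a roughly equivalent sketch would be the following (same idea: bucket name as the first argument, walk every object, sum size and count):

#!/usr/bin/env python3
import sys
import boto3

# Sketch of the same script with boto3: iterate every object summary,
# accumulating total size and count.
bucket = boto3.resource("s3").Bucket(sys.argv[1])

size = 0
total_count = 0
for obj in bucket.objects.all():
    total_count += 1
    size += obj.size

print("total size: %.3f GB" % (size / 1024 / 1024 / 1024))
print("total count: %d" % total_count)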
The easiest way is to use the developer console. For example, if you are on Chrome, open Developer Tools and inspect the object list; you can either use find to count the matches or do some math on the row numbers, like 280 - 279 + 1 = 2.
...