Unless I'm missing something, it seems that none of the APIs I've looked at will tell you how many objects are in an S3 bucket. Is there any way to get a count?
You can potentially use Amazon S3 Inventory, which will give you a list of objects in a CSV file.
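If you go the inventory route, counting is then just a matter of counting the rows in the delivered report. Here's a minimal sketch, assuming the gzipped inventory CSV files have already been downloaded into a local directory (the inventory/ path is a placeholder):

import csv
import glob
import gzip

total = 0
# Each row in an inventory CSV describes one object in the bucket.
for path in glob.glob('inventory/*.csv.gz'):  # placeholder path
    with gzip.open(path, 'rt', newline='') as f:
        total += sum(1 for _ in csv.reader(f))

print(total)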
The API will return the list in increments of 1,000. Check the IsTruncated property to see if there are still more. If there are, you need to make another call and pass the last key that you got as the Marker property on the next call. You then continue to loop like this until IsTruncated is false.
See this Amazon doc for more info: Iterating Through Multi-Page Results
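As a rough sketch, that Marker/IsTruncated loop looks like this in Python using boto3's low-level client; it only counts keys, and the bucket name is a placeholder:

import boto3

client = boto3.client('s3')
kwargs = {'Bucket': 'my-bucket'}  # placeholder bucket name
count = 0

while True:
    response = client.list_objects(**kwargs)
    contents = response.get('Contents', [])
    count += len(contents)
    if not response.get('IsTruncated') or not contents:
        break
    # Pass the last key from this page as the Marker for the next call.
    kwargs['Marker'] = contents[-1]['Key']

print(count)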
Here's the boto3 version of the Python script embedded above.
import sys

import boto3

# The bucket name is taken from the first command-line argument.
s3 = boto3.resource('s3')
s3bucket = s3.Bucket(sys.argv[1])

size = 0
totalCount = 0

# Iterate over every object in the bucket, counting keys and summing sizes.
for key in s3bucket.objects.all():
    totalCount += 1
    size += key.size

print('total size:')
print("%.3f GB" % (size * 1.0 / 1024 / 1024 / 1024))
print('total count:')
print(totalCount)
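Run it with the bucket name as the only argument, e.g. python count_objects.py my-bucket (the script file name here is just a placeholder).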
3Hub is discontinued. There's a better solution now: you can use Transmit (Mac only); just connect to your bucket and choose Show Item Count from the View menu.
aws s3 ls s3://mybucket/ --recursive | wc -l
or
aws cloudwatch get-metric-statistics \
--namespace AWS/S3 --metric-name NumberOfObjects \
--dimensions Name=BucketName,Value=BUCKETNAME \
Name=StorageType,Value=AllStorageTypes \
--start-time 2016-11-05T00:00 --end-time 2016-11-05T00:10 \
--period 60 --statistic Average
Note: the above CloudWatch command seems to work for some users but not for others. Discussed here: https://forums.aws.amazon.com/thread.jspa?threadID=217050
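For reference, here's a rough boto3 equivalent of the CloudWatch call above; the bucket name is a placeholder, and since the metric is only published about once a day, the sketch uses a two-day window and takes the most recent datapoint:

import datetime

import boto3

cloudwatch = boto3.client('cloudwatch')
now = datetime.datetime.utcnow()

response = cloudwatch.get_metric_statistics(
    Namespace='AWS/S3',
    MetricName='NumberOfObjects',
    Dimensions=[
        {'Name': 'BucketName', 'Value': 'BUCKETNAME'},  # placeholder
        {'Name': 'StorageType', 'Value': 'AllStorageTypes'},
    ],
    StartTime=now - datetime.timedelta(days=2),
    EndTime=now,
    Period=86400,
    Statistics=['Average'],
)

# Take the most recent datapoint, if CloudWatch returned any.
datapoints = sorted(response['Datapoints'], key=lambda d: d['Timestamp'])
if datapoints:
    print(int(datapoints[-1]['Average']))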
You can look at CloudWatch's metrics section to get the approximate number of objects stored.
I have approximately 50 million products, and it took more than an hour to count them using aws s3 ls.
There is a --summarize switch that includes bucket summary information (i.e. number of objects, total size).
Here's the correct answer using the AWS CLI:
aws s3 ls s3://bucketName/path/ --recursive --summarize | grep "Total Objects:"
Total Objects: 194273
See the documentation