How to find size of a folder inside an S3 bucket?

ε祈祈猫儿з submitted on 2020-05-13 04:39:14

Question


I am using the boto module in Python to interact with S3, and I can currently get the size of every individual key in an S3 bucket. But my goal is to find the storage used by only the top-level folders (every folder is a different project), because we need to charge per project for the space used. I can get the names of the top-level folders, but not any details about their size. The following is my implementation to get the top-level folder names.

import boto
import boto.s3.connection

AWS_ACCESS_KEY_ID = "access_id"
AWS_SECRET_ACCESS_KEY = "secret_access_key"
Bucketname = 'Bucket-name' 

conn = boto.s3.connect_to_region('ap-south-1',
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
    is_secure=True,  # use SSL; set to False if you are not using SSL
    calling_format=boto.s3.connection.OrdinaryCallingFormat(),
    )

bucket = conn.get_bucket(Bucketname)
folders = bucket.list("", "/")

for folder in folders:
    print(folder.name)

The type of folder here is boto.s3.prefix.Prefix, and it doesn't expose any size details. Is there any way to search for a folder/object in an S3 bucket by its name and then fetch the size of that object?


Answer 1:


To find the size of the top-level "folders" in S3 (S3 does not really have a concept of folders, but displays a folder-like structure in the UI), something like this will work:

from boto3 import client
conn = client('s3')

top_level_folders = dict()

for key in conn.list_objects(Bucket='kitsune-buildtest-production')['Contents']:

    folder = key['Key'].split('/')[0]
    print("Key %s in folder %s. %d bytes" % (key['Key'], folder, key['Size']))

    if folder in top_level_folders:
        top_level_folders[folder] += key['Size']
    else:
        top_level_folders[folder] = key['Size']


for folder, size in top_level_folders.items():
    print("Folder: %s, size: %d" % (folder, size))



Answer 2:


In order to get the size of an S3 folder, objects (accessible via boto3.resource('s3').Bucket) provide the method filter(Prefix), which lets you retrieve ONLY the files that match the Prefix condition, making it quite efficient.

import boto3

def get_size(bucket, path):
    s3 = boto3.resource('s3')
    my_bucket = s3.Bucket(bucket)
    total_size = 0

    for obj in my_bucket.objects.filter(Prefix=path):
        total_size = total_size + obj.size

    return total_size

So let's say you want the size of the folder s3://my-bucket/my/path/; you would then call the previous function like this:

get_size("my-bucket", "my/path/")

This is, of course, easily applicable to top-level folders as well.
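For instance, here is a minimal sketch (assuming the get_size function above) that discovers the top-level prefixes with a Delimiter='/' listing and then sizes each one; the function name sizes_per_top_level_folder is hypothetical:

import boto3

def sizes_per_top_level_folder(bucket):
    # Delimiter='/' makes S3 return only the top-level prefixes
    # ("folders") in CommonPrefixes, instead of every key.
    client = boto3.client('s3')
    paginator = client.get_paginator('list_objects_v2')
    sizes = {}
    for page in paginator.paginate(Bucket=bucket, Delimiter='/'):
        for prefix in page.get('CommonPrefixes', []):
            sizes[prefix['Prefix']] = get_size(bucket, prefix['Prefix'])
    return sizes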




Answer 3:


def find_size(name, conn):
    # Sum the size of every key in the named bucket (boto 2 API).
    for bucket in conn.get_all_buckets():
        if name == bucket.name:
            total_bytes = 0
            for key in bucket:
                total_bytes += key.size
            # Convert to GiB once, after all keys have been summed.
            print(total_bytes / 1024 / 1024 / 1024)
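A hypothetical usage example, assuming boto 2 credentials are available from the environment or ~/.boto (the bucket name is a placeholder):

import boto

conn = boto.connect_s3()      # credentials picked up from the environment or ~/.boto
find_size('my-bucket', conn)  # prints the bucket's total size in GiB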



Answer 4:


Not using boto3, just the AWS CLI, but this quick one-liner serves the purpose. I usually pipe through tail -1 to get only the summary line with the folder size. It can be a bit slow, though, for folders containing many objects.

aws s3 ls --summarize --human-readable --recursive s3://bucket-name/folder-name | tail -1
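If you need exact byte counts rather than the human-readable summary, an s3api variant with a JMESPath query should also work; this is a sketch, and bucket-name/folder-name are placeholders:

aws s3api list-objects --bucket bucket-name --prefix folder-name \
    --output json --query "[sum(Contents[].Size), length(Contents[])]"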



Source: https://stackoverflow.com/questions/49759940/how-to-find-size-of-a-folder-inside-an-s3-bucket
