There must be an easy way to get the file size (key size) without pulling over a whole file. I can see it in the Properties of the AWS S3 browser, and I think I can get it from the Content-Length header of a HEAD request.
This would work:
import boto

conn = boto.connect_s3()  # credentials come from the environment or boto config
bk = conn.get_bucket('my_bucket_name')
key = bk.lookup('my_key_name')
print(key.size)
The lookup method simply does a HEAD request on the bucket for the keyname, so it will return all of the headers (including content-length) for the key but will not transfer any of the actual content of the key.
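For illustration, a minimal sketch (boto 2, with the same placeholder key name) of reading some of the other header-derived attributes that the same HEAD request populates:

key = bk.lookup('my_key_name')
if key is not None:           # lookup returns None if the key does not exist
    print(key.size)           # from Content-Length
    print(key.last_modified)  # from Last-Modified
    print(key.etag)           # from ETag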
The S3 tutorial mentions this but not very explicitly and not in this exact context. I'll add a section on this to help make it easier to find.
Note: for every old link like http://boto.cloudhackers.com/s3_tut.html that returns a 404, add "/en/latest" right after the ".com": http://boto.cloudhackers.com/en/latest/s3_tut.html. (Someone needs to explore mod_rewrite...)
In boto3, using an S3 resource:

boto3.resource('s3').Bucket(bucketname).Object(keyname).content_length
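A slightly fuller sketch of the same resource call, with placeholder names and a guard for a missing object (the names and error handling are assumptions, not part of the original one-liner):

import boto3
from botocore.exceptions import ClientError

obj = boto3.resource('s3').Bucket('my-bucket').Object('path/to/key')
try:
    print(obj.content_length)  # lazily loaded via a HEAD request
except ClientError as e:
    # 404 if the key does not exist, 403 if the HEAD request is not permitted
    print(e.response['Error']['Code'])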
The head_object call of the S3 client returned an HTTP "403 Forbidden" for me.
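A 403 on head_object usually means the credentials lack permission for the HEAD request; note that S3 also returns 403 instead of 404 for a missing key when the caller has no s3:ListBucket permission on the bucket. A minimal sketch for telling the cases apart (bucket and key names are placeholders):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
try:
    response = s3.head_object(Bucket='my-bucket', Key='my-key')
    print(response['ContentLength'])
except ClientError as e:
    status = e.response['ResponseMetadata']['HTTPStatusCode']
    print(f"HEAD request failed with HTTP {status}")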
You can also get a list of all objects if multiple files need to be checked. For a given bucket, run list_objects_v2 and then iterate through the response's 'Contents'. For example:
import boto3

s3_client = boto3.client('s3')
response_contents = s3_client.list_objects_v2(
    Bucket='name_of_bucket'
).get('Contents')
You'll get a list of dictionaries like this:

[{'Key': 'path/to/object1', 'LastModified': datetime, 'ETag': '"some etag"', 'Size': 2600, 'StorageClass': 'STANDARD'},
 {'Key': 'path/to/object2', 'LastModified': datetime, 'ETag': '"some etag"', 'Size': 454, 'StorageClass': 'STANDARD'},
 ...]
Notice that each dictionary in the list contains a 'Size' key, which is the size of your particular object. The list is iterable:
for rc in response_contents:
    print(f"Size: {rc.get('Size')}")
You get sizes for all files you might be interested in:
Size: 2600
Size: 454
Size: 2600
...
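Note that list_objects_v2 returns at most 1,000 keys per call, so for larger buckets you would page through the results; a minimal sketch using a paginator (same placeholder bucket name as above):

import boto3

s3_client = boto3.client('s3')
paginator = s3_client.get_paginator('list_objects_v2')
total_size = 0
for page in paginator.paginate(Bucket='name_of_bucket'):
    for obj in page.get('Contents', []):
        print(f"{obj['Key']}: {obj['Size']}")
        total_size += obj['Size']
print(f"Total: {total_size} bytes")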
In boto3, s3.head_object also performs a HEAD request to retrieve the metadata about the object:
import boto3

s3 = boto3.client('s3')
response = s3.head_object(Bucket='bucketname', Key='keyname')
size = response['ContentLength']
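Besides ContentLength, the same response dictionary exposes other header-derived metadata; a brief sketch reusing the response above (ContentType is only meaningful if one was set at upload):

print(response['LastModified'])     # datetime of the last write
print(response['ETag'])             # entity tag
print(response.get('ContentType'))  # MIME type, if any was set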
In Boto 3:

Using S3 Object you can fetch the file (a.k.a. object) size in bytes. It is a resource representing the Amazon S3 Object.

In fact you can get all metadata related to the object, such as content_length (the object size), content_language (the language the content is in), content_encoding, last_modified, etc.
import boto3

s3 = boto3.resource('s3')
obj = s3.Object('bucket_name', 'key')
file_size = obj.content_length  # size in bytes
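The other attributes mentioned above are read the same way; a short sketch (content_language and content_encoding will be None if they were not set when the object was uploaded):

print(obj.last_modified)     # datetime of the last write
print(obj.content_language)  # None unless set at upload
print(obj.content_encoding)  # None unless set at upload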
Reference: boto3 doc