How can I copy files bigger than 5 GB in Amazon S3?

执笔经年 2021-02-01 17:59

The Amazon S3 REST API documentation says there's a 5 GB size limit for an upload in a single PUT operation. Files bigger than that have to be uploaded using the multipart API. Fine.

However, what I really need is to copy files that may be bigger than that. Is multipart the way to go for copies too, or is there a simpler option?
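From the REST docs, the copy-side counterpart of multipart upload appears to be the Upload Part - Copy operation; roughly this flow, if I read them correctly (sketched with boto3's low-level client, bucket and key names below are placeholders):

    import boto3

    s3 = boto3.client('s3')
    src_bucket, dst_bucket, key = 'source-bucket', 'dest-bucket', 'big-object'
    part_size = 500 * 1024 * 1024  # every part except the last must be at least 5 MB

    size = s3.head_object(Bucket=src_bucket, Key=key)['ContentLength']
    upload_id = s3.create_multipart_upload(Bucket=dst_bucket, Key=key)['UploadId']

    parts = []
    for num, start in enumerate(range(0, size, part_size), start=1):
        end = min(start + part_size, size) - 1  # byte ranges are inclusive
        resp = s3.upload_part_copy(
            Bucket=dst_bucket, Key=key, UploadId=upload_id, PartNumber=num,
            CopySource={'Bucket': src_bucket, 'Key': key},
            CopySourceRange='bytes={}-{}'.format(start, end))
        parts.append({'PartNumber': num, 'ETag': resp['CopyPartResult']['ETag']})

    s3.complete_multipart_upload(Bucket=dst_bucket, Key=key, UploadId=upload_id,
                                 MultipartUpload={'Parts': parts})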

4 Answers
  •  臣服心动
    2021-02-01 18:59

    I found this method for uploading files bigger than 5 GB and modified it to work as a Boto copy procedure. Here's the original: http://boto.cloudhackers.com/en/latest/s3_tut.html

    import math
    from boto.s3.connection import S3Connection


    conn = S3Connection(host='your_host', aws_access_key_id='your_access_key',
                        aws_secret_access_key='your_secret_access_key')

    from_bucket = conn.get_bucket('your_from_bucket_name')
    key = from_bucket.lookup('my_key_name')
    dest_bucket = conn.get_bucket('your_to_bucket_name')

    total_bytes = key.size
    bytes_per_chunk = 500000000  # 500 MB per part; every part except the last must be at least 5 MB

    chunks_count = int(math.ceil(total_bytes / float(bytes_per_chunk)))
    file_upload = dest_bucket.initiate_multipart_upload(key.name)
    for i in range(chunks_count):
        offset = i * bytes_per_chunk
        remaining_bytes = total_bytes - offset
        print(remaining_bytes)
        next_byte_chunk = min(bytes_per_chunk, remaining_bytes)
        part_number = i + 1
        # Copy this byte range server-side from the *source* bucket;
        # the end offset passed to copy_part_from_key is inclusive.
        file_upload.copy_part_from_key(from_bucket.name, key.name, part_number,
                                       offset, offset + next_byte_chunk - 1)
    file_upload.complete_upload()
    
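    For what it's worth, newer boto3 wraps this whole loop in a single managed call; a minimal sketch, reusing the placeholder bucket and key names from above:

    import boto3

    s3 = boto3.client('s3')
    # boto3's managed copy() switches to a multipart copy (Upload Part - Copy
    # under the hood) for objects above its transfer threshold, so no manual
    # part loop is needed.
    s3.copy(CopySource={'Bucket': 'your_from_bucket_name', 'Key': 'my_key_name'},
            Bucket='your_to_bucket_name', Key='my_key_name')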
