Quick way to list all files in Amazon S3 bucket?

星月不相逢 2020-11-28 01:32

I have an Amazon S3 bucket that has tens of thousands of filenames in it. What's the easiest way to get a text file that lists all the filenames in the bucket?

28 answers
  • 2020-11-28 02:03

    For Python's boto3, after having run aws configure:

    import boto3

    s3 = boto3.resource('s3')
    bucket = s3.Bucket('name')

    # Iterate over every object in the bucket and print its key
    for obj in bucket.objects.all():
        print(obj.key)
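
    Since the question asks for a text file, here is a minimal sketch that writes every key to a local file instead of printing it (the bucket name and output path are placeholders):

    import boto3

    s3 = boto3.resource('s3')
    bucket = s3.Bucket('name')  # placeholder bucket name

    # Write one key per line; objects.all() pages through the results,
    # so this also works for buckets with far more than 1000 keys.
    with open('bucket_contents.txt', 'w') as f:
        for obj in bucket.objects.all():
            f.write(obj.key + '\n')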
    
  • 2020-11-28 02:04

    Update 15-02-2019:

    This command will give you a list of all buckets in AWS S3:

    aws s3 ls

    This command will give you a list of all top-level objects inside an AWS S3 bucket:

    aws s3 ls bucket-name

    This command will give you a list of ALL objects inside an AWS S3 bucket:

    aws s3 ls bucket-name --recursive

    This command will write a list of ALL objects inside an AWS S3 bucket to a text file in your current directory:

    aws s3 ls bucket-name --recursive | cat >> file-name.txt
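
    Note that aws s3 ls --recursive prints date, time, and size columns in front of each key, and the | cat >> form appends to the file rather than overwriting it. If you only want the key names, one option (a sketch, assuming key names contain no newlines) is to strip the leading columns and use a plain redirect:

    aws s3 ls bucket-name --recursive | awk '{print substr($0, index($0, $4))}' > file-name.txt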

  • 2020-11-28 02:05

    s3cmd is invaluable for this kind of thing

    $ s3cmd ls -r s3://yourbucket/ | awk '{print $4}' > objects_in_bucket

  • 2020-11-28 02:05
    # find-like listing of all object keys in an S3 bucket
    aws s3api --profile <<profile-name>> \
    --endpoint-url=<<end-point-url>> list-objects \
    --bucket <<bucket-name>> --query 'Contents[].{Key: Key}'
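
    The --query 'Contents[].{Key: Key}' form above returns JSON. For a plain text file with one key per line, a variant worth trying (a sketch, assuming the default profile and endpoint) is:

    aws s3api list-objects-v2 --bucket <<bucket-name>> \
    --query 'Contents[].[Key]' --output text > objects_in_bucket.txt

    The CLI paginates list calls by default, so this should also cover buckets with far more than 1000 objects.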
    