Amazon S3: creating unique keys for every object

Backend · Open · 2 answers · 1889 views

死守一世寂寞 2021-01-18 01:30

My app's users upload their files to a single bucket. How can I ensure that each object in my S3 bucket has a unique key, so that objects are never overwritten?


2 Answers
  • 2021-01-18 01:57

    Checking for a file with that name before uploading it would work.

    If the file already exists, re-randomize the file name, and try again.
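A minimal sketch of that check-and-retry loop (function and variable names are hypothetical; an in-memory set stands in for the bucket here, where real code would check existence with an S3 call such as boto3's `head_object`):

```python
import random
import string

def unique_key(exists, filename, max_tries=10):
    """Return a key not currently in the bucket, re-randomizing on collision."""
    for _ in range(max_tries):
        # Prepend a random 8-character suffix to the original filename.
        suffix = "".join(random.choices(string.ascii_lowercase + string.digits, k=8))
        key = f"{suffix}-{filename}"
        if not exists(key):  # with boto3 this would be a head_object call
            return key
    raise RuntimeError("could not find a free key")

# Stand-in for the bucket: a set of keys that already exist.
taken = {"abc123zz-report.pdf"}
key = unique_key(lambda k: k in taken, "report.pdf")
```

Note that check-then-upload is not atomic: two clients can pass the check with the same key at the same instant, so a sufficiently random name is still doing most of the work.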

  • 2021-01-18 02:10

    Are you encrypting, or hashing? If you are using MD5 or SHA-1 hashes, an attacker could deliberately craft a colliding file and trip you up that way. If you are encrypting without a random initialization vector, an attacker might be able to deduce your key after uploading a few hundred files. Either way, encryption is probably not the best approach for this job: it is computationally expensive, difficult to implement correctly, and a safer mechanism is available with less effort.

    If you prepend a random string to each filename, drawn from a reasonably reliable source of entropy, you shouldn't have any collisions, but you should still check whether the file already exists. Coding a loop that checks (using S3::GetObject) and generates a new random string on a hit may seem like a lot of effort for something that will almost never run, but "almost never" still means it will happen eventually at scale.
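    As a sketch of the random-prefix idea (the function name is hypothetical): Python's `uuid.uuid4()` draws from the operating system's entropy source, which is the kind of reasonably reliable randomness this answer has in mind.

    ```python
    import uuid

    def prefixed_key(filename):
        # uuid4 uses os.urandom under the hood; 122 bits of randomness
        # makes an accidental collision between two uploads vanishingly rare.
        return f"{uuid.uuid4().hex}/{filename}"

    a = prefixed_key("photo.jpg")
    b = prefixed_key("photo.jpg")
    ```

    Using `/` as the separator also gives each upload its own "folder" in the S3 console, which keeps same-named files from different users visually distinct.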
