I have a Lambda function to copy objects from bucket 'A' to bucket 'B', and everything was working fine until an object named 'New Text Document.txt' was created.
HttpUtility in ASP.NET has UrlDecode. The sample is below.
HttpUtility.UrlDecode(s3entity.Object.Key, Encoding.UTF8)
Since we are sharing solutions for other runtimes, here is how to do it in Node.js:
const srcKey = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, " "));
This is from the AWS docs.
What I have done to fix this is:
java.net.URLDecoder.decode(b.getS3().getObject().getKey(), "UTF-8")
{
  "Records": [
    {
      "s3": {
        "object": {
          "key": "New+Text+Document.txt"
        }
      }
    }
  ]
}
So now the JSON value "New+Text+Document.txt" is correctly converted to "New Text Document.txt".
This has fixed my issue. Please let me know whether this is a correct solution, and whether there are any corner cases that could break my implementation.
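For reference, here is a minimal sketch of how that decode can sit inside a full handler. It assumes a Java 11+ Lambda runtime, the aws-lambda-java-events S3Event model, and the v1 AWS SDK for Java; the class name and the destination bucket name are illustrative placeholders, not from the original question.

import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class CopyObjectHandler implements RequestHandler<S3Event, String> {
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    @Override
    public String handleRequest(S3Event event, Context context) {
        event.getRecords().forEach(record -> {
            String sourceBucket = record.getS3().getBucket().getName();
            // The notification delivers the key URL-encoded (e.g. "New+Text+Document.txt"),
            // so decode it before calling the S3 API.
            String key = URLDecoder.decode(record.getS3().getObject().getKey(),
                    StandardCharsets.UTF_8);
            // "destination-bucket" is a placeholder for bucket 'B'.
            s3.copyObject(sourceBucket, key, "destination-bucket", key);
        });
        return "OK";
    }
}

Regarding corner cases: since the key in the notification is URL-encoded, a literal '+' in an object name should arrive as %2B, so decoding '+' to a space here should not clash with real plus signs.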
I think in Java you should use the
getS3().getObject().getUrlDecodedKey()
method, which returns the decoded key, instead of
getS3().getObject().getKey()
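If your SDK version has that getter, the manual decode from the earlier answer collapses to a one-liner; a quick sketch, reusing the same record variable b from that answer:

// b is the S3 event notification record, as in the earlier answer.
String rawKey     = b.getS3().getObject().getKey();            // "New+Text+Document.txt"
String decodedKey = b.getS3().getObject().getUrlDecodedKey();  // "New Text Document.txt"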
Agree with Scott. For me, the create-object event was encoding the colon ':' as '%3A', so I had to replace it twice to get the correct S3 URL.
Python code:

import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    logger.info('Event: %s' % json.dumps(event))
    source_bucket = event['Records'][0]['s3']['bucket']['name']
    key_old = event['Records'][0]['s3']['object']['key']
    # The event encodes ':' as '%3A'; undo it with the two replace steps below.
    key_new = key_old.replace('%3', ':')
    key = key_new.replace(':A', ':')
    logger.info('key value')
    logger.info(key)
I was facing the same issue with special characters, as the AWS S3 event URL-encodes them in the object key. To resolve this I used the AWS SDK's decode API, "SdkHttpUtils.urlDecode(String key)", to decode the object key, and it worked as expected.
You can check the link below for more details about the SdkHttpUtils API.
https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/util/SdkHttpUtils.html#urlDecode-java.lang.String-
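A minimal sketch of that call, assuming the v1 AWS SDK for Java is on the classpath; the record variable is illustrative and stands for the S3 event notification record, as in the earlier Java answers:

import com.amazonaws.util.SdkHttpUtils;

// The key arrives URL-encoded in the notification record.
String encodedKey = record.getS3().getObject().getKey();   // e.g. "New+Text+Document.txt"
String decodedKey = SdkHttpUtils.urlDecode(encodedKey);    // "New Text Document.txt"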