Trigger S3 create event

半阙折子戏 2021-02-12 17:45

I use S3 Create events to trigger AWS Lambdas. If my processing fails I want to do some magic and then trigger the "event" again to start my processing once more. So far the o

3 Answers
  • 2021-02-12 18:16

    It is not possible to have an S3 Event trigger again without uploading the file again. However, for a failed processing event if you are using Lambda it will automatically be retried 3 times per the FAQ:

    For Amazon S3 bucket notifications and custom events, AWS Lambda will attempt execution of your function three times in the event of an error condition in your code or if you exceed a service or resource limit.

    If your processing is failing and you want more control over the retry, you could instead use SQS to receive the S3 Events. That way your application reads messages off the queue, and if the processing fails or dies, the visibility timeout will eventually be reached and the SQS message will be delivered again. This way you can retry indefinitely and also control the visibility-timeout period between successive retries.

    If you are using Lambda and want to use SQS in combination, this is still possible: schedule a Lambda function to run every 5 minutes and have that function read messages off the queue. Combined with the 5-minute limit on a Lambda function's run time, this lets you nearly continuously consume messages from an SQS queue.
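    The retry loop described above can be sketched as follows. This is a minimal sketch, not the answerer's actual code: the queue URL and `process_object()` are hypothetical placeholders, and the boto3 calls only run inside the handler. The key point is that `delete_message` is called only after successful processing, so a failed message reappears once its visibility timeout expires.

```python
import json

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/s3-events"  # hypothetical


def parse_s3_records(body: str):
    """Extract (bucket, key) pairs from an S3 event notification body."""
    event = json.loads(body)
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]


def process_object(bucket: str, key: str):
    """Hypothetical processing step; replace with your own logic."""
    raise NotImplementedError


def handler(event, context):
    import boto3
    sqs = boto3.client("sqs")
    while True:
        resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                                   MaxNumberOfMessages=10,
                                   WaitTimeSeconds=5)
        messages = resp.get("Messages", [])
        if not messages:
            break
        for msg in messages:
            for bucket, key in parse_s3_records(msg["Body"]):
                process_object(bucket, key)
            # Delete only after success; if processing raised, the message
            # becomes visible again after the visibility timeout and is retried.
            sqs.delete_message(QueueUrl=QUEUE_URL,
                               ReceiptHandle=msg["ReceiptHandle"])
```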

  • 2021-02-12 18:20

    One method not mentioned here is that you can "touch" the metadata of the S3 object and it will trigger an event. This way you can get the event message without having to modify or fiddle with the original object data.

    Note: the data in the metadata fields does not have to change to trigger the event.

    Some strategies here:

    • Use a common metadata tag that can be used for triggering the event
    • Get the metadata dictionary first, then post it back with the same data
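    A minimal sketch of the second strategy, assuming boto3 and a bucket whose notification configuration matches `s3:ObjectCreated:Copy` (or `s3:ObjectCreated:*`); bucket and key names are placeholders. Copying an object onto itself with `MetadataDirective="REPLACE"` rewrites it in place, re-posting the same metadata, which fires a new ObjectCreated event:

```python
def build_touch_request(bucket: str, key: str, metadata: dict) -> dict:
    """Build copy_object kwargs that copy the object onto itself,
    re-posting its existing metadata unchanged."""
    return {
        "Bucket": bucket,
        "Key": key,
        "CopySource": {"Bucket": bucket, "Key": key},
        "Metadata": metadata,            # same data is fine; it need not change
        "MetadataDirective": "REPLACE",  # required when copying onto itself
    }


def touch_object(bucket: str, key: str):
    import boto3
    s3 = boto3.client("s3")
    head = s3.head_object(Bucket=bucket, Key=key)  # fetch current metadata first
    s3.copy_object(**build_touch_request(bucket, key, head.get("Metadata", {})))
```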
  • 2021-02-12 18:21

    I came across a similar situation today where I needed to re-trigger a Lambda function after the file was already in S3. A co-worker of mine came up with the following, which worked for us:

    1. Install the AWS cli tool
    2. Execute something like so:

      aws lambda invoke \
          --function-name <lambda function name> \
          --payload '{
              "Records":[{
                  "s3":{
                      "bucket":{
                          "name":"<bucket name>"
                      },
                      "object":{
                          "key": "<key name>"
                      }
                  }
              }]
          }' outfile
      
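    The same manual re-trigger can be sketched with boto3 instead of the CLI; function, bucket, and key names are placeholders for your own values. (If you use AWS CLI v2 for the command above, you may also need `--cli-binary-format raw-in-base64-out` so the JSON payload is sent as-is.)

```python
import json


def s3_test_event(bucket: str, key: str) -> str:
    """Minimal S3 event payload matching what the CLI example sends."""
    return json.dumps({
        "Records": [{"s3": {"bucket": {"name": bucket},
                            "object": {"key": key}}}]
    })


def reinvoke(function_name: str, bucket: str, key: str):
    import boto3
    client = boto3.client("lambda")
    return client.invoke(FunctionName=function_name,
                         Payload=s3_test_event(bucket, key))
```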