Is there a way for a Lambda function to be triggered by multiple S3 buckets?

Submitted by 我是研究僧i on 2021-02-08 15:13:26

Question


I'm trying to create a Lambda function that will be triggered by any change made to any bucket in the S3 console. Is there a way to tie all create events from every bucket in S3 to my Lambda function?

It appears that in the creation of a Lambda function, you can only select one S3 bucket. Is there a way to do this programmatically, if not in the Lambda console?


Answer 1:


There is at least one way: you can set up an S3 event notification for each bucket you want to monitor, all pointing to a single SQS queue.
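As a sketch of this setup (the queue ARN and bucket names below are hypothetical), each bucket gets the same notification configuration targeting one queue; the actual `put_bucket_notification_configuration` call is shown in a comment since it requires AWS credentials:

```python
def sqs_notification_config(queue_arn):
    """Build an S3 notification configuration that routes all
    object-created events to a single SQS queue."""
    return {
        "QueueConfigurations": [
            {
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    }

# Hypothetical buckets and queue; in practice you would loop and call:
#   boto3.client("s3").put_bucket_notification_configuration(
#       Bucket=bucket, NotificationConfiguration=cfg)
buckets = ["bucket-a", "bucket-b"]
cfg = sqs_notification_config("arn:aws:sqs:us-east-1:123456789012:s3-events")
```

Note the queue's access policy must also allow S3 to send messages to it, which is omitted here.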

That SQS queue can then be the event source for your lambda function.
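When SQS is the event source, each SQS record's body is itself a JSON-encoded S3 event notification, so the handler has two layers of records to unwrap. A minimal sketch (the return value is just for illustration; a real handler would process each object):

```python
import json

def handler(event, context):
    """Lambda handler for an SQS event source. Each SQS record body
    contains an S3 event notification with its own Records list."""
    objects = []
    for sqs_record in event.get("Records", []):
        body = json.loads(sqs_record["body"])
        for s3_record in body.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            objects.append((bucket, key))
    return objects
```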




Answer 2:


If you are uploading to S3 through an AWS SDK, one workaround is to set up an API Gateway endpoint that triggers the Lambda whenever an upload to S3 succeeds, passing the bucket name and object key to the Lambda. You can also specify the destination bucket dynamically.

This is also helpful with nested prefixes, e.g. bucket/users/avatars/user1.jpg and bucket/users/avatars/thumbnails/user1-thumb.jpg.
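A sketch of what that Lambda behind API Gateway might look like, assuming a proxy integration and a client-chosen request body of `{"bucket": ..., "key": ...}` (the request contract and the thumbnail naming scheme are assumptions, not part of the original answer):

```python
import json
import posixpath

def handler(event, context):
    """Hypothetical API Gateway (proxy integration) handler: the uploader
    POSTs the bucket name and object key after a successful S3 upload."""
    payload = json.loads(event["body"])
    bucket = payload["bucket"]
    key = payload["key"]

    # Example of working with nested prefixes: derive a thumbnail key
    # under a thumbnails/ sub-prefix next to the original object.
    prefix, name = posixpath.split(key)
    stem, ext = posixpath.splitext(name)
    thumb_key = posixpath.join(prefix, "thumbnails", f"{stem}-thumb{ext}")

    return {
        "statusCode": 200,
        "body": json.dumps({"bucket": bucket, "thumbnail_key": thumb_key}),
    }
```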




Answer 3:


Yes, you can. Assuming you only want to trigger the Lambda when new objects are created in a few buckets, you can configure this via the AWS Console, the CLI, boto3, or another SDK.

If new buckets are created over time and you also want to add them as event sources for the Lambda, you can create a CloudTrail API event rule that triggers another Lambda to programmatically add these new buckets as event sources for the original Lambda.
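A sketch of the watcher Lambda's core logic, assuming an EventBridge rule on CloudTrail's CreateBucket API call (the event shape below is an assumption; the actual `put_bucket_notification_configuration` call needs AWS credentials and is left as a comment):

```python
def lambda_notification_config(function_arn):
    """Build an S3 notification configuration that routes
    object-created events to the original Lambda."""
    return {
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": function_arn,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    }

def bucket_from_cloudtrail_event(event):
    """Extract the new bucket's name from an EventBridge event carrying
    CloudTrail's CreateBucket API call (assumed event shape)."""
    detail = event.get("detail", {})
    if detail.get("eventName") != "CreateBucket":
        return None
    return detail["requestParameters"]["bucketName"]

# The watcher Lambda would then wire up the new bucket with:
#   boto3.client("s3").put_bucket_notification_configuration(
#       Bucket=new_bucket,
#       NotificationConfiguration=lambda_notification_config(fn_arn))
```

S3 must also be granted permission to invoke the target Lambda (via `lambda:AddPermission`), which is omitted here.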



Source: https://stackoverflow.com/questions/54556599/is-there-a-way-for-a-lambda-function-to-be-triggered-by-multiple-s3-buckets
