Reading files triggered by an S3 event

情深已故 · 2021-02-04 07:26

Here is what I want to do:

  1. A user uploads a CSV file to an AWS S3 bucket.
  2. When the file is uploaded, the S3 bucket invokes the Lambda function that I have created.
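For reference, the event Lambda receives in step 2 looks roughly like this (a trimmed sketch of the S3 notification payload; the field names match the S3 event format, but the bucket name and key here are made up):

```python
# Minimal sketch of an S3 "ObjectCreated" event as delivered to Lambda.
# Only the fields needed to locate the uploaded file are shown; the real
# payload carries more metadata (eventTime, awsRegion, requestParameters, ...).
sample_event = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "my-upload-bucket"},   # hypothetical bucket
                "object": {"key": "incoming/users.csv"},  # hypothetical key
            },
        }
    ]
}

# A handler typically digs the bucket and key out like this:
bucket_name = sample_event["Records"][0]["s3"]["bucket"]["name"]
file_key = sample_event["Records"][0]["s3"]["object"]["key"]
print(bucket_name, file_key)
```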
1 Answer
  • 2021-02-04 08:04

    I suggest also adding CloudWatch access to your IAM policy. Your Lambda function is not actually returning anything, but you can see its log output in CloudWatch. Once you have set up a logger, I strongly recommend using logger.info(message) instead of print.
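A minimal IAM statement granting that CloudWatch Logs access could look like this (these are the same actions as the managed AWSLambdaBasicExecutionRole policy; the Resource ARN is a placeholder you would normally scope to your function's log group):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        }
    ]
}
```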

    I hope that this helps to debug your function.

    Apart from the sending part, this is how I would rewrite it (just tested in the AWS console):

    import logging
    import urllib.parse

    import boto3

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    s3 = boto3.client('s3')

    def lambda_handler(event, context):
        email_content = ''

        # retrieve bucket name and file_key from the S3 event
        # (the object key arrives URL-encoded, e.g. spaces become '+', so decode it)
        bucket_name = event['Records'][0]['s3']['bucket']['name']
        file_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
        logger.info('Reading %s from %s', file_key, bucket_name)
        # get the object
        obj = s3.get_object(Bucket=bucket_name, Key=file_key)
        # get lines inside the csv
        lines = obj['Body'].read().split(b'\n')
        for r in lines:
            line = r.decode()
            logger.info(line)
            email_content = email_content + '\n' + line
        logger.info(email_content)
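The body-parsing part of the handler can be exercised locally without S3, by feeding it bytes shaped the way `obj['Body'].read()` returns them (the CSV content below is made up for illustration):

```python
# Simulate what obj['Body'].read() returns: the raw bytes of the CSV.
raw_body = b"name,email\nalice,alice@example.com\nbob,bob@example.com\n"

# Same accumulation logic as in the handler, minus the logging.
email_content = ''
for r in raw_body.split(b'\n'):
    line = r.decode()
    email_content = email_content + '\n' + line

print(email_content)
```

For real CSV files (quoted fields, embedded commas) the standard-library csv module is a safer parser than splitting on newlines.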
    