
SITUATION

I'm using a Lambda function that takes a CSV attachment from an incoming email and places it into what is, in effect, a sub-folder of an S3 bucket. This part of the Lambda works well; however, there are other user-defined functions (UDFs) which I need to execute, within the same Lambda function, to perform subsequent tasks.

CODE

    import boto3
    import urllib.parse
    
    import email
    import base64
    
    import math
    import pickle
    
    import numpy as np
    import pandas as pd
    
    import io 
    
    
    ###############################
    ###    GET THE ATTACHMENT   ###
    ###############################
    
    #s3 = boto3.client('s3')
    
    
    FILE_MIMETYPE = 'text/csv'
    #'application/octet-stream'
    
    # destination folder
    S3_OUTPUT_BUCKETNAME = 'my_bucket' 
    
    print('Loading function')
    
    s3 = boto3.client('s3')
    
    
    def lambda_handler(event, context):
    
        #source email bucket 
        inBucket = event['Records'][0]['s3']['bucket']['name']
        key = urllib.parse.quote(event['Records'][0]['s3']['object']['key'].encode('utf8'))
    
    
        try:
            response = s3.get_object(Bucket=inBucket, Key=key)
            msg = email.message_from_string(response['Body'].read().decode('utf-8'))   
    
        except Exception as e:
            print(e)
            print('Error retrieving object {} from source bucket {}. Verify existence and ensure bucket is in same region as function.'.format(key, inBucket))
            raise e
        
    
        attachment_list = []
       
    
        try:
            #scan each part of email 
            for message in msg.walk():
                
                # Check filename and email MIME type
                if  (message.get_content_type() == FILE_MIMETYPE and message.get_filename() != None):
                    attachment_list.append ({'original_msg_key':key, 'attachment_filename':message.get_filename(), 'body': base64.b64decode(message.get_payload()) })
        except Exception as e:
            print(e)
            print ('Error processing email for CSV attachments')
            raise e
        
        # if multiple attachments send all to bucket 
        for attachment in attachment_list:
    
            try:
                s3.put_object(
                    Bucket=S3_OUTPUT_BUCKETNAME,
                    Key='attachments/' + attachment['original_msg_key'] + '-' + attachment['attachment_filename'],
                    Body=attachment['body']
                )
            except Exception as e:
                print(e)
                print('Error sending object {} to destination bucket {}. Verify existence and ensure bucket is in same region as function.'.format(attachment['attachment_filename'], S3_OUTPUT_BUCKETNAME))
                raise e

    #################################
    ###    ADDITIONAL FUNCTIONS   ###
    #################################


    def my_function():
        print("Hello, this is another function")

OUTCOME

The CSV attachment is successfully retrieved and placed in the destination specified by s3.put_object; however, there is no evidence in the CloudWatch logs that my_function runs.

WHAT I HAVE TRIED

I've tried using def my_function(event, context): in an attempt to ascertain whether the function requires the same signature as the first function in order to be executed. I've also tried to include my_function() as part of the first function, but this does not appear to work either.

How can I ensure that both functions are executed within the Lambda?

You are not invoking your my_function at all. Just invoke it in your lambda_handler, at the end of the handler. - Marcin
The Lambda service invokes the Python function that you configured when you created/updated the Lambda function. It doesn't magically call 'all functions with a given argument signature'. - jarmod
I mean: add my_function() in your lambda_handler where you want to invoke your function. - Marcin
Write code inside your lambda_handler(...) function that calls my_function(), just like you would in regular Python. - jarmod
I can confirm that the function has been executed, due to the presence of the function output in the CloudWatch logs. - jimiclapton

1 Answer


Based on the comments:

The issue was that my_function was never called. The Lambda service only invokes the configured handler (here, lambda_handler); it does not automatically execute every function defined in the file.

The solution was to add a call to my_function() inside lambda_handler so that my_function is actually executed.
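A minimal sketch of the fix, with the attachment-processing body abbreviated; the only change that matters is the explicit call at the end of the handler:

```python
def my_function():
    print("Hello, this is another function")


def lambda_handler(event, context):
    # ... attachment-processing code from the question would go here ...

    # Lambda only invokes the configured handler, so any additional
    # function must be called explicitly from within it.
    my_function()
    return {"statusCode": 200}
```

With this in place, the "Hello, this is another function" output appears in the CloudWatch logs on every invocation of the handler.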