
I created a Lambda function that is supposed to upload data into a DynamoDB table when a file is uploaded to an S3 bucket. However, I get a "GetObject operation: permission denied" error in CloudWatch when a file is uploaded to the bucket. The Lambda function has an IAM role attached with these policies: AmazonlambdaFullAccess, AmazonS3FullAccess, AmazonCloudWatchLogsFullAccess, AmazonDynamoDBFullAccess. It has lambda.amazonaws.com as its trusted entity. The bucket has no policies attached.
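For reference, a role with lambda.amazonaws.com as its trusted entity typically carries a trust policy like the one below. This is a generic sketch with a hypothetical role name, not the exact configuration in use:

import json
import boto3

iam = boto3.client('iam')

# Standard trust policy allowing the Lambda service to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

# Hypothetical role name, for illustration only.
iam.create_role(
    RoleName='lambda-s3-dynamodb-role',
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)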

import boto3
import json
import urllib.parse

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('wireshark')
s3 = boto3.client('s3')
tests3 = boto3.resource('s3')

def lambda_handler(event, context):

    source_bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.quote_plus(event['Records'][0]['s3']['object']['key'])
    copy_source = {'Bucket': source_bucket, 'Key': key}
    print(event)
    print("Log stream name: ", context.log_stream_name)
    print("Log group name: ", context.log_group_name)
    print("Request Id: ", context.aws_request_id)
    print("Mem. limit (MB): ", context.memory_limit_in_mb)

    try:
        print("Using waiter to wait for object to persist through the S3 service")
        waiter = s3.get_waiter('object_exists')
        waiter.wait(Bucket=source_bucket, Key=key)
        print("Accessing the received file and reading it")
        bucket = tests3.Bucket('awslambdas3test2')
        obj = bucket.Object(key=key)
        response = obj.get()
        print("Response from file object")
        print(response)
    except Exception as e:
        print(e)
        raise

In CloudWatch: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied. I've been through the IAM Policy Simulator from AWS; this IAM role should be able to GetObject from any S3 bucket. Thank you for your help.
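For reference, the same check the Policy Simulator performs can be scripted with boto3. A rough sketch; the role ARN and object ARN below are placeholders:

import boto3

iam = boto3.client('iam')

# Placeholders: substitute the actual role ARN and object ARN.
response = iam.simulate_principal_policy(
    PolicySourceArn='arn:aws:iam::123456789012:role/my-lambda-role',
    ActionNames=['s3:GetObject'],
    ResourceArns=['arn:aws:s3:::awslambdas3test2/myfile.pcap']
)

for result in response['EvaluationResults']:
    # EvalDecision is 'allowed', 'explicitDeny', or 'implicitDeny'.
    print(result['EvalActionName'], result['EvalDecision'])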

Code mostly from GitHub.

Which specific line is generating the error? You might need to remove the try to find out. The line bucket = tests3.Bucket(u'awslambdas3test2') hard-codes the bucket name; it should really use source_bucket. Assuming that isn't the problem, you can try testing the function in the Lambda console using the Amazon S3 Put test event with your actual bucket and key. Also, there should be no need to use a waiter, since the object will be available when the function is called. – John Rotenstein
The line generating the error is response = obj.get(). It is not a problem with the file's name: I printed the variable 'key' and it prints the name of the file I want to access. I ran an Amazon S3 Put test and got a permission denied. As I stated above, the Lambda function has full access to S3 and the bucket doesn't have any policies, so I don't understand why permission is refused. – Baptise
It is possible that objects placed in an Amazon S3 bucket are not accessible by the bucket owner, especially if the object was copied from another account and retained the same permissions. Can you access the file via aws s3 cp using the AWS CLI? (See the ownership-check sketch after these comments.) – John Rotenstein
Yes, I've been able to copy the file from the bucket to my local storage with the AWS CLI. I've also tried creating another Lambda with another IAM role; it still does not work. – Baptise
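To test the object-ownership theory raised in the comments above, the object's ACL can be inspected directly. A minimal sketch, assuming placeholder bucket and key names:

import boto3

s3 = boto3.client('s3')

# Placeholders: substitute the actual bucket and key.
acl = s3.get_object_acl(Bucket='awslambdas3test2', Key='myfile.pcap')

# If the object was written by another account, the owner shown here
# may differ from the bucket owner, which can cause AccessDenied.
print('Owner:', acl['Owner'])
for grant in acl['Grants']:
    print(grant['Grantee'], grant['Permission'])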

1 Answer


Here is an AWS Lambda function that will print the contents of the file:

import os
import urllib.parse

import boto3

def lambda_handler(event, context):

    s3_client = boto3.client('s3')

    # For each record
    for record in event['Records']:

        # Get bucket and key (S3 event keys are URL-encoded, e.g. spaces become '+')
        bucket = record['s3']['bucket']['name']
        key    = urllib.parse.unquote_plus(record['s3']['object']['key'])

        # Print the bucket & key to the logs
        print(bucket, key)

        # Download object (use the base name so nested keys still map into /tmp)
        local_filename = os.path.join('/tmp', os.path.basename(key))
        s3_client.download_file(bucket, key, local_filename)

        # Print contents to log (just to demonstrate the concept)
        for line in open(local_filename):
            print(line)

        # Delete the file when done, to clear space for future executions
        os.remove(local_filename)
Create an Amazon S3 event on a bucket to trigger this Lambda function, and it will print the filename and the contents of the file to CloudWatch Logs. This should be a good test to determine whether the problem is with your code or with permissions.
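For completeness, the trigger can also be wired up programmatically. A sketch assuming a placeholder bucket name and function ARN, and including the invoke permission S3 needs before the notification is created:

import boto3

s3 = boto3.client('s3')
lambda_client = boto3.client('lambda')

bucket = 'awslambdas3test2'  # placeholder bucket name
function_arn = 'arn:aws:lambda:us-east-1:123456789012:function:print-s3-object'  # placeholder ARN

# S3 must be allowed to invoke the function before the notification is created.
lambda_client.add_permission(
    FunctionName=function_arn,
    StatementId='s3-invoke',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn=f'arn:aws:s3:::{bucket}'
)

# Fire the function for every object created in the bucket.
s3.put_bucket_notification_configuration(
    Bucket=bucket,
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': function_arn,
            'Events': ['s3:ObjectCreated:*']
        }]
    }
)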