I created a Lambda function that is supposed to upload data into a DynamoDB table when a file is uploaded to an S3 bucket. However, when a file is uploaded I get "GetObject operation: permission denied" in CloudWatch. The Lambda function has an IAM role attached with these policies: AmazonlambdaFullAccess, AmazonS3FullAccess, AmazonCloudWatchLogsFullAccess, AmazonDynamoDBFullAccess. It has lambda.amazonaws.com as its trusted entity. The bucket has no policies attached.
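For comparison with the broad managed policies above, a least-privilege role would only need s3:GetObject on the trigger bucket. A minimal sketch of such a policy statement, expressed as a Python dict (the bucket name is taken from the code below and is an assumption):

```python
import json

# Hypothetical least-privilege policy for the Lambda execution role:
# read-only access to objects in the trigger bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::awslambdas3test2/*",
        }
    ],
}
print(json.dumps(policy, indent=2))
```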
import boto3
import json
import urllib.parse

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('wireshark')
s3 = boto3.client('s3')
tests3 = boto3.resource('s3')

def lambda_handler(event, context):
    source_bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.quote_plus(event['Records'][0]['s3']['object']['key'])
    copy_source = {'Bucket': source_bucket, 'Key': key}
    print(event)
    # just print function
    print("Log stream name : ", context.log_stream_name)
    print("Log group name : ", context.log_group_name)
    print("Request Id:", context.aws_request_id)
    print("Mem. limit(MB): ", context.memory_limit_in_mb)
    try:
        print("Using waiter to wait for object to persist through the S3 service")
        waiter = s3.get_waiter('object_exists')
        waiter.wait(Bucket=source_bucket, Key=key)
        print("Accessing the received file and reading it")
        bucket = tests3.Bucket('awslambdas3test2')
        obj = bucket.Object(key=key)
        response = obj.get()
        print("response from file object")
        print(response)
    except Exception as err:
        print(err)
        raise
In CloudWatch: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied. I've been through the AWS policy simulator, and this IAM role should be able to call GetObject on any S3 bucket. The code is mostly from GitHub. Thank you for your help.
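A likely culprit in the code above: S3 event notifications deliver object keys URL-encoded, and the handler runs them through quote_plus, which encodes them a second time, so the handler asks for a key that matches no object (and when s3:ListBucket is not effective, S3 reports a missing key as Access Denied rather than Not Found). A minimal sketch with a made-up object name:

```python
import urllib.parse

raw_key = 'my capture file.pcap'              # object name as stored in S3
event_key = urllib.parse.quote_plus(raw_key)  # roughly how it appears in the event
print(event_key)                              # my+capture+file.pcap

# Re-encoding (as the handler does) yields a key that matches no object:
print(urllib.parse.quote_plus(event_key))     # my%2Bcapture%2Bfile.pcap

# Decoding recovers the real key:
print(urllib.parse.unquote_plus(event_key))   # my capture file.pcap
```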
Comments:
The line bucket = tests3.Bucket(u'awslambdas3test2') is hard-coding the bucket name. It should really use source_bucket. Assuming that isn't the problem, you can try testing the function in the Lambda console by using the Amazon S3 Put test event with your actual bucket & key. Also, there should be no need to use a waiter, since the object will be available when the function is called. – John Rotenstein
… aws s3 cp using the AWS CLI? – John Rotenstein
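A hedged sketch of the fixes suggested above: read from source_bucket instead of the hard-coded name, and decode the event's key with unquote_plus rather than re-encoding it with quote_plus. The event-parsing part is factored out so it can be checked against a trimmed-down S3 Put test event without AWS access (bucket and key below are placeholders):

```python
import urllib.parse

def extract_source(event):
    """Return (bucket, key) from an S3 Put event record.

    S3 delivers object keys URL-encoded; they must be decoded with
    unquote_plus (quote_plus would encode them a second time).
    """
    record = event['Records'][0]['s3']
    bucket = record['bucket']['name']  # the event's bucket, not a hard-coded name
    key = urllib.parse.unquote_plus(record['object']['key'])
    return bucket, key

def lambda_handler(event, context):
    import boto3  # imported here so extract_source stays testable without boto3
    bucket, key = extract_source(event)
    obj = boto3.resource('s3').Bucket(bucket).Object(key=key)
    return obj.get()

# Trimmed-down S3 Put test event with placeholder values:
sample_event = {'Records': [{'s3': {'bucket': {'name': 'awslambdas3test2'},
                                    'object': {'key': 'captures/my+capture.pcap'}}}]}
print(extract_source(sample_event))  # ('awslambdas3test2', 'captures/my capture.pcap')
```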