5
votes

I have some Python that can request a presigned POST URL to upload an object into an S3 bucket. It works when run locally under my IAM user (which has admin privileges), and I can upload objects to the bucket using Postman and cURL. However, when I run the same code in Lambda, the upload fails with "The AWS Access Key Id you provided does not exist in our records.".

The only differences are that the Lambda function runs without admin rights (though it does have a policy that allows it to run any S3 action on the bucket) and uses a different (older) version of Boto3.

This is the code I'm trying to use: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-presigned-urls.html#generating-a-presigned-url-to-upload-a-file

I've tried to use the details returned from the Lambda function in exactly the same way as I'm using the details returned locally, but the Lambda details don't work.

6
So, your Lambda function is configured with an IAM role whose policy includes s3:* on the bucket in question. Can your Lambda function list objects in that bucket, or get an object from that bucket (or do those also fail with an error)? - jarmod
Correct, my Lambda function is configured to use an IAM role with a policy that includes s3:* on the bucket I’m trying to generate a presigned URL for. I’ve just tested it and it can list all the objects in that bucket too. - Brewmeister
Any chance that the pre-signed URL that your client is ultimately using for the POST was corrupted? - jarmod
Did you find a solution? I've run into the same issue with Python/.NET Lambdas; locally they all work perfectly... - Pavlo Datsiuk

6 Answers

3
votes

Here is a working solution for AWS Lambda:

  1. Attach the AmazonS3FullAccess policy

  2. Do not use a multipart/form-data upload

  3. Configure CORS on the S3 bucket

  4. Use the following Python code:

     import boto3

     def lambda_handler(event, context):

         s3 = boto3.client('s3')

         upload_key = 'myfile.pdf'
         download_key = 'myfile.pdf'

         bucket = 'mys3storage'

         # Generate the presigned URL for download
         presigned_download_url = s3.generate_presigned_url(
             ClientMethod='get_object',
             Params={
                 'Bucket': bucket,
                 'Key': download_key
             },
             ExpiresIn=3600
         )

         # Generate the presigned URL for upload
         presigned_upload_url = s3.generate_presigned_url(
             ClientMethod='put_object',
             Params={
                 'Bucket': bucket,
                 'Key': upload_key,
                 'ContentType': 'application/pdf'
             },
             ExpiresIn=3600
         )

         # Return both URLs
         return {
             "upload_url": presigned_upload_url,
             "download_url": presigned_download_url
         }
    
2
votes

You need to POST the x-amz-security-token value as well when the credentials come from a role.

1
votes

I had the same issue and it was driving me crazy. Locally everything went smoothly, but once deployed to Lambda I got a 403 whether I used create_presigned_post or create_presigned_url.

It turned out the role the Lambda was using was different from the one my local AWS user has. (The Lambda role was created automatically by AWS SAM in my case.) After granting the Lambda role S3 permissions, the error was resolved.

0
votes

Good question. You didn't describe how you are getting credentials to your Lambda function. Your code, specifically this:

s3_client = boto3.client('s3')

expects to find default credentials, for example via the ~/.aws/credentials file. You won't (nor should you) have that file in your Lambda execution environment, but you probably have it in your local environment. I suspect your credentials are not reaching the Lambda function at all.

There are two options for getting credentials in place in Lambda:

  1. Don't use explicit credentials; instead give the Lambda function an IAM role that grants the required S3 access. If you do this, you won't need credentials at all. This is the best practice.
  2. Set the credentials as environment variables for your Lambda function. You can define AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY directly, and the code above will pick them up automatically.
0
votes

The official Python tutorial for this does not mention x-amz-security-token in the context of Lambda functions, but it needs to be included as a form field when uploading a file to S3. To recap: when using Lambda, make sure the role attached to the function has S3 access, and that the extra form field with the x-amz-security-token value is present.

    <form action="URL HERE" method="post" enctype="multipart/form-data">
      <input type="hidden" name="key" value="KEY HERE" />
      <input type="hidden" name="AWSAccessKeyId" value="ACCESS KEY HERE" />

      <!-- ADD THIS ONE -->
      <input type="hidden" name="x-amz-security-token" value="SECURITY TOKEN HERE" />
      <!-- ADD THIS ONE -->

      <input type="hidden" name="policy" value="POLICY HERE" />
      <input type="hidden" name="signature" value="SIGNATURE HERE" />
      File:
      <input type="file"   name="file" /> <br />
      <input type="submit" name="submit" value="Upload to Amazon S3" />
    </form>
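The same submission can be sketched in Python. All values below are placeholders standing in for the dict returned by `create_presigned_post`; the point is that every field S3 returned is forwarded unchanged:

```python
# Sketch of the client side, with a placeholder response dict so it
# runs offline. With the real dict you would POST it, e.g. with
# requests:
#   requests.post(post['url'], data=post['fields'],
#                 files={'file': open('myfile.pdf', 'rb')})
post = {
    'url': 'https://mybucket.s3.amazonaws.com/',  # placeholder
    'fields': {
        'key': 'uploads/myfile.pdf',
        'AWSAccessKeyId': 'AKIAEXAMPLE',
        'x-amz-security-token': 'example-session-token',
        'policy': 'BASE64-POLICY-HERE',
        'signature': 'SIGNATURE-HERE',
    },
}

# Forward every field unchanged; dropping x-amz-security-token is
# exactly what produces the "Access Key Id ... does not exist" error.
form_data = dict(post['fields'])
print(sorted(form_data))
```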
-1
votes

You can try the code below to generate a presigned URL for an object:

import logging
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
bucket = 'test1'
download_key = 'path/to/Object.txt'

def lambda_handler(event, context):
    try:
        # Presigned GET URL, valid for one hour
        url = s3.generate_presigned_url(
            'get_object',
            Params={'Bucket': bucket, 'Key': download_key},
            ExpiresIn=3600
        )
    except ClientError as e:
        logging.error(e)
        return None
    print(url)
    return {
        'url': url
    }