
I've created a user with a policy having full access to S3: [screenshot of the IAM policy]

When I set my credentials (~/.aws/credentials) and try to push a file to my bucket using boto3, it returns: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied

Any idea why this is happening?

I've even tried generating an access key ID / secret access key for the root account, and it returns the same error. I've run my code with another account's credentials and it works fine, so the issue is with the account itself.

EDIT:

I verified that the caller's account is the same as mine using boto3.client('sts').get_caller_identity().get('Account').

I don't have a policy set for my bucket, and these are its permissions: [screenshot of the bucket's permissions]

This is a snippet of my class:

from boto3 import client
from botocore.exceptions import ClientError
from boto3.exceptions import S3UploadFailedError


class AmazonS3(object):
    s3 = client('s3')

    @classmethod
    def upload_image(cls, bucket_name, object_name, file_content):
        extra_args = {"ACL": "public-read",
                      "ContentType": "image/jpeg" if object_name.split('.')[-1] in ['jpg', 'jpeg'] else "image/png",
                      "ContentDisposition": "inline",
                      "ContentEncoding": "base64"}

        try:
            cls.s3.put_object(Body=file_content, Bucket=bucket_name, Key=object_name, **extra_args)
        except (ClientError, S3UploadFailedError) as e:
            # Chain the original exception so the real cause (e.g. AccessDenied) isn't lost
            raise Exception('There was an error when uploading the image') from e
Try aws sts get-caller-identity and ensure the call is made by the right principal. Is there a bucket policy on the bucket? Please include any policies you set up in JSON form. – Maurice
Does it work if you try to copy the file using the AWS CLI? – John Rotenstein
@JohnRotenstein Yes – Mehdi Khlifi
If it works from the AWS CLI but not from boto3, and both of those are using the same credentials, then the error probably lies in your Python code. Can you include a minimally-reproducible example in your question? – John Rotenstein
@JohnRotenstein I've edited my question. I should note that this code has been working for months in another environment; I test any credentials locally by setting them in my docker-compose file. Old credentials work, new ones unfortunately don't. But I've tried aws s3 cp inside the container and it's working. – Mehdi Khlifi

1 Answer


I found the issue. In my code I'm getting the bucket name from an environment variable, like this: os.getenv('AWS_BUCKET_NAME', 'X').

In my deployment I set BUCKET_NAME='Y' as an environment variable, but the code reads AWS_BUCKET_NAME, which was never set. So listing my S3 buckets showed 'Y' as expected, yet the upload silently fell back to the default 'X', a bucket I don't have access to, which explains why it failed.
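This pitfall can be reproduced without touching AWS at all: a silent default in os.getenv hides a misnamed variable. A minimal sketch (resolve_bucket_name and the 'X'/'Y' values are illustrative placeholders, not the real names):

```python
import os

def resolve_bucket_name(default='X'):
    # Falls back silently to the default when AWS_BUCKET_NAME is unset --
    # exactly the failure mode described above.
    return os.getenv('AWS_BUCKET_NAME', default)

os.environ.pop('AWS_BUCKET_NAME', None)
os.environ['BUCKET_NAME'] = 'Y'        # the variable actually set in the deployment
print(resolve_bucket_name())           # 'X': the names don't match, default wins
```

Requiring the variable (os.environ['AWS_BUCKET_NAME'], which raises KeyError when unset) would have surfaced the misconfiguration at startup instead of at upload time.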

But the error An error occurred (AccessDenied) when calling the PutObject operation: Access Denied is misleading: it suggests I don't have the right permissions, when the real problem is that the bucket may not exist (or, since bucket names are global, may belong to someone else's account).
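One way to disambiguate the two cases is to probe the bucket with head_bucket before uploading: it generally raises a ClientError with code "404" when the bucket doesn't exist and "403" when it exists but isn't accessible. A minimal sketch (diagnose_s3_error is a hypothetical helper; it only inspects the .response dict that a botocore ClientError carries, so the logic itself runs without AWS):

```python
def diagnose_s3_error(error_response):
    """Map a botocore ClientError .response dict to a human-readable hint."""
    code = error_response['Error']['Code']
    if code in ('404', 'NoSuchBucket'):
        return 'bucket does not exist -- check the bucket name'
    if code in ('403', 'AccessDenied'):
        return 'bucket exists but access is denied -- check permissions and ownership'
    return 'unexpected error: ' + code

# Dicts shaped like ClientError.response, for illustration:
print(diagnose_s3_error({'Error': {'Code': '404'}}))
print(diagnose_s3_error({'Error': {'Code': 'AccessDenied'}}))
```

In real code this would wrap a call like client.head_bucket(Bucket=bucket_name) in a try/except ClientError block and pass e.response to the helper, turning the generic AccessDenied into an actionable message.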