0 votes

I have a file of around 16 MB in size and am using the Python boto3 upload_file API to upload it into an S3 bucket. However, I believe the API is internally choosing a multipart upload, and I get an "Anonymous users cannot initiate multipart upload" error.

In some runs of the application, the generated file may be much smaller (a few KB). What's the best way to handle this scenario in general, or to fix the error mentioned above?

I currently have a Django application that generates a file when run and uploads this file directly into an S3 bucket.
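For reference, the upload is essentially the following. Boto3's TransferConfig defaults to an 8 MB multipart threshold, which I assume is why a 16 MB file triggers a multipart upload (bucket and file names below are placeholders):

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Default multipart_threshold is 8 MB, so a 16 MB file is split into a
# multipart upload. Raising the threshold keeps the upload single-part.
config = TransferConfig(multipart_threshold=64 * 1024 * 1024)
s3.upload_file("generated_output.dat", "my-bucket", "generated_output.dat",
               Config=config)
```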

1
How is the Django application authenticating to AWS? Or is it simply using anonymous permissions granted via a Bucket Policy? How are you restricting the ability for anyone to upload to the bucket? Do you have a backend app that could generate a pre-signed URL? – John Rotenstein
It was anonymous permissions on the bucket. I know this is a very bad way to go about it! – krr

1 Answer

2 votes

OK, so unless you've opened your S3 bucket up for the world to upload to (which is very much NOT recommended), it sounds like you need to set up the permissions for access to your S3 bucket correctly.

How to do that will vary a little depending on how you're running this application, so let's cover a few options. In all cases you will need to do two things:

  • Associate your script with an IAM Principal (an IAM User or an IAM Role depending on where / how this script is being run).
  • Add permissions for that principal to access the bucket (this can be accomplished either through an IAM Policy attached to the principal or via the S3 Bucket Policy).

Lambda Function - You'll need to create an IAM Role for your application and attach it to your Lambda function as its execution role. Once that's configured, boto3 picks up the role's credentials transparently.
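A minimal sketch of what that looks like inside the handler, assuming the execution role already has upload permissions (bucket and paths are placeholders):

```python
import boto3

# Credentials come from the function's execution role; nothing is hard-coded.
s3 = boto3.client("s3")

def handler(event, context):
    # upload_file automatically uses the role's temporary credentials
    s3.upload_file("/tmp/report.csv", "my-bucket", "reports/report.csv")
```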

EC2 Instance or ECS Task - You'll need to create an IAM Role for your application and associate it with your EC2 instance (via an instance profile) or ECS task (via the task role). Boto3 will retrieve the role's temporary credentials from the instance/task metadata endpoint and use them automatically.
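If you want to confirm which principal boto3 has actually resolved on the instance or task, a quick sanity check is:

```python
import boto3

# On EC2/ECS, credentials are fetched from the metadata endpoint.
sts = boto3.client("sts")
print(sts.get_caller_identity()["Arn"])  # should print the attached role's ARN
```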

Local Workstation Script - If you're running this script from your local workstation, boto3 should be able to find and use the credentials you've set up for the AWS CLI. If those aren't the credentials you want to use, you'll need to generate an access key and secret access key for an appropriate IAM User (be careful how you secure these if you go this route, and definitely follow least privilege).
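A minimal sketch, assuming a named profile created with `aws configure` (profile, bucket, and file names are placeholders):

```python
import boto3

# Reads credentials for the named profile from ~/.aws/credentials.
session = boto3.Session(profile_name="uploader")
s3 = session.client("s3")
s3.upload_file("generated_output.dat", "my-bucket", "generated_output.dat")
```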

Now, once you've got your principal, you can either attach an IAM policy to the IAM User or Role that grants Allow permissions to upload to the bucket, or you can add a statement to the Bucket Policy that grants that IAM User or Role access. You only need to do one of these.
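As a rough illustration of the IAM-policy route, here's a sketch that attaches an inline policy granting s3:PutObject to a role. Role, policy, and bucket names are placeholders, and you can do the same thing through the console:

```python
import json

import boto3

# Minimal policy allowing uploads to one bucket (least privilege).
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::my-bucket/*",
        }
    ],
}

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName="my-app-role",
    PolicyName="AllowUploadsToMyBucket",
    PolicyDocument=json.dumps(policy),
)
```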

Multipart uploads are authorized by the same s3:PutObject permission as single-part uploads (though if your files are small I'd be surprised if multipart were being used for them). If you're using SSE-KMS, one small trick to be aware of is that the uploader needs permission on the KMS key both to encrypt (kms:GenerateDataKey) and to decrypt (kms:Decrypt) when performing a multipart upload.
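For completeness, a sketch of a multipart-capable upload to an SSE-KMS encrypted bucket; the key alias, bucket, and file names are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# For multipart uploads with SSE-KMS, the caller needs kms:GenerateDataKey
# and kms:Decrypt on this key in addition to s3:PutObject on the bucket.
s3.upload_file(
    "generated_output.dat",
    "my-bucket",
    "generated_output.dat",
    ExtraArgs={
        "ServerSideEncryption": "aws:kms",
        "SSEKMSKeyId": "alias/my-upload-key",
    },
)
```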