1 vote

The following resolved issues allow me to unload, copy, run queries, create tables, etc., in Redshift: Redshift create table not working via Python and Unload to S3 with Python using IAM Role credentials. Note that there is no dependency on Boto3, even though I am successfully writing to and copying from S3 via Redshift.

I would like to be able to upload a file to S3 dynamically in Python (from the current working directory). However, I can't find documentation or examples of how to do this using iam_role 'arn:aws:iam::<aws-account-id>:role/<role_name>' rather than access and secret keys as described at http://boto3.readthedocs.io/en/latest/guide/quickstart.html.

Any help is greatly appreciated. This is what I have right now, and it throws an "Unable to locate credentials" error:

import boto3

# Input parameters for the S3 bucket and object key
bucket_name = ''
bucket_key = ''
filename_for_csv = 'output.csv'

# Upload the local CSV file to S3 with server-side encryption
s3 = boto3.resource('s3')
with open(filename_for_csv, 'rb') as data:
    s3.Bucket(bucket_name).put_object(Key=bucket_key, Body=data, ServerSideEncryption='AES256')

2 Answers

2 votes

You will need AWS IAM Access Keys.

The issue for you is that you need access keys in order to call STS (Security Token Service), which can then process AssumeRole() with your role ARN and generate new temporary access keys.
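
For reference, this is roughly what that flow looks like with boto3 (a minimal sketch; the role ARN and session name are placeholders, and the STS call itself still has to be signed with existing credentials):

import boto3

# Calling STS requires existing credentials (access keys) to sign the request.
sts = boto3.client('sts')

# AssumeRole returns temporary credentials scoped to the role's permissions.
response = sts.assume_role(
    RoleArn='arn:aws:iam::<aws-account-id>:role/<role_name>',  # placeholder ARN
    RoleSessionName='s3-upload-session'                        # hypothetical session name
)

creds = response['Credentials']
s3 = boto3.client(
    's3',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken']
)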

However, if you have access keys then you do not need to use AssumeRole().

If your machine is outside of AWS, then you will need to use access keys or an authentication/authorization service like Cognito.

IAM roles are designed for services, such as Redshift and EC2, which have permission to call STS with your role ARN to generate new temporary access keys. Roles are not designed to be used from outside of AWS (there are exceptions, such as Cognito).

[Edit after new comment]

You have several solutions:

  • Signed URLs. Assign the role to EC2. Then have EC2 create signed URLs that you can use locally to upload files to S3. This keeps the access keys off your system (see the sketch after this list).
  • Use Cognito. Cognito is easy to work with and there are lots of code examples on the Internet. Cognito will provide authentication, authorization and temporary credentials for you.
  • Assign your role to EC2 so that EC2 can upload to S3. Then you have the issue of getting the file to EC2 and paying for the extra bandwidth (EC2 -> S3). You can use SSH and SCP to copy files securely to EC2 and then launch a process to copy to S3.
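
As a rough illustration of the first option, the EC2 instance (which has the role attached) could generate a presigned PUT URL and hand it to the local machine. This is only a sketch; the bucket name, key, and file name are placeholders:

import boto3

# Runs on the EC2 instance that has the IAM role attached.
s3 = boto3.client('s3')

# Generate a URL that allows a single PUT to this key for one hour.
url = s3.generate_presigned_url(
    'put_object',
    Params={'Bucket': 'examplebucket', 'Key': 'output.csv'},  # placeholder names
    ExpiresIn=3600
)

# On the local machine, upload using that URL (requests is an extra dependency).
import requests
with open('output.csv', 'rb') as f:
    requests.put(url, data=f)
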
1 vote

If you are running this script from an EC2 instance, attach an IAM role to the instance. The IAM role should contain the following policy (in addition to what you already have).

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "statement1",
            "Effect": "Allow",
            "Action":   ["s3:PutObject"],
            "Resource": "arn:aws:s3:::examplebucket/*"
        }
    ]
}

If you are not running this script on an EC2 instance, you need to use the access and secret keys.
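
In that case, the keys can be supplied to boto3 explicitly. A minimal sketch, assuming placeholder key values (in practice they are better kept in ~/.aws/credentials or environment variables than hard-coded):

import boto3

# Placeholder credentials; prefer ~/.aws/credentials or environment variables.
s3 = boto3.resource(
    's3',
    aws_access_key_id='<access-key-id>',
    aws_secret_access_key='<secret-access-key>'
)

with open('output.csv', 'rb') as data:
    s3.Bucket('examplebucket').put_object(Key='output.csv', Body=data)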