The following resolved issues allow me to unload, copy, run queries, create tables, etc. in Redshift: Redshift create table not working via Python and Unload to S3 with Python using IAM Role credentials. Note that there is no dependency on Boto3, even though I am successfully writing to and copying from S3 via Redshift.
I would like to be able to upload a file to S3 dynamically from Python (from the current working directory). However, I can't find documentation or examples of how to do this using iam_role 'arn:aws:iam::<aws-account-id>:role/<role_name>'
rather than access and secret keys, as described at http://boto3.readthedocs.io/en/latest/guide/quickstart.html.
Any help is greatly appreciated. This is what I have right now, and it throws an Unable to locate credentials error:
import boto3

# Input parameters for the S3 bucket and object key
bucket_name = ''
bucket_key = ''
filename_for_csv = 'output.csv'

# Moving the file to S3 with server-side encryption
s3 = boto3.resource('s3')
with open(filename_for_csv, 'rb') as data:
    s3.Bucket(bucket_name).put_object(Key=bucket_key, Body=data, ServerSideEncryption='AES256')
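
For reference, the closest thing I can imagine is assuming the role explicitly through STS and handing the temporary credentials to boto3, along the lines of the sketch below. The role ARN, session name, bucket and key are placeholders, and as far as I understand the assume_role call itself still needs some base credentials (e.g. an instance profile), which is part of what I'm unsure about:

import boto3

# Hypothetical role ARN and session name; substitute real values
role_arn = 'arn:aws:iam::<aws-account-id>:role/<role_name>'
session_name = 's3-upload-session'

# Ask STS for temporary credentials for the role
sts = boto3.client('sts')
assumed = sts.assume_role(RoleArn=role_arn, RoleSessionName=session_name)
creds = assumed['Credentials']

# Build an S3 resource from the temporary credentials and upload the file
s3 = boto3.resource(
    's3',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'],
)
with open('output.csv', 'rb') as data:
    s3.Bucket('<bucket-name>').put_object(Key='<bucket-key>', Body=data,
                                          ServerSideEncryption='AES256')

Is something like this the intended pattern, or is there a way to have boto3 pick up the IAM role directly, the way the Redshift UNLOAD/COPY statements do?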