I am deploying my first batch job on AWS. When I run my Docker image on an EC2 instance, the script called by the job runs fine; I have assigned an IAM role to that instance to allow S3 access.
But when I run the same script as a job on AWS Batch, it fails with Access Denied errors on S3 access. This happens even though, in the Job Definition, I assign an IAM role (created for an Elastic Container Service Task) that has full S3 access.
If I launch my batch job with a command that does not access S3, it runs fine.
Since using an IAM role in the Job Definition does not seem to be sufficient, how do I grant S3 permissions within a Batch job on AWS?
EDIT
So if I just run aws s3 ls interlinked as my job, that also runs properly. What does not work is running the R script:
library(aws.s3)
# list the bucket contents and take the first object
get_bucket("mybucket")[[1]]
This fails with Access Denied. So it seems the issue is either with the aws.s3 package or, more likely, with how I am using it.
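If it helps to narrow things down, below is the kind of check I could run inside the job to confirm whether the task-role credentials are reachable from R at all, and to pass them to aws.s3 explicitly instead of relying on its automatic credential discovery. This is only a rough sketch: it assumes the standard ECS container-credentials endpoint (169.254.170.2 plus AWS_CONTAINER_CREDENTIALS_RELATIVE_URI), and the region is a placeholder.

library(httr)
library(aws.s3)

# AWS Batch runs on ECS, which publishes the task-role credentials at a
# link-local endpoint; the path is exposed in this environment variable.
rel_uri <- Sys.getenv("AWS_CONTAINER_CREDENTIALS_RELATIVE_URI")
if (rel_uri == "") stop("No container credentials URI found - is the job role attached?")

# Fetch the temporary credentials (AccessKeyId, SecretAccessKey, Token).
creds <- content(GET(paste0("http://169.254.170.2", rel_uri)), as = "parsed")

# Hand the credentials to aws.s3 explicitly rather than relying on its
# automatic credential lookup.
get_bucket(
  "mybucket",
  key           = creds$AccessKeyId,
  secret        = creds$SecretAccessKey,
  session_token = creds$Token,
  region        = "us-east-1"  # placeholder region
)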