I'm trying to set up an AWS Batch job that is triggered by a CloudWatch Event when an S3 PutObject occurs in a bucket. The job runs when a new file is added, but I'm not sure how to pass the filename to the job. In my CloudWatch Event rule, I set the Configure input
to Matched event
for the Batch job queue target, but I don't know how to access the event inside the Docker container the job runs in.
3 votes
I've tried printing the environment variables in the container, but am not seeing anything that looks like the event.
- Travis Nelson
As a workaround, I am using the same rule both to trigger the job and to add the event to an SQS queue, which the job reads from and then deletes the message.
- Travis Nelson
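The SQS workaround described above boils down to parsing the bucket name and object key out of the event body that the rule delivers to the queue. A minimal sketch of that parsing step, assuming a CloudTrail-style S3 PutObject event (the same field paths the answer below uses); the sample bucket and key are hypothetical:

```python
import json

def extract_s3_object(event_body: str):
    """Pull the bucket name and object key out of a CloudTrail-style
    S3 PutObject event, as delivered by the CloudWatch Events rule.
    The field paths mirror $.detail.requestParameters.* in the event."""
    event = json.loads(event_body)
    params = event["detail"]["requestParameters"]
    return params["bucketName"], params["key"]

# In the actual job, the message body would come from an SQS receive
# (e.g. boto3 sqs.receive_message), followed by delete_message.
sample = json.dumps({
    "detail": {
        "requestParameters": {
            "bucketName": "my-bucket",       # hypothetical bucket
            "key": "incoming/data.csv"       # hypothetical key
        }
    }
})
print(extract_s3_object(sample))  # -> ('my-bucket', 'incoming/data.csv')
```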
1 Answer
0 votes
As described in the AWS Batch User Guide at https://docs.aws.amazon.com/batch/latest/userguide/batch-cwe-target.html, in the section titled "Passing Event Information to an AWS Batch Target using the CloudWatch Events Input Transformer":
Set your job definition up like so:
{
    "jobDefinitionName": "echo-parameters",
    "containerProperties": {
        "image": "busybox",
        "vcpus": 2,
        "memory": 2000,
        "command": [
            "echo",
            "Ref::S3bucket",
            "Ref::S3key"
        ]
    }
}
This assumes your Docker image contains an echo command that takes two parameters: an S3 bucket and an S3 key. The Ref:: prefix tells AWS Batch to substitute the job parameters named S3bucket and S3key into the command at runtime.
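To make the Ref:: mechanism concrete, here is a local simulation of the substitution Batch performs when it launches the container (this is an illustration, not the Batch service itself; the parameter values are hypothetical):

```python
# Sketch of AWS Batch parameter substitution: each command element of
# the form "Ref::name" is replaced with the job parameter of that name.
def resolve_command(command, parameters):
    resolved = []
    for token in command:
        if token.startswith("Ref::"):
            resolved.append(parameters[token[len("Ref::"):]])
        else:
            resolved.append(token)
    return resolved

cmd = ["echo", "Ref::S3bucket", "Ref::S3key"]
params = {"S3bucket": "my-bucket", "S3key": "uploads/file.txt"}
print(resolve_command(cmd, params))
# -> ['echo', 'my-bucket', 'uploads/file.txt']
```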
In your CloudWatch Event rule, use an InputTransformer to transform the event details into what you want:
"InputTransformer": {
    "InputPathsMap": {
        "S3BucketValue": "$.detail.requestParameters.bucketName",
        "S3KeyValue": "$.detail.requestParameters.key"
    },
    "InputTemplate": "{\"Parameters\": {\"S3bucket\": <S3BucketValue>, \"S3key\": <S3KeyValue>}}"
}
Note the "Parameters" wrapper in the template: for an AWS Batch target, the transformed input populates the submitted job's Parameters map, which is what resolves the Ref:: placeholders in the job definition's command (this matches the example in the linked documentation).
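To see what the transformer actually hands to Batch, here is a local simulation of the two-step transform (the real substitution happens server-side in CloudWatch Events; this sketch supports only simple dotted paths, and assumes the Parameters-wrapped template form from the linked AWS doc):

```python
import json

# Simulate the input transformer: walk the dotted JSONPaths in
# InputPathsMap, then splice the extracted values into InputTemplate.
def transform(event, input_paths_map, input_template):
    def lookup(path):  # supports only "$.a.b.c"-style paths
        node = event
        for part in path.lstrip("$.").split("."):
            node = node[part]
        return node
    out = input_template
    for name, path in input_paths_map.items():
        out = out.replace(f"<{name}>", json.dumps(lookup(path)))
    return out

event = {"detail": {"requestParameters": {
    "bucketName": "my-bucket", "key": "uploads/file.txt"}}}
paths = {"S3BucketValue": "$.detail.requestParameters.bucketName",
         "S3KeyValue": "$.detail.requestParameters.key"}
template = '{"Parameters": {"S3bucket": <S3BucketValue>, "S3key": <S3KeyValue>}}'
print(transform(event, paths, template))
# -> {"Parameters": {"S3bucket": "my-bucket", "S3key": "uploads/file.txt"}}
```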