2 votes

I have a Terraform script that, after terraform apply, successfully launches an AWS spot instance and then runs a bash script. After the script finishes and the work is complete, I have been manually destroying the spot instance with terraform destroy. This is inconvenient, because I either have to watch my email for a CloudWatch alert or periodically check in on the script's progress. Ideally, the AWS resources I created would be destroyed automatically. Does anyone know how I should go about doing this? Am I using the wrong AWS resources, i.e. should I be using ECS?

I guess AWS ECS Run Task or AWS Batch would be suitable. – minamijoyo
How about AWS Lambda, if you can convert the bash script to Python or another supported language? – BMW
@minamijoyo I know AWS Batch is not yet implemented in Terraform (github.com/hashicorp/terraform/issues/12187), and I'm not sure about ECS Run Task. I would prefer to use Terraform for the infrastructure-as-code benefits. – wherestheforce
@BMW I think I need bash. – wherestheforce
The ECS Run Task API runs a one-shot task in a Docker container, but you still need to manage the ECS container instances. Terraform defines resources statically, so if you want to destroy resources, some job control is required anyway. – minamijoyo

2 Answers

2 votes

The solution I found is to create a null_resource and include the following provisioner, which shuts the instance down after my script finishes running:

  provisioner "remote-exec" {
     inline = [
        "sudo shutdown -h now",
     ]
  }
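
For context, here is a minimal sketch of how the whole thing can be wired up. The resource names, SSH user, key path, and script location are all illustrative, and it assumes a one-time spot request (for which an instance-initiated shutdown terminates the instance):

  # Names below are hypothetical; assumes a spot request "worker" and a
  # job script already present on the instance.
  resource "null_resource" "run_and_shutdown" {
    connection {
      type        = "ssh"
      host        = "${aws_spot_instance_request.worker.public_ip}"
      user        = "ubuntu"
      private_key = "${file("~/.ssh/id_rsa")}"
    }

    # Run the long job, then halt the instance. For a one-time spot
    # request, shutting down from inside the OS terminates it, which
    # stops the billing.
    provisioner "remote-exec" {
      inline = [
        "bash /home/ubuntu/job.sh",
        "sudo shutdown -h now",
      ]
    }
  }

One caveat: the shutdown can drop the SSH session before remote-exec returns cleanly, so Terraform may report the provisioner as failed even though the job succeeded; scheduling the halt a minute out (sudo shutdown -h +1) is a common workaround.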
0 votes

You can create a Lambda function and then call your shell script from within Lambda.

You can schedule it with the help of CloudWatch, terminate it on completion, and apply monitoring to it.
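
If you want to keep everything in Terraform, a minimal sketch of that setup might look like the following. The function name, runtime, schedule, and packaging are illustrative; it assumes an execution role and a job.zip (containing a handler that shells out to the script, per the link below) are defined elsewhere:

  # Hypothetical names; aws_iam_role.lambda_exec and job.zip are assumed
  # to be defined and built elsewhere.
  resource "aws_lambda_function" "job" {
    function_name = "run-job"
    filename      = "job.zip"
    handler       = "handler.main"
    runtime       = "python3.9"
    role          = "${aws_iam_role.lambda_exec.arn}"
    timeout       = 300
  }

  # CloudWatch Events rule that triggers the function on a schedule.
  resource "aws_cloudwatch_event_rule" "schedule" {
    name                = "run-job-schedule"
    schedule_expression = "rate(1 hour)"
  }

  resource "aws_cloudwatch_event_target" "job" {
    rule = "${aws_cloudwatch_event_rule.schedule.name}"
    arn  = "${aws_lambda_function.job.arn}"
  }

  # Allow CloudWatch Events to invoke the function.
  resource "aws_lambda_permission" "allow_events" {
    statement_id  = "AllowExecutionFromCloudWatch"
    action        = "lambda:InvokeFunction"
    function_name = "${aws_lambda_function.job.function_name}"
    principal     = "events.amazonaws.com"
    source_arn    = "${aws_cloudwatch_event_rule.schedule.arn}"
  }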

How to: Can a bash script be written inside an AWS Lambda function