0 votes

I have automation on an EC2 instance that loads different applications with credentials and keys. I do not want to hardcode the keys, so I need a way to run the automation with Terraform. What I have thought of doing so far:

1) Create a bash script locally and upload it with Terraform using the "file" provisioner:

provisioner "file" {
source      = "keys.sh"
destination = "keys.sh"

**The bash script contains export statements that set environment variables:**

#!/bin/bash
# PLEASE ADD TO .GITIGNORE


export REPO_SECRET=SoMeRepOSceRt

cat << EOF > /home/ubuntu/.ssh/privatekey
----KEY-RSA---STARS--HERE
AjsdhKKKSSL
--END--Key
EOF

The issue is that I can't source the file using remote-exec inline ("source ./file"); I get a "source: not found" error.

provisioner "remote-exec" {

    inline = [
    "chmod +x /home/ubuntu/keys.sh",
    "source /home/ubuntu/keys.sh",
    ]

Using a dot (". ./script") does not work either:

aws_instance.TestBashEnv (remote-exec): /tmp/terraform_1852118673.sh: 1: source: not found

So I need a way to source the file automatically when the EC2 instance is created, so that the environment variables from the bash script are applied.
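
From what I understand, the "source: not found" message means the inline commands run under /bin/sh (dash on Ubuntu), which has no source builtin, and even when the script is sourced with ".", the exported variables only exist in that one shell session. A minimal sketch of a workaround, assuming bash is available on the instance and that a hypothetical deploy.sh is the consumer of the variables:

provisioner "remote-exec" {
  inline = [
    "chmod +x /home/ubuntu/keys.sh",
    # deploy.sh is a placeholder for whatever needs the variables;
    # running both in one bash call keeps the exports in scope
    "bash -c 'source /home/ubuntu/keys.sh && /home/ubuntu/deploy.sh'",
  ]
}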

**P.S. It would be nice to have a check during terraform apply that the bash file keys.sh actually exists and is uploaded (fileexists?).**
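
A sketch of how that check could look, assuming Terraform v1.2 or later for custom conditions; fileexists() is a built-in function evaluated locally before the instance is created, and the resource name is the one from my config:

resource "aws_instance" "TestBashEnv" {
  # ...existing instance arguments...

  lifecycle {
    precondition {
      condition     = fileexists("${path.module}/keys.sh")
      error_message = "keys.sh was not found next to the Terraform configuration."
    }
  }
}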

2
Shouldn't running the script with remote-exec suffice for your need? It should run the commands from the script and make the env var and private key available, so you don't need to use source in this case. - Marko E

2 Answers

0 votes

Here's how I'd go about this one.

provisioner "file" {
  source      = "keys.sh"
  destination = "/home/ubuntu/keys.sh"
}

followed by:

provisioner "remote-exec" {
  inline = [
    "chmod +x /home/ubuntu/keys.sh",
    "/home/ubuntu/keys.sh",
  ]
}

This is explained in more detail in the Terraform documentation for remote-exec.

0 votes

The method I tried won't work. Each bash script that wants to use the keys needs to source keys.sh at the start of the file.
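
For illustration, a consumer script could start like this (the git clone line is hypothetical; only REPO_SECRET comes from keys.sh):

#!/bin/bash
# Load the exported variables before anything else.
. /home/ubuntu/keys.sh

# Hypothetical use of the secret - replace with the real automation step.
git clone "https://oauth2:${REPO_SECRET}@example.com/org/repo.git"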