I have automation on an EC2 instance that loads different applications with credentials and keys. I do not want to hardcode the keys, so I need a way to run the automation with Terraform. What I have tried so far:
1) Create a bash script locally and upload it with Terraform using the `file` provisioner:
provisioner "file" {
source = "keys.sh"
destination = "keys.sh"
**The bash script contains `export` statements for environment variables and writes the private key:**
```bash
#!/bin/bash
# PLEASE ADD TO .GITIGNORE
export REPO_SECRET=SoMeRepOSceRt
cat << EOF > /home/ubuntu/.ssh/privatekey
----KEY-RSA---STARS--HERE
AjsdhKKKSSL
--END--Key
EOF
```
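For completeness, both provisioners run over SSH from inside the `aws_instance` resource; a minimal sketch of the connection block (the user name and local key path here are assumptions):

```hcl
connection {
  type        = "ssh"
  host        = self.public_ip
  user        = "ubuntu"                             # assumption: default user of an Ubuntu AMI
  private_key = file(pathexpand("~/.ssh/id_rsa"))    # assumption: local key used to reach the instance
}
```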
The issue is that I can't source the file with `remote-exec` (inline `source ./file`); I get a "source: not found" error:
provisioner "remote-exec" {
inline = [
"chmod +x /home/ubuntu/keys.sh",
"source /home/ubuntu/keys.sh",
]
Using the dot form (`. ./keys.sh`) does not work either:
```
aws_instance.TestBashEnv (remote-exec): /tmp/terraform_1852118673.sh: 1: source: not found
```
So I need a way to source the file automatically when the EC2 instance is created, so that the environment variables from the bash script are applied.
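One workaround sketch, assuming the error comes from `remote-exec` running the inline commands under `/bin/sh` (dash on Ubuntu, which has no `source` builtin), is to invoke bash explicitly; `./run-automation.sh` below is a hypothetical stand-in for whatever needs the variables:

```hcl
provisioner "remote-exec" {
  inline = [
    "chmod +x /home/ubuntu/keys.sh",
    # run under bash; the exported variables only exist for this one bash
    # session, so anything that needs them must run on the same line
    "bash -c 'source /home/ubuntu/keys.sh && ./run-automation.sh'",
  ]
}
```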
**P.S. It would be nice to have a check that the bash file `keys.sh` is actually uploaded during `terraform apply` (`fileexists`?).**
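A minimal sketch of such a check, assuming `keys.sh` sits next to the Terraform configuration: `fileexists()` can fail the plan if the script is missing locally, and a `test -f` in `remote-exec` can confirm the upload on the instance.

```hcl
resource "aws_instance" "TestBashEnv" {
  # ... existing arguments ...

  lifecycle {
    precondition {
      # fail the plan early if the script is missing locally
      condition     = fileexists("${path.module}/keys.sh")
      error_message = "keys.sh was not found next to the Terraform configuration."
    }
  }

  provisioner "remote-exec" {
    inline = [
      # fail the apply if the file provisioner did not place the script on the instance
      "test -f /home/ubuntu/keys.sh",
    ]
  }
}
```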
Would using `remote-exec` suffice your need? It should run the commands from the script and make the env var and private key available, hence you don't need to use `source` in this case. – Marko E
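A sketch of what that comment suggests, i.e. executing the uploaded script rather than sourcing it (the exported variable only lives for that run, but the private key file persists on disk):

```hcl
provisioner "remote-exec" {
  inline = [
    "chmod +x /home/ubuntu/keys.sh",
    # executing the script writes /home/ubuntu/.ssh/privatekey;
    # REPO_SECRET is exported only inside this run of the script
    "bash /home/ubuntu/keys.sh",
  ]
}
```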