
So I started using Terraform and AWS recently, and I can't find a solution to my issue. I want to upload a Python script file to a bucket in AWS, and that part works fine. I am using this code provided by the Terraform documentation:

resource "aws_s3_bucket_object" "object" {
  bucket = "your_bucket_name"
  key    = "new_object_key"
  source = "path/to/file"

  # The filemd5() function is available in Terraform 0.11.12 and later
  # For Terraform 0.11.11 and earlier, use the md5() function and the file() function:
  # etag = "${md5(file("path/to/file"))}"
  etag = filemd5("path/to/file")
}

But my issue is that there are some values in my script that I want to come from environment variables instead of being hard-coded. I know this can be done for a Lambda function resource, but what about a file uploaded to a bucket? How can I do that?

1 Answer


A Lambda function actually executes your code, so it can read environment variables at runtime. Uploading a file to an S3 bucket does not run your script, so the script cannot pick up environment variables that way.

You have to create your own custom solution for that, which means preprocessing the file on your local workstation before uploading it to S3. The preprocessing can be a simple find-and-replace that substitutes values from your environment variables.
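One way to do that substitution within Terraform itself (a sketch, not part of the original answer, assuming Terraform 0.12 or later) is to keep a template copy of your script with `${...}` placeholders, render it with the built-in `templatefile()` function, and upload the rendered string through the `content` argument instead of `source`. The template filename `script.py.tftpl` and the variable name `api_url` below are placeholders for illustration:

```hcl
variable "api_url" {
  type = string
}

locals {
  # Render the template once so the same string can be used
  # for both the object content and its etag.
  rendered_script = templatefile("${path.module}/script.py.tftpl", {
    api_url = var.api_url
  })
}

resource "aws_s3_bucket_object" "object" {
  bucket  = "your_bucket_name"
  key     = "new_object_key"
  content = local.rendered_script
  etag    = md5(local.rendered_script)
}
```

You can then feed an environment variable into `var.api_url` by exporting `TF_VAR_api_url` before running `terraform apply`. One caveat: `templatefile()` interprets every `${...}` sequence in the template, so any literal `${` in your Python script must be escaped as `$${`.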