3 votes

I've been working with Terraform for the last couple of months. Reading up on the subject, you quickly run into topics like configuring an S3 backend for your state file instead of keeping the file locally.

But I can't find a good approach for starting a new Terraform project when you want the terraform.tfstate file to be stored in the S3 bucket from the start.

All the documentation I come across on this subject talks about creating an S3 bucket, creating a DynamoDB table, and doing a new init with the S3 backend settings configured. But all of these steps rely on the fact that there is already a local terraform.tfstate file available.
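(For completeness: the bucket and lock table themselves can be created outside Terraform entirely, e.g. with the AWS CLI. A rough sketch using the env-t names from my setup below; the region here is an assumption, substitute your own. The LockID hash key is what the S3 backend's DynamoDB locking expects.)

# One-time bootstrap, outside Terraform (sketch; names are the env-t
# placeholders from this question, region eu-west-1 is an assumption).
aws s3api create-bucket --bucket tfstate-files-env-t --region eu-west-1 \
  --create-bucket-configuration LocationConstraint=eu-west-1
aws dynamodb create-table --table-name tfstate-locks-env-t \
  --attribute-definitions AttributeName=LockID,AttributeType=S \
  --key-schema AttributeName=LockID,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST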

How does this work when you want to start a new project and there is no tfstate file available yet?

What I have tried so far:

To start working with my env-t project, I first source this shell script to set all the correct S3 backend settings. (I've adopted this construction so I can easily switch between different environments, which all need the same TF code applied; variation is done with a var file.)

File: set_env_env-t.sh

#!/bin/sh
# Environment-specific settings, exported as TF_VAR_* so they are also
# visible as Terraform variables.
export TF_VAR_CMDLINE_environment=env-t
export TF_VAR_CMDLINE_tf_state_bucket=tfstate-files-env-t
export TF_VAR_CMDLINE_tf_state_table=tfstate-locks-env-t
export TF_VAR_CMDLINE_region=eu-west10001

export AWS_PROFILE=$TF_VAR_CMDLINE_environment

# Initialize the (partial) S3 backend with the settings above.
/tmp/terraform init \
  -backend-config "bucket=$TF_VAR_CMDLINE_tf_state_bucket" \
  -backend-config "dynamodb_table=$TF_VAR_CMDLINE_tf_state_table" \
  -backend-config "region=$TF_VAR_CMDLINE_region" \
  -backend-config "key=$TF_VAR_CMDLINE_environment/terraform.tfstate"

Then in my main.tf I use this code, so the previously set backend config is also available inside the TF code.

variable "CMDLINE_environment"                   {}
variable "CMDLINE_tf_state_bucket"               {}
variable "CMDLINE_tf_state_table"                {}
variable "CMDLINE_region"                                {}

terraform {
  # Partial backend configuration; the actual settings are supplied by the
  # -backend-config flags at init time.
  backend "s3" {
  }
}

# Read this project's own state back as a data source.
data "terraform_remote_state" "state" {
  backend = "s3"
  config = {
    profile        = var.CMDLINE_environment
    bucket         = var.CMDLINE_tf_state_bucket
    dynamodb_table = var.CMDLINE_tf_state_table
    region         = var.CMDLINE_region
    key            = "${var.CMDLINE_environment}/terraform.tfstate"
  }
}

Setting up the S3 backend works fine:

. ./set_env_env-t.sh

But once I run the plan command, it bails out with the error that there is no tfstate available in the S3 backend. Yeah, that's correct, this is my first run...

terraform plan -var-file=env-t/vars.tfvars

Error: Unable to find remote state

  on main.tf line 53, in data "terraform_remote_state" "state":
  53: data "terraform_remote_state" "state" {

No stored state was found for the given workspace in the given backend.

The only way around this I have now is to first work with a local terraform.tfstate file and configure all the stuff required for the S3 backend. After a new init, Terraform detects the local state file and offers the option to move it to the S3 bucket.
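In commands, that workaround looks roughly like this (a sketch; it assumes the backend "s3" block starts out absent or commented out):

# First run: no backend block active yet, so state is written locally.
terraform init
terraform apply          # creates the S3 bucket and DynamoDB table

# Then enable the backend "s3" block in main.tf and re-init; Terraform
# detects the local terraform.tfstate and offers to copy it to S3.
. ./set_env_env-t.sh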

Is there a better/easier way around this?


2 Answers

1 vote

You can configure S3 as the backend by putting this code in your main.tf. You don't need data "terraform_remote_state" "state" {...}:

terraform {
  backend "s3" {
    bucket = "bucket-name"
    key    = "path/to/key/terraform.tfstate"
    region = "us-west-2"
  }
}
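With the backend fully specified like this, no extra init flags are needed. The state object itself does not have to exist yet; assuming the bucket itself already exists, Terraform creates the state file on the first apply (a sketch of the first run):

terraform init     # configures the S3 backend; no pre-existing state needed
terraform plan
terraform apply    # writes terraform.tfstate to the bucket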
0 votes

You just have to provide appropriate credentials to:

terraform {
  backend "s3" {
    bucket = "mybucket"
    key    = "path/to/my/key"
    region = "us-east-1"
  }
}
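For example, credentials can come from a named AWS profile, as in the question's own script (a sketch; the profile name env-t is an assumption):

export AWS_PROFILE=env-t   # profile defined in ~/.aws/credentials
terraform init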

To store your state file in S3 you don't need the data.terraform_remote_state data source.