Terraform v0.10.7
I am in the process of putting Terraform modules together for our DevOps team. There will be a separate module for each component we use, and the entire stack will then be composed from those modules for different environments and requirements.
The directory hierarchy is as follows:
Terraform
├── modules
│   ├── module-1 (main.tf, vars.tf, outputs.tf, backend.tf)
│   ├── module-2 (main.tf, vars.tf, outputs.tf, backend.tf)
│   ├── module-3 (main.tf, vars.tf, outputs.tf, backend.tf)
│   └── ...
└── environments
    ├── qa (main.tf, vars.tf, outputs.tf, backend.tf)
    ├── stage (main.tf, vars.tf, outputs.tf, backend.tf)
    └── prod (main.tf, vars.tf, outputs.tf, backend.tf)
In each module's backend.tf I have specified the backend as S3, with a full key hierarchy such as /resources/mod-1/terraform.tfstate. The same applies to the backend.tf in each environment.
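For reference, a module's backend.tf along these lines would look roughly like the following; the bucket name and region are placeholders, and the key matches the hierarchy described above:

```hcl
# Sketch of the S3 backend configuration described above.
# Bucket name and region are assumptions for illustration.
terraform {
  backend "s3" {
    bucket = "my-terraform-state"                 # hypothetical bucket
    key    = "resources/mod-1/terraform.tfstate"  # key from the hierarchy above
    region = "us-east-1"                          # assumed region
  }
}
```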
When I run terraform get
and terraform apply
for any environment, Terraform fetches all the referenced modules, applies the changes to the AWS infrastructure, and stores that environment's terraform.tfstate at the specified location in S3.
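For context, an environment's main.tf composes the stack from the modules, roughly like this (the relative source paths are assumptions based on the directory tree above):

```hcl
# Hypothetical environments/qa/main.tf: compose the stack from local modules.
module "module_1" {
  source = "../../modules/module-1"
}

module "module_2" {
  source = "../../modules/module-2"
}
```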
So the question is: will a terraform.tfstate for each of the modules used in the environment also be generated and pushed to S3 (with a single apply to the environment)?
I haven't run terraform apply
against any of the modules directly.
I plan to read some data from the terraform.tfstate of those modules in S3, while at the same time avoiding separate applies per module and running only a single terraform apply
for the environment. How can this be achieved?
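To illustrate what I mean by reading data from a module's state in S3, I have something like the terraform_remote_state data source in mind; the bucket, key, and region below are placeholders, and this assumes the module's state would actually exist at that key:

```hcl
# Sketch: read outputs from another configuration's state file in S3.
# Bucket, key, and region are hypothetical values for illustration.
data "terraform_remote_state" "mod_1" {
  backend = "s3"
  config {
    bucket = "my-terraform-state"
    key    = "resources/mod-1/terraform.tfstate"
    region = "us-east-1"
  }
}
```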