Get ansible_ resources from multiple remote states
Roxyrob opened this issue · 5 comments
When Terraform configurations live in different directories/accounts for security, separation of duty, or other reasons, it would be useful if terraform-inventory could read states from directories other than the main one (the current directory, or the one defined by ANSIBLE_TF_DIR).
E.g. gathering the Ansible configuration from multiple remote states (multiple directories in ANSIBLE_TF_DIR), working like multiple terraform_remote_state data sources do in Terraform:
data "terraform_remote_state" "dir1" {...}
data "terraform_remote_state" "dir2" {...}
...
data "terraform_remote_state" "dirn" {...}
Each directory would probably need its own backend with its AWS connection config to support this design.
I'll have to put some thought into this use case, but I don't think it's something that would be easy to support.
The way this script pulls the state now, it just takes a dump of whatever state Terraform is configured to use. But remote state data isn't part of that state - data providers are populated at runtime and not persisted to the state file.
You could still depend on values from remote state in the Ansible resources you configure within Terraform config, but all of the hosts for this plugin still need to be in the same project.
A project can be divided into multiple tfstates (different directories for different purposes) while still being a single project (as far as this Terraform setup is concerned).
I think the script reads backend.tf to get the tfstate with the ansible_ configuration, right? If so, couldn't you read ansible_ resources from all tfstates pointed to by backend.tf in each directory specified in ANSIBLE_TF_DIR (or an ANSIBLE_TF_DIRS)?
Yes, in theory we could read multiple Terraform states from different project directories. But this is something altogether different from reading remote state data providers. Amongst other things, using multiple project directories requires that you have the Terraform config files checked out locally. Since the remote state provider uses outputs as a common interface between modules, this wouldn't be necessary with a remote-state configuration, but you would also be violating the outputs-as-dependencies contract from the Terraform design.
I'm pretty sure you could already do multiple local projects with Ansible, by using its own support for multiple inventory sources. Maybe by passing it a wrapper script for each project that sets the environment variables and then calls out to the terraform-inventory script, something like this:
/etc/ansible/ProjectA
#!/usr/bin/env sh
export ANSIBLE_TF_DIR=/home/somebody/projects/A
exec /etc/ansible/terraform.py
/etc/ansible/ProjectB
#!/usr/bin/env sh
export ANSIBLE_TF_DIR=/home/somebody/projects/B
exec /etc/ansible/terraform.py
ansible-playbook play.yml -i /etc/ansible/ProjectA -i /etc/ansible/ProjectB
I think the script reads backend.tf to get the tfstate with the ansible_ configuration, right?
To clarify this point... no, the script does not try to parse any Terraform files. It calls terraform state pull
and reads STDOUT to get the state as JSON, letting Terraform figure out where to get the state data from. That way, we can support all remote state providers with zero work from the inventory script.
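To make that mechanism concrete, here is a minimal sketch (not the actual terraform.py code) of how a wrapper could pull each project directory's state and filter for ansible_ resources. It assumes the Terraform 0.12+ state JSON schema, where "resources" is a top-level list of objects with a "type" key; older state versions nest resources under "modules" and would need different handling.

```python
#!/usr/bin/env python3
"""Sketch: collect ansible_ resources from one or more project states."""
import json
import subprocess


def pull_state(tf_dir):
    """Run `terraform state pull` in tf_dir; Terraform resolves the backend
    itself, so this works for any remote state configuration."""
    out = subprocess.check_output(["terraform", "state", "pull"], cwd=tf_dir)
    return json.loads(out)


def ansible_resources(state):
    """Return resources whose type starts with "ansible_" (e.g. ansible_host,
    ansible_group) from a parsed state dict (0.12+ schema assumed)."""
    return [r for r in state.get("resources", [])
            if r.get("type", "").startswith("ansible_")]


def merged_resources(dirs):
    """Pull each directory's state and concatenate its ansible_ resources."""
    merged = []
    for d in dirs:
        merged.extend(ansible_resources(pull_state(d)))
    return merged
```

A wrapper could then pass a list of project paths (for example, split from a hypothetical ANSIBLE_TF_DIRS variable) to merged_resources() and render the inventory JSON from the result. Note this still requires each project to be checked out locally, as discussed above.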
I think it amounts to the same thing. Before you can pull the Terraform state, you need to know where the state is and how to access it. If it is not on local disk, backend.tf (for any remote state configuration) is what allows your script to pull the state.
I'll give multiple inventory sources "as scripts" a try, using different env vars.
I want to thank you for this dynamic inventory script and the Terraform Ansible provider (a great solution).