Only a subset of datadog_logs_custom_pipeline imported
darkn3rd opened this issue · 3 comments
darkn3rd commented
When I use terraformer for the datadog provider, only 2 out of 24 logs_custom_pipeline resources are imported.
STEPS
export DATADOG_HOST="https://app.datadoghq.com/apm/home"
export DATADOG_API_KEY="$(awk -F'"' '/datadog_api_key/{ print $2 }' terraform.tfvars)"
export DATADOG_APP_KEY="$(awk -F'"' '/datadog_app_key/{ print $2 }' terraform.tfvars)"
terraformer import datadog --resources='*'
ACTUAL RESULTS
Only 2 resources imported.
grep ^resource ./generated/datadog/logs_custom_pipeline/logs_custom_pipeline.tf | wc -l
# 2
EXPECTED RESULTS
I expected 24 resources to be imported.
Using the pipelines API, I can retrieve 24 pipelines with the same keys.
curl --silent \
--request GET "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
--header "Accept: application/json" \
--header "DD-API-KEY: ${DATADOG_API_KEY}" \
--header "DD-APPLICATION-KEY: ${DATADOG_APP_KEY}" \
> output.json
jq -r '.[].name' output.json | wc -l
# 24
darkn3rd commented
Upon further research, I found that pipelines with is_read_only set to true are not imported. It would be nice to have this behavior documented.
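One way to confirm this is to split the pipeline count by the is_read_only flag with jq. This is a sketch against a hypothetical sample payload (names and counts are made up; the real output.json from the curl command above can be substituted):

```shell
# Hypothetical sample mimicking the shape of the pipelines API response.
cat > /tmp/pipelines_sample.json <<'EOF'
[
  {"name": "integration-nginx", "is_read_only": true},
  {"name": "custom-app",        "is_read_only": false},
  {"name": "custom-web",        "is_read_only": false}
]
EOF

# Count read-only (integration-managed) pipelines, which terraformer skips.
jq '[.[] | select(.is_read_only)] | length' /tmp/pipelines_sample.json

# Count editable pipelines, which should match the number of imported resources.
jq '[.[] | select(.is_read_only | not)] | length' /tmp/pipelines_sample.json
```

If the editable count equals the number of generated resources, the gap is fully explained by the read-only pipelines.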