v8.2.0 seeing failed assertions leading to panics in terraform bridge
scttl opened this issue · 5 comments
Describe what happened
After upgrading from v8.1.0 to v8.2.0 we started encountering errors running `pulumi preview` or `pulumi up` for any gcp-related resources:
```
Diagnostics:
  gcp:serviceaccount:Account (python-service-account):
    error: error reading from server: EOF

  pulumi:pulumi:Stack (my_cool_staging_stack):
    panic: fatal: An assertion has failed: The bridge does not accept secrets, so we should not encounter them here

goroutine 123 [running]:
github.com/pulumi/pulumi/sdk/v3/go/common/util/contract.failfast(...)
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi/sdk/v3@v3.130.0/go/common/util/contract/failfast.go:23
github.com/pulumi/pulumi/sdk/v3/go/common/util/contract.Assertf(0x60?, {0x57dac69?, 0xc005408140?}, {0x0?, 0x410a05?, 0x8?})
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi/sdk/v3@v3.130.0/go/common/util/contract/assert.go:35 +0xe8
github.com/pulumi/pulumi-terraform-bridge/v3/pkg/tfbridge.(*ConfigEncoding).UnfoldProperties(0xc00449c040, 0xc002006b70)
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi-terraform-bridge/v3@v3.90.1-0.20240918114950-4954882afe70/pkg/tfbridge/config_encoding.go:143 +0x29a
github.com/pulumi/pulumi-terraform-bridge/pf/internal/configencoding.(*provider[...]).ConfigureWithContext(0xc00526bac0, {0x60e6d38?, 0xc002006990}, 0xc002006b70)
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi-terraform-bridge/pf@v0.43.1-0.20240918114950-4954882afe70/internal/configencoding/provider.go:110 +0x45
github.com/pulumi/pulumi-terraform-bridge/pf/internal/plugin.(*provider).Configure(0x55c34e0?, {0x60e6d38?, 0xc002006990?}, {0x0?})
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi-terraform-bridge/pf@v0.43.1-0.20240918114950-4954882afe70/internal/plugin/provider_context.go:143 +0x25
github.com/pulumi/pulumi-terraform-bridge/pf/internal/plugin.providerThunk.Configure({{0x610dcf0?, 0xc0023365c0?}}, {0x60e6d38, 0xc002006990}, {0xffffffffffffffff?})
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi-terraform-bridge/pf@v0.43.1-0.20240918114950-4954882afe70/internal/plugin/provider_server.go:83 +0x76
github.com/pulumi/pulumi/sdk/v3/go/common/resource/plugin.(*providerServer).Configure(0xc0018a6510, {0x60e6d38, 0xc002006990}, 0xc00526b900)
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi/sdk/v3@v3.130.0/go/common/resource/plugin/provider_server.go:314 +0x331
github.com/pulumi/pulumi-terraform-bridge/pf/internal/plugin.(*providerServer).Configure(0x60ace60?, {0x60e6d38?, 0xc002006990?}, 0x1?)
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi-terraform-bridge/pf@v0.43.1-0.20240918114950-4954882afe70/internal/plugin/provider_server.go:57 +0x25
github.com/pulumi/pulumi-terraform-bridge/x/muxer.(*muxer).Configure.func1()
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi-terraform-bridge/x/muxer@v0.0.9-0.20240227144008-2da15b3d6f6e/muxer.go:298 +0x6f
github.com/pulumi/pulumi-terraform-bridge/x/muxer.asyncJoin[...].func1()
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi-terraform-bridge/x/muxer@v0.0.9-0.20240227144008-2da15b3d6f6e/muxer.go:595 +0x42
created by github.com/pulumi/pulumi-terraform-bridge/x/muxer.asyncJoin[...] in goroutine 121
	/home/runner/go/pkg/mod/github.com/pulumi/pulumi-terraform-bridge/x/muxer@v0.0.9-0.20240227144008-2da15b3d6f6e/muxer.go:594 +0x79
```
The identical code snippet (below) previews and deploys fine on v8.1.0.
Sample program

`index.ts`:

```typescript
import * as gcp from '@pulumi/gcp';

export = async () => {
    const pythonServiceAccount = new gcp.serviceaccount.Account(
        'python-service-account',
        {
            accountId: `sa-python`,
        }
    );
};
```
gcp-related config values such as `project` and `credentials` are defined in the stack config file `Pulumi.stack.yaml`.
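For illustration only, here is a minimal sketch of how those same stack config values can be read explicitly in the program; this is not from the original report (which uses the default provider), and the provider name `explicit-gcp` is an assumption for the example:

```typescript
import * as pulumi from '@pulumi/pulumi';
import * as gcp from '@pulumi/gcp';

// Hypothetical sketch: read the stack config values explicitly.
// `requireSecret` corresponds to a value stored with
// `pulumi config set gcp:credentials <path-to-file> --secret`.
const gcpConfig = new pulumi.Config('gcp');

const provider = new gcp.Provider('explicit-gcp', {
    project: gcpConfig.require('project'),
    credentials: gcpConfig.requireSecret('credentials'),
});

// The sample resource from above, pinned to the explicit provider.
const serviceAccount = new gcp.serviceaccount.Account(
    'python-service-account',
    { accountId: 'sa-python' },
    { provider }
);
```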
Log output
No response
Affected Resource(s)
No response
Output of pulumi about
❯ `pulumi about` (irrelevant values redacted via `###`)
```
CLI
Version      3.122.0
Go Version   go1.22.7
Go Compiler  gc

Plugins
NAME        VERSION
cloudflare  5.39.0
docker      4.5.5
gcp         8.2.0
nodejs      unknown
postgresql  3.12.0
random      4.16.5

Host
OS       nixos
Version  24.05 (Uakari)
Arch     x86_64

This project is written in nodejs: executable='/etc/profiles/per-user/scott/bin/node' version='v20.15.1'

Current Stack: ###

TYPE                                 URN
pulumi:pulumi:Stack                  urn:pulumi:###
pulumi:providers:gcp                 urn:pulumi:###::pulumi:providers:gcp::default_8_2_0
gcp:serviceaccount/account:Account   urn:pulumi:###::gcp:serviceaccount/account:Account::python-service-account

Found no pending operations associated with ###

Backend
Name           pulumi.com
URL            https://app.pulumi.com/###
User           ###
Organizations  ###
Token type     personal

Dependencies:
NAME                VERSION
@pulumi/cloudflare  5.39.0
@pulumi/random      4.16.5
@types/node-forge   1.3.11
@pulumi/docker      4.5.5
@pulumi/gcp         8.2.0
@pulumi/postgresql  3.12.0
@pulumi/pulumi      3.133.0
@types/node         22.5.5
node-forge          1.3.1

Pulumi locates its logs in /tmp by default
```
Additional context
No response
Contributing
Vote on this issue by adding a 👍 reaction.
To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).
Hi @scttl, thank you for filing this issue.
I could reproduce this issue with the following steps:

- Set `project`: `pulumi config set gcp:project <our-project>`
- Set `credentials`: `pulumi config set gcp:credentials <path-to-file> --secret` <-- this is the issue
- Run `pulumi up` with v8.2.0.
Additionally, I've verified this does not panic on v8.1.0. Furthermore, it does not panic on v8.2.0 unless the config value is set with `--secret`. But of course you need to be able to make your credentials value a secret.
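For context, a minimal sketch of the difference on the program side (assuming `gcp:project` was set plainly and `gcp:credentials` was set with `--secret`; none of this code is from the original report):

```typescript
import * as pulumi from '@pulumi/pulumi';

const cfg = new pulumi.Config('gcp');

// A value set with `pulumi config set gcp:project <our-project>`
// comes back as a plain string:
const project: string = cfg.require('project');

// A value set with `--secret` is only exposed wrapped as a secret Output;
// per the stack trace above, a secret-marked config value is what trips
// the bridge's assertion in v8.2.0:
const credentials: pulumi.Output<string> = cfg.requireSecret('credentials');

// pulumi.isSecret reports whether an Output carries the secret flag.
pulumi.isSecret(credentials).then((secret) =>
    console.log(`gcp:credentials is secret: ${secret}`)
);
```

Removing `--secret` would avoid the panic, but as noted above that is not an acceptable trade-off for credentials.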
We're hoping to have a fix for this soon. Thank you again for reporting.
Seeing the same error elsewhere: pulumi/pulumi-hcloud#603
Hi @scttl - we have reverted the bridge change that caused this behavior. This week's new GCP release will incorporate the reverted bridge version. I've verified with a local build that the changes address your issue.
Once pulumi-gcp v8.3.0 is merged and released you should be able to upgrade to that version.
Please let us know if you continue to see trouble - thank you very much for your patience here.
8.3.0 is released.
Please upgrade to 8.3.0. We apologize for the trouble.
Thanks @guineveresaenger! v8.3.0 works well for me too.