sethvargo/vault-secrets-gen

Unable to register plugin in local setup (Minikube)

Poweranimal opened this issue · 8 comments

Hi there,

I'm not able to register the plugin in my local Minikube setup.

Setting up the plugin in my remote cluster (AWS EKS with S3 as Vault storage) works flawlessly.

However, the same setup fails locally in Minikube (the only difference being that a local Docker volume is used as storage instead of S3).

Running the cluster locally without any modifications causes the following error during plugin registration:

2021-12-25T14:15:38.218Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-25T14:15:38.340Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_d8b0cd67.vault-secrets-gen: starting plugin: metadata=true path=/vault/plugins/vault-secrets-gen_v0.1.4 args=["/vault/plugins/vault-secrets-gen_v0.1.4"]
2021-12-25T14:15:38.341Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_d8b0cd67.vault-secrets-gen: plugin started: metadata=true path=/vault/plugins/vault-secrets-gen_v0.1.4 pid=59
2021-12-25T14:15:38.341Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_d8b0cd67.vault-secrets-gen: waiting for RPC address: metadata=true path=/vault/plugins/vault-secrets-gen_v0.1.4
2021-12-25T14:15:38.352Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_d8b0cd67.vault-secrets-gen.vault-secrets-gen_v0.1.4: 2021/12/25 14:15:38 cannot allocate memory: metadata=true
2021-12-25T14:15:38.353Z [ERROR] secrets.system.system_91946a49: error occurred during enable mount: path=gen/
  error=
  | Unrecognized remote plugin message: 
  | 
  | This usually means that the plugin is either invalid or simply
  | needs to be recompiled to support the latest protocol.
  
2021-12-25T14:15:38.353Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_d8b0cd67.vault-secrets-gen: plugin process exited: metadata=true path=/vault/plugins/vault-secrets-gen_v0.1.4 pid=59 error="exit status 1"
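
For reference, since the plugin exits with "cannot allocate memory" right after starting (presumably while trying to lock memory), this is roughly how the locked-memory limit and the binary's capabilities can be checked from outside the pod (a sketch; getcap assumes libcap is available in the Vault container):

# Locked-memory limit (ulimit -l) as seen inside the Vault container
kubectl -n vault exec vault-0 -c vault -- sh -c 'ulimit -l'

# Capabilities on the plugin binary (requires libcap / getcap in the image)
kubectl -n vault exec vault-0 -c vault -- getcap /vault/plugins/vault-secrets-gen_v0.1.4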

This happens despite mlock being supported and enabled:

telemetry.disable_hostname has been set to false. Recommended setting is true for Prometheus to avoid poorly named metrics.
==> Vault server configuration:

             Api Address: https://vault.vault:8200
                     Cgo: disabled
         Cluster Address: https://vault.vault:8201
              Go Version: go1.17.5
              Listener 1: tcp (addr: "0.0.0.0:8200", cluster address: "0.0.0.0:8201", max_request_duration: "1m30s", max_request_size: "33554432", tls: "enabled")
               Log Level: trace
                   Mlock: supported: true, enabled: true
           Recovery Mode: false
                 Storage: file
                 Version: Vault v1.9.2
             Version Sha: f4c6d873e2767c0d6853b5d9ffc77b0d297bfbdf

==> Vault server started! Log data will stream in below:

If I disable mlock (disable_mlock = true), I'm able to register the plugin; however, any request to the plugin times out:

2021-12-25T14:33:52.942Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: starting plugin: metadata=true path=/vault/plugins/vault-secrets-gen_v0.1.4 args=["/vault/plugins/vault-secrets-gen_v0.1.4"]
2021-12-25T14:33:52.942Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: plugin started: metadata=true path=/vault/plugins/vault-secrets-gen_v0.1.4 pid=32
2021-12-25T14:33:52.942Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: waiting for RPC address: metadata=true path=/vault/plugins/vault-secrets-gen_v0.1.4
2021-12-25T14:33:52.952Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen.vault-secrets-gen_v0.1.4: plugin address: metadata=true address=/tmp/plugin820556143 network=unix timestamp=2021-12-25T14:33:52.952Z
2021-12-25T14:33:52.952Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: using plugin: metadata=true version=4
2021-12-25T14:33:52.953Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: setup: transport=gRPC status=started
2021-12-25T14:33:52.953Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen.stdio: waiting for stdio data: metadata=true
2021-12-25T14:33:52.955Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: setup: transport=gRPC status=finished err=<nil> took=2.359483ms
2021-12-25T14:33:52.955Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: special paths: transport=gRPC status=started
2021-12-25T14:33:52.958Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: special paths: transport=gRPC status=finished took=2.220813ms
2021-12-25T14:33:52.958Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: type: transport=gRPC status=started
2021-12-25T14:33:52.960Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: type: transport=gRPC status=finished took=1.92913ms
2021-12-25T14:33:52.960Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: cleanup: transport=gRPC status=started
2021-12-25T14:33:52.961Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: cleanup: transport=gRPC status=finished took="849.894µs"
2021-12-25T14:33:52.961Z [WARN]  secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: error closing client during Kill: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2021-12-25T14:33:52.961Z [WARN]  secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: plugin failed to exit gracefully: metadata=true
2021-12-25T14:33:52.961Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen.stdio: received EOF, stopping recv loop: metadata=true err="rpc error: code = Unavailable desc = error reading from server: read unix @->/tmp/plugin820556143: use of closed network connection"
2021-12-25T14:33:52.963Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_0c1d239c.vault-secrets-gen: plugin process exited: metadata=true path=/vault/plugins/vault-secrets-gen_v0.1.4 pid=32 error="signal: killed"

I highly appreciate your help.

I'm not sure why mlock would matter - the error message does not indicate anything to do with mlock issues.

From the logs, it looks like the connection to the plugin was prematurely closed. Without more logs, it's difficult to know why. It could be a lack of available memory, inability to talk to /dev/urandom (or the system equivalent), etc. It would be very helpful to have:

  1. Version of minikube
  2. Any configuration files or YAML
  3. Exact versions of Vault, vault-secrets-gen, and Vault's configuration file
  4. How vault-secrets-gen is being installed and configured
  5. Any other "weird" configuration (TLS, network setup, etc)

Thank you very much for your reply.

NOTE: Please disregard my previously shared logs; the logs in this message are the correct ones.

Here is the requested information:

Minikube Version

# minikube version
minikube version: v1.23.2
commit: 0a0ad764652082477c00d51d2475284b5d39ceed

Vault Version

# vault --version
Vault v1.9.2 (f4c6d873e2767c0d6853b5d9ffc77b0d297bfbdf)

vault-secrets-gen Version

v0.1.4

Configuration files

I'm using the bank-vaults operator to configure Vault.

Vault-CRD
apiVersion: vault.banzaicloud.com/v1alpha1
kind: Vault
metadata:
  name: vault
  namespace: vault
spec:
  caNamespaces:
    - "*"
  config:
    api_addr: https://vault.vault:8200
    listener:
      tcp:
        address: 0.0.0.0:8200
        tls_cert_file: /vault/tls/server.crt
        tls_key_file: /vault/tls/server.key
    log_level: Trace
    plugin_directory: /vault/plugins
    storage:
      file:
        path: /vault/file
    telemetry:
      statsd_address: localhost:9125
    ui: true
  externalConfig:
    auth:
      - roles:
          - bound_service_account_names:
              - consul-server
            bound_service_account_namespaces:
              - consul
            name: consul-server
            policies:
              - consul-server
              - consul-managed-pki
            ttl: 1h
          - bound_service_account_names:
              - consul-client
            bound_service_account_namespaces:
              - consul
            name: consul-client
            policies:
              - consul-server
              - consul-managed-pki
            ttl: 1h
          - bound_service_account_names: "*"
            bound_service_account_namespaces:
              - consul
              - prometheus
            name: consul-ca
            policies: consul-ca
            ttl: 1h
          - bound_service_account_names: "*"
            bound_service_account_namespaces:
              - vault
              - redis
              - postgres
              - prometheus
            name: vault-secrets
            policies: allow-secrets,gen-password
            ttl: 1h
        type: kubernetes
    plugins:
      - command: vault-secrets-gen
        plugin_name: vault-secrets-gen
        sha256: 99b766bfde9ac55f744b34913ac26d6ee2f2751bd00c1f35d72f1d976f98f6d4
        type: secret
    policies:
      - name: consul-managed-pki
        rules: |-
          path "/sys/mounts" {
            capabilities = [ "read" ]
          }

          path "/sys/mounts/connect_root" {
            capabilities = [ "create", "read", "update", "delete", "list" ]
          }

          path "/sys/mounts/connect_inter" {
            capabilities = [ "create", "read", "update", "delete", "list" ]
          }

          path "/connect_root/*" {
            capabilities = [ "create", "read", "update", "delete", "list" ]
          }

          path "/connect_inter/*" {
            capabilities = [ "create", "read", "update", "delete", "list" ]
          }
      - name: allow-secrets
        rules: |-
          path "secret/*" {
              capabilities = ["create", "read", "update", "delete", "list"]
          }
      - name: gen-password
        rules: |-
          path "gen/password" {
              capabilities = ["create", "update"]
          }
      - name: consul-server
        rules: |-
          path "pki/issue/consul-server" {
            capabilities = ["create", "update"]
          }
      - name: consul-ca
        rules: |-
          path "pki/cert/ca" {
              capabilities = ["read"]
          }
    secrets:
      - description: General secrets.
        options:
          version: "2"
        path: secret
        type: kv
      - description: A password generator.
        path: gen
        plugin_name: vault-secrets-gen
        type: plugin
      - configuration:
          config:
            max_lease_ttl: 87600h
          roles:
            - allow_bare_domains: true
              allow_localhost: true
              allow_subdomains: true
              allowed_domains: dc1.consul,consul-server,consul-server.consul,consul-server.consul.svc,consul.service.consul
              generate_lease: true
              max_ttl: 720h
              name: consul-server
          root/generate:
            - common_name: dc1.consul
              name: internal
        description: Consul PKI backend.
        type: pki
  image: docker.io/vault:1.9.2
  serviceAccount: vault
  size: 1
  unsealConfig:
    kubernetes:
      secretNamespace: vault
  vaultEnvsConfig:
    - name: VAULT_CACERT
      value: /vault/tls/ca.crt
  vaultInitContainers:
    - args:
        - apk add libcap && cd /tmp && wget -P /tmp https://github.com/sethvargo/vault-secrets-gen/releases/download/v0.1.4/vault-secrets-gen_0.1.4_linux_amd64.zip && unzip -d /tmp /tmp/vault-secrets-gen_0.1.4_linux_amd64.zip && install -o 100 -g 1000 /tmp/vault-secrets-gen_v0.1.4 /vault/plugins/vault-secrets-gen && setcap cap_ipc_lock=+ep /vault/plugins/vault-secrets-gen
      command:
        - ash
        - -ec
      image: alpine
      imagePullPolicy: IfNotPresent
      name: secrets-gen
      resources: {}
      volumeMounts:
        - mountPath: /vault/plugins
          name: vault-plugins
  volumeMounts:
    - mountPath: /vault/plugins
      name: vault-plugins
  volumes:
    - emptyDir: {}
      name: vault-plugins

vault-secrets-gen Installation

vault-secrets-gen is installed by the Vault CRD, as defined under spec/vaultInitContainers.
This is the executed command:

apk add libcap &&
cd /tmp &&
wget -P /tmp https://github.com/sethvargo/vault-secrets-gen/releases/download/v0.1.4/vault-secrets-gen_0.1.4_linux_amd64.zip &&
unzip -d /tmp /tmp/vault-secrets-gen_0.1.4_linux_amd64.zip &&
install -o 100 -g 1000 /tmp/vault-secrets-gen_v0.1.4 /vault/plugins/vault-secrets-gen &&
setcap cap_ipc_lock=+ep /vault/plugins/vault-secrets-gen

The Vault CRD registers and enables vault-secrets-gen, as can be seen under spec/externalConfig/plugins.
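
For reference, the manual equivalent of that registration and mount would be roughly the following (a sketch, run against the unsealed Vault with a sufficiently privileged token; the sha256 has to match the installed binary):

# Compute the checksum of the installed plugin binary
SHA256="$(sha256sum /vault/plugins/vault-secrets-gen | cut -d' ' -f1)"

# Register the plugin in the catalog and mount it at gen/
vault plugin register -sha256="${SHA256}" -command=vault-secrets-gen secret vault-secrets-gen
vault secrets enable -path=gen -plugin-name=vault-secrets-gen plugin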

Logs

Applying the Vault CRD produces the following logs on initialization:

kubectl -n vault logs vault-0 vault
telemetry.disable_hostname has been set to false. Recommended setting is true for Prometheus to avoid poorly named metrics.
==> Vault server configuration:

             Api Address: https://vault.vault:8200
                     Cgo: disabled
         Cluster Address: https://vault.vault:8201
              Go Version: go1.17.5
              Listener 1: tcp (addr: "0.0.0.0:8200", cluster address: "0.0.0.0:8201", max_request_duration: "1m30s", max_request_size: "33554432", tls: "enabled")
               Log Level: trace
                   Mlock: supported: true, enabled: true
           Recovery Mode: false
                 Storage: file
                 Version: Vault v1.9.2
             Version Sha: f4c6d873e2767c0d6853b5d9ffc77b0d297bfbdf

==> Vault server started! Log data will stream in below:

2021-12-28T15:53:25.896Z [INFO]  proxy environment: http_proxy="\"\"" https_proxy="\"\"" no_proxy="\"\""
2021-12-28T15:53:25.897Z [DEBUG] core: set config: sanitized config="{\"api_addr\":\"https://vault.vault:8200\",\"cache_size\":0,\"cluster_addr\":\"\",\"cluster_cipher_suites\":\"\",\"cluster_name\":\"\",\"default_lease_ttl\":0,\"default_max_request_duration\":0,\"disable_cache\":false,\"disable_clustering\":false,\"disable_indexing\":false,\"disable_mlock\":false,\"disable_performance_standby\":false,\"disable_printable_check\":false,\"disable_sealwrap\":false,\"disable_sentinel_trace\":false,\"enable_response_header_hostname\":false,\"enable_response_header_raft_node_id\":false,\"enable_ui\":true,\"listeners\":[{\"config\":{\"address\":\"0.0.0.0:8200\",\"tls_cert_file\":\"/vault/tls/server.crt\",\"tls_key_file\":\"/vault/tls/server.key\"},\"type\":\"tcp\"}],\"log_format\":\"unspecified\",\"log_level\":\"Trace\",\"max_lease_ttl\":0,\"pid_file\":\"\",\"plugin_directory\":\"/vault/plugins\",\"raw_storage_endpoint\":false,\"seals\":[{\"disabled\":false,\"type\":\"shamir\"}],\"storage\":{\"cluster_addr\":\"\",\"disable_clustering\":false,\"redirect_addr\":\"https://vault.vault:8200\",\"type\":\"file\"},\"telemetry\":{\"add_lease_metrics_namespace_labels\":false,\"circonus_api_app\":\"\",\"circonus_api_token\":\"\",\"circonus_api_url\":\"\",\"circonus_broker_id\":\"\",\"circonus_broker_select_tag\":\"\",\"circonus_check_display_name\":\"\",\"circonus_check_force_metric_activation\":\"\",\"circonus_check_id\":\"\",\"circonus_check_instance_id\":\"\",\"circonus_check_search_tag\":\"\",\"circonus_check_tags\":\"\",\"circonus_submission_interval\":\"\",\"circonus_submission_url\":\"\",\"disable_hostname\":false,\"dogstatsd_addr\":\"\",\"dogstatsd_tags\":null,\"lease_metrics_epsilon\":3600000000000,\"maximum_gauge_cardinality\":500,\"metrics_prefix\":\"\",\"num_lease_metrics_buckets\":168,\"prometheus_retention_time\":86400000000000,\"stackdriver_debug_logs\":false,\"stackdriver_location\":\"\",\"stackdriver_namespace\":\"\",\"stackdriver_project_id\":\"\",\"statsd_address\":\"localhost:9125\",\"statsite_address\":\"\",\"usage_gauge_period\":600000000000}}"
2021-12-28T15:53:25.897Z [DEBUG] storage.cache: creating LRU cache: size=0
2021-12-28T15:53:25.924Z [INFO]  core: Initializing VersionTimestamps for core
2021-12-28T15:53:25.925Z [DEBUG] cluster listener addresses synthesized: cluster_addresses=[0.0.0.0:8201]
2021-12-28T15:53:25.925Z [DEBUG] would have sent systemd notification (systemd not present): notification=READY=1
2021-12-28T15:53:25.974Z [INFO]  core: security barrier not initialized
2021-12-28T15:53:26.101Z [INFO]  core: security barrier not initialized
2021-12-28T15:53:26.133Z [INFO]  core: security barrier not initialized
2021-12-28T15:53:26.134Z [INFO]  core: security barrier initialized: stored=1 shares=5 threshold=3
2021-12-28T15:53:26.135Z [DEBUG] core: cluster name not found/set, generating new
2021-12-28T15:53:26.135Z [DEBUG] core: cluster name set: name=vault-cluster-28c863f4
2021-12-28T15:53:26.135Z [DEBUG] core: cluster ID not found, generating new
2021-12-28T15:53:26.135Z [DEBUG] core: cluster ID set: id=01f6e636-c19a-0626-5278-03ed3e333b80
2021-12-28T15:53:26.135Z [INFO]  core: post-unseal setup starting
2021-12-28T15:53:26.135Z [DEBUG] core: clearing forwarding clients
2021-12-28T15:53:26.135Z [DEBUG] core: done clearing forwarding clients
2021-12-28T15:53:26.135Z [DEBUG] core: persisting feature flags
2021-12-28T15:53:26.138Z [INFO]  core: loaded wrapping token key
2021-12-28T15:53:26.138Z [INFO]  core: Recorded vault version: vault version=1.9.2 upgrade time="2021-12-28 15:53:26.138480902 +0000 UTC m=+0.338244304"
2021-12-28T15:53:26.138Z [INFO]  core: upgrading plugin information: plugins=[]
2021-12-28T15:53:26.138Z [INFO]  core: successfully setup plugin catalog: plugin-directory=/vault/plugins
2021-12-28T15:53:26.138Z [INFO]  core: no mounts; adding default mount table
2021-12-28T15:53:26.139Z [INFO]  core: successfully mounted backend: type=cubbyhole path=cubbyhole/
2021/12/28 15:53:26 [ERR] Error writing to statsd! Err: write udp 127.0.0.1:45067->127.0.0.1:9125: write: connection refused
2021-12-28T15:53:26.140Z [INFO]  core: successfully mounted backend: type=system path=sys/
2021-12-28T15:53:26.140Z [INFO]  core: successfully mounted backend: type=identity path=identity/
2021-12-28T15:53:26.143Z [INFO]  core: successfully enabled credential backend: type=token path=token/
2021-12-28T15:53:26.143Z [TRACE] expiration.job-manager: created dispatcher: name=expire-dispatcher num_workers=200
2021-12-28T15:53:26.143Z [INFO]  rollback: starting rollback manager
2021-12-28T15:53:26.143Z [TRACE] expiration.job-manager: initialized dispatcher: num_workers=200
2021-12-28T15:53:26.143Z [TRACE] expiration.job-manager: created job manager: name=expire pool_size=200
2021-12-28T15:53:26.143Z [TRACE] expiration.job-manager: starting job manager: name=expire
2021-12-28T15:53:26.143Z [TRACE] expiration.job-manager: starting dispatcher
2021-12-28T15:53:26.143Z [INFO]  core: restoring leases
2021-12-28T15:53:26.144Z [DEBUG] expiration: collecting leases
2021-12-28T15:53:26.146Z [DEBUG] expiration: leases collected: num_existing=0
2021-12-28T15:53:26.146Z [INFO]  expiration: lease restore complete
2021-12-28T15:53:26.147Z [DEBUG] identity: loading entities
2021-12-28T15:53:26.147Z [DEBUG] identity: entities collected: num_existing=0
2021-12-28T15:53:26.147Z [INFO]  identity: entities restored
2021-12-28T15:53:26.147Z [DEBUG] identity: identity loading groups
2021-12-28T15:53:26.147Z [DEBUG] identity: groups collected: num_existing=0
2021-12-28T15:53:26.147Z [INFO]  identity: groups restored
2021-12-28T15:53:26.147Z [DEBUG] identity: identity loading OIDC clients
2021-12-28T15:53:26.147Z [TRACE] activity: scanned existing logs: out=[]
2021-12-28T15:53:26.148Z [INFO]  core: post-unseal setup complete
2021-12-28T15:53:26.148Z [TRACE] activity: no intent log found
2021-12-28T15:53:26.148Z [TRACE] activity: scanned existing logs: out=[]
2021-12-28T15:53:26.149Z [INFO]  core: root token generated
2021-12-28T15:53:26.149Z [INFO]  core: pre-seal teardown starting
2021-12-28T15:53:26.149Z [DEBUG] expiration: stop triggered
2021-12-28T15:53:26.149Z [TRACE] expiration.job-manager: terminating job manager...
2021-12-28T15:53:26.149Z [TRACE] expiration.job-manager: terminating dispatcher
2021-12-28T15:53:26.149Z [DEBUG] expiration: finished stopping
2021-12-28T15:53:26.149Z [INFO]  rollback: stopping rollback manager
2021-12-28T15:53:26.149Z [INFO]  core: pre-seal teardown complete
2021-12-28T15:53:29.905Z [DEBUG] core: unseal key supplied: migrate=false
2021-12-28T15:53:29.905Z [DEBUG] core: cannot unseal, not enough keys: keys=1 threshold=3 nonce=346125bd-5626-6717-e423-89ab9aa95302
2021-12-28T15:53:30.105Z [DEBUG] core: unseal key supplied: migrate=false
2021-12-28T15:53:30.105Z [DEBUG] core: cannot unseal, not enough keys: keys=2 threshold=3 nonce=346125bd-5626-6717-e423-89ab9aa95302
2021-12-28T15:53:30.305Z [DEBUG] core: unseal key supplied: migrate=false
2021-12-28T15:53:30.305Z [DEBUG] core: starting cluster listeners
2021-12-28T15:53:30.305Z [INFO]  core.cluster-listener.tcp: starting listener: listener_address=0.0.0.0:8201
2021-12-28T15:53:30.305Z [INFO]  core.cluster-listener: serving cluster requests: cluster_listen_address=[::]:8201
2021-12-28T15:53:30.306Z [INFO]  core: post-unseal setup starting
2021-12-28T15:53:30.306Z [DEBUG] core: clearing forwarding clients
2021-12-28T15:53:30.306Z [DEBUG] core: done clearing forwarding clients
2021-12-28T15:53:30.306Z [DEBUG] core: persisting feature flags
2021-12-28T15:53:30.306Z [INFO]  core: loaded wrapping token key
2021-12-28T15:53:30.306Z [INFO]  core: upgrading plugin information: plugins=[]
2021-12-28T15:53:30.306Z [INFO]  core: successfully setup plugin catalog: plugin-directory=/vault/plugins
2021-12-28T15:53:30.306Z [INFO]  core: successfully mounted backend: type=system path=sys/
2021-12-28T15:53:30.307Z [INFO]  core: successfully mounted backend: type=identity path=identity/
2021-12-28T15:53:30.307Z [INFO]  core: successfully mounted backend: type=cubbyhole path=cubbyhole/
2021-12-28T15:53:30.307Z [INFO]  core: successfully enabled credential backend: type=token path=token/
2021-12-28T15:53:30.308Z [TRACE] expiration.job-manager: created dispatcher: name=expire-dispatcher num_workers=200
2021-12-28T15:53:30.308Z [TRACE] expiration.job-manager: initialized dispatcher: num_workers=200
2021-12-28T15:53:30.308Z [INFO]  rollback: starting rollback manager
2021-12-28T15:53:30.308Z [TRACE] expiration.job-manager: created job manager: name=expire pool_size=200
2021-12-28T15:53:30.308Z [TRACE] expiration.job-manager: starting job manager: name=expire
2021-12-28T15:53:30.308Z [TRACE] expiration.job-manager: starting dispatcher
2021-12-28T15:53:30.308Z [INFO]  core: restoring leases
2021-12-28T15:53:30.308Z [DEBUG] identity: loading entities
2021-12-28T15:53:30.308Z [DEBUG] expiration: collecting leases
2021-12-28T15:53:30.308Z [DEBUG] identity: entities collected: num_existing=0
2021-12-28T15:53:30.308Z [DEBUG] expiration: leases collected: num_existing=0
2021-12-28T15:53:30.308Z [INFO]  expiration: lease restore complete
2021-12-28T15:53:30.308Z [INFO]  identity: entities restored
2021-12-28T15:53:30.308Z [DEBUG] identity: identity loading groups
2021-12-28T15:53:30.308Z [DEBUG] identity: groups collected: num_existing=0
2021-12-28T15:53:30.309Z [INFO]  identity: groups restored
2021-12-28T15:53:30.309Z [DEBUG] identity: identity loading OIDC clients
2021-12-28T15:53:30.309Z [TRACE] activity: scanned existing logs: out=[]
2021-12-28T15:53:30.309Z [DEBUG] core: request forwarding setup function
2021-12-28T15:53:30.309Z [DEBUG] core: clearing forwarding clients
2021-12-28T15:53:30.309Z [DEBUG] core: done clearing forwarding clients
2021-12-28T15:53:30.309Z [DEBUG] core: request forwarding not setup
2021-12-28T15:53:30.309Z [DEBUG] core: leaving request forwarding setup function
2021-12-28T15:53:30.309Z [INFO]  core: post-unseal setup complete
2021-12-28T15:53:30.309Z [TRACE] activity: no intent log found
2021-12-28T15:53:30.309Z [INFO]  core: vault is unsealed
2021-12-28T15:53:30.309Z [TRACE] activity: scanned existing logs: out=[]
2021-12-28T15:53:35.375Z [INFO]  core: enabled credential backend: path=kubernetes/ type=kubernetes
2021-12-28T15:53:35.508Z [INFO]  core: successful mount: namespace="\"\"" path=secret/ type=kv
2021-12-28T15:53:35.519Z [INFO]  secrets.kv.kv_b845d8de: collecting keys to upgrade
2021-12-28T15:53:35.519Z [INFO]  secrets.kv.kv_b845d8de: done collecting keys: num_keys=1
2021-12-28T15:53:35.519Z [DEBUG] secrets.kv.kv_b845d8de: upgrading keys: progress=0/1
2021-12-28T15:53:35.519Z [INFO]  secrets.kv.kv_b845d8de: upgrading keys finished
2021-12-28T15:53:35.655Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: starting plugin: metadata=true path=/vault/plugins/vault-secrets-gen args=["/vault/plugins/vault-secrets-gen"]
2021-12-28T15:53:35.655Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: plugin started: metadata=true path=/vault/plugins/vault-secrets-gen pid=68
2021-12-28T15:53:35.655Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: waiting for RPC address: metadata=true path=/vault/plugins/vault-secrets-gen
2021-12-28T15:53:35.674Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: using plugin: metadata=true version=4
2021-12-28T15:53:35.674Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen.vault-secrets-gen: plugin address: metadata=true address=/tmp/plugin3863194556 network=unix timestamp=2021-12-28T15:53:35.674Z
2021-12-28T15:53:35.676Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: setup: transport=gRPC status=started
2021-12-28T15:53:35.676Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen.stdio: waiting for stdio data: metadata=true
2021-12-28T15:53:35.679Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: setup: transport=gRPC status=finished err=<nil> took=2.725419ms
2021-12-28T15:53:35.679Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: special paths: transport=gRPC status=started
2021-12-28T15:53:35.679Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: special paths: transport=gRPC status=finished took="304.808µs"
2021-12-28T15:53:35.679Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: type: transport=gRPC status=started
2021-12-28T15:53:35.679Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: type: transport=gRPC status=finished took="198.136µs"
2021-12-28T15:53:35.679Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: cleanup: transport=gRPC status=started
2021-12-28T15:53:35.679Z [TRACE] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: cleanup: transport=gRPC status=finished took="240.116µs"
2021-12-28T15:53:35.679Z [WARN]  secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: error closing client during Kill: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2021-12-28T15:53:35.679Z [WARN]  secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: plugin failed to exit gracefully: metadata=true
2021-12-28T15:53:35.680Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen.stdio: received EOF, stopping recv loop: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2021-12-28T15:53:35.704Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: plugin process exited: metadata=true path=/vault/plugins/vault-secrets-gen pid=68 error="signal: killed"
2021-12-28T15:53:35.705Z [INFO]  core: successful mount: namespace="\"\"" path=gen/ type=vault-secrets-gen
2021-12-28T15:53:35.716Z [INFO]  core: successful mount: namespace="\"\"" path=pki/ type=pki
2021-12-28T15:53:36.421Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-28T15:53:37.909Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-28T15:53:40.395Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-28T15:53:44.700Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-28T15:53:53.100Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-28T15:54:09.426Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-28T15:54:30.371Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: starting plugin: path=/vault/plugins/vault-secrets-gen args=["/vault/plugins/vault-secrets-gen"]
2021-12-28T15:54:30.371Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: plugin started: path=/vault/plugins/vault-secrets-gen pid=77
2021-12-28T15:54:30.371Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: waiting for RPC address: path=/vault/plugins/vault-secrets-gen
2021-12-28T15:54:41.819Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]

Any request to the plugin causes the following logs:

vault write gen/password length=64
Error writing data to gen/password: context deadline exceeded
kubectl -n vault logs vault-0 vault
2021-12-28T15:59:43.329Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-28T16:00:06.941Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: starting plugin: path=/vault/plugins/vault-secrets-gen args=["/vault/plugins/vault-secrets-gen"]
2021-12-28T16:00:06.942Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: plugin started: path=/vault/plugins/vault-secrets-gen pid=119
2021-12-28T16:00:06.942Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: waiting for RPC address: path=/vault/plugins/vault-secrets-gen
2021-12-28T16:00:43.703Z [INFO]  core: mount tuning of options: path=secret/ options=map[version:2]
2021-12-28T16:01:06.898Z [INFO]  expiration: revoked lease: lease_id=sys/wrapping/wrap/h3d172706af4b8855e10908d005f3aaed29b11ae6d3397436d882629a2096fa14
2021-12-28T16:01:06.943Z [ERROR] core: failed to run existence check: error="timeout while waiting for plugin to start"
2021-12-28T16:01:06.955Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: plugin process exited: path=/vault/plugins/vault-secrets-gen pid=119 error="signal: killed"
2021-12-28T16:01:07.004Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: starting plugin: path=/vault/plugins/vault-secrets-gen args=["/vault/plugins/vault-secrets-gen"]
2021-12-28T16:01:07.004Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: plugin started: path=/vault/plugins/vault-secrets-gen pid=124
2021-12-28T16:01:07.004Z [DEBUG] secrets.vault-secrets-gen.vault-secrets-gen_8998d9fc.vault-secrets-gen: waiting for RPC address: path=/vault/plugins/vault-secrets-gen

Have you tried with 0.1.5? Any time there's a Vault release, I have to update dependencies and compile a new version. From the logs, it looks like Vault is failing to launch the binary. The other thing you could try is to make sure the binary is executable.
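
For example, something along these lines from outside the pod (a rough sketch using the paths from your config):

# Confirm the plugin exists and is executable inside the Vault container
kubectl -n vault exec vault-0 -c vault -- ls -l /vault/plugins/vault-secrets-gen

# go-plugin binaries refuse to run standalone, so executing it directly should
# print a "This binary is a plugin..." message; anything else (exec format
# error, missing libraries, etc.) would point at the binary itself
kubectl -n vault exec vault-0 -c vault -- /vault/plugins/vault-secrets-gen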

I upgraded to 0.1.5.
Unfortunately, the issue is still present.

As noted above, the weird thing is that it runs fine in my remote cluster...
I thought the file storage backend might have been the issue and changed it to inmem, but it had no effect.

Interesting. I'm really not sure of the best way to debug this. This plugin just uses Vault's SDK and isn't doing anything crazy. The only thing I can think of is something with /dev/urandom not being available in your local Docker setup for some reason. You've already enabled trace logging on the Vault server, and even that didn't reveal anything particularly meaningful.
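
If it helps, a quick way to rule that out could be something like this (sketch; busybox dd and base64 assumed to be present in the image):

# Verify the entropy device is present and readable inside the Vault container
kubectl -n vault exec vault-0 -c vault -- sh -c 'dd if=/dev/urandom bs=16 count=1 2>/dev/null | base64'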

Hi @Poweranimal - I just released v0.1.6 (available once this run finishes). I don't think it will fix the issue, but I added some top-level error handling in case the plugin panics. I think that's probably what is happening here, since Vault is re-spawning the plugin and we aren't seeing any output.

Can you give v0.1.6 a try and see if that generates any additional log output for you?

Hi @sethvargo ,
Thanks a lot for the new release.
I installed the latest release, v0.1.6, but I don't see any logs produced by catching a panic.

The more time passes, the more I think it must be caused by some tiny stupid detail on my side.
I'm going to investigate this further towards the end of this week or next week and I will let you know about my findings.

This issue is stale because it has been open for 14 days with no activity. It will automatically close after 7 more days of inactivity.