external-secrets/kubernetes-external-secrets

Lots of info messages in the log

jalateras opened this issue · 6 comments

Is it usual that, when I run KES with info-level logging, I get thousands of these messages? I am running 7.2.1.

{"level":30,"message_time":"2021-05-07T06:49:38.671Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.673Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.674Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.676Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.678Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.679Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.681Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.682Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.684Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.686Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.687Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.689Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.690Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.692Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.693Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.695Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.696Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.698Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.700Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.702Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.703Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.705Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.707Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.708Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.710Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.711Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.713Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.715Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.716Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.718Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.719Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.721Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.723Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.724Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}
{"level":30,"message_time":"2021-05-07T06:49:38.726Z","pid":18,"hostname":"external-secrets-f7944bcdb-m7kx8","msg":"Stopping watch stream for namespace * due to event: END"}

No, not all at the same second, no 😅
Could you provide some more details on your environment?

@Flydiverny Running on minikube, backed by AWS Secrets Manager. Here is the pod spec from the deployment:

Name:         external-secrets-5dd474df5b-xxt5k
Namespace:    external-secrets
Priority:     0
Node:         minikube/192.168.49.2
Start Time:   Fri, 07 May 2021 17:02:49 +1000
Labels:       name=external-secrets
              pod-template-hash=5dd474df5b
Annotations:  <none>
Status:       Running
IP:           172.17.0.6
IPs:
  IP:           172.17.0.6
Controlled By:  ReplicaSet/external-secrets-5dd474df5b
Containers:
  external-secrets:
    Container ID:   docker://5d08581b35b10933ecc2d106a49f304ecf3d6ea7683553e8e5f9fe0e52332e7a
    Image:          ghcr.io/external-secrets/kubernetes-external-secrets:7.2.1
    Image ID:       docker-pullable://ghcr.io/external-secrets/kubernetes-external-secrets@sha256:1115b0a00a45f7ea82f355fc50f974cca13954e6a641de71467fba61adc433b9
    Port:           3001/TCP
    Host Port:      0/TCP
    State:          Running
      Started:      Wed, 12 May 2021 07:56:16 +1000
    Last State:     Terminated
      Reason:       Error
      Exit Code:    255
      Started:      Fri, 07 May 2021 17:02:50 +1000
      Finished:     Wed, 12 May 2021 07:55:29 +1000
    Ready:          True
    Restart Count:  1
    Environment:
      AWS_DEFAULT_REGION:     ap-southeast-2
      AWS_REGION:             ap-southeast-2
      AWS_ACCESS_KEY_ID:      xxxxx
      AWS_SECRET_ACCESS_KEY:  xxxxx
      LOG_LEVEL:              warn
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from external-secrets-token-6xlnn (ro)
Conditions:
  Type              Status
  Initialized       True 
  Ready             True 
  ContainersReady   True 
  PodScheduled      True 
Volumes:
  external-secrets-token-6xlnn:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  external-secrets-token-6xlnn
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                 node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type    Reason          Age   From     Message
  ----    ------          ----  ----     -------
  Normal  SandboxChanged  14m   kubelet  Pod sandbox changed, it will be killed and re-created.
  Normal  Pulled          14m   kubelet  Container image "ghcr.io/external-secrets/kubernetes-external-secrets:7.2.1" already present on machine
  Normal  Created         14m   kubelet  Created container external-secrets
  Normal  Started         14m   kubelet  Started container external-secrets

I'm seeing the same messages, roughly every 300ms. Here is my Deployment manifest:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: kubernetes-external-secrets
  namespace: kube-system
  labels:
    app.kubernetes.io/name: kubernetes-external-secrets
    helm.sh/chart: kubernetes-external-secrets-8.0.1
spec:
  replicas: 1
  selector:
    matchLabels:
      app.kubernetes.io/name: kubernetes-external-secrets
  template:
    metadata:
      labels:
        app.kubernetes.io/name: kubernetes-external-secrets
      annotations:
        iam.amazonaws.com/role: arn:aws:iam::<redacted>:role/kube-external-secrets
    spec:
      serviceAccountName: kubernetes-external-secrets
      containers:
        - name: kubernetes-external-secrets
          image: "ghcr.io/external-secrets/kubernetes-external-secrets:8.0.1"
          ports:
          - name: prometheus
            containerPort: 3001
          imagePullPolicy: IfNotPresent
          env:
          - name: "AWS_DEFAULT_REGION"
            value: "us-east-1"
          - name: "AWS_REGION"
            value: "us-east-1"
          - name: "LOG_LEVEL"
            value: "info"
          - name: "LOG_MESSAGE_KEY"
            value: "msg"
          - name: "METRICS_PORT"
            value: "3001"
          - name: "POLLER_INTERVAL_MILLISECONDS"
            value: "10000"
          - name: "WATCH_TIMEOUT"
            value: "60000"
      securityContext:
        runAsNonRoot: true

Running on Kubernetes v1.20.7+k3s1.
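One quick way to gauge how often the watch is being torn down is to count the message in the pod logs. A rough sketch, assuming the Deployment above in kube-system (adjust the namespace and name to your setup):

# Count "Stopping watch stream" occurrences in the retained log window.
kubectl -n kube-system logs deploy/kubernetes-external-secrets | grep -c 'Stopping watch stream'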

@jalateras @Flydiverny

I solved this on my end. I realized I didn't have the CRDs installed, which I think is what causes this issue. As soon as I installed them, the log spamming stopped.
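If anyone else runs into this, a quick sanity check is to confirm the CRD is actually registered before digging further. A sketch, assuming the CRD name used by the KES chart is externalsecrets.kubernetes-client.io (verify against your chart version's crds/ manifest):

# A NotFound error here points at the same missing-CRD cause.
kubectl get crd externalsecrets.kubernetes-client.io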

For me, this was because the non-Helm command in the README:

helm template --output-dir ./output_dir ./charts/kubernetes-external-secrets/

doesn't include the CRD YAML.

Updated the command suggestion in the README to use --include-crds.
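For anyone rendering the manifests the same way, the updated command should look roughly like this (Helm 3's --include-crds flag also emits the files under crds/):

helm template --include-crds --output-dir ./output_dir ./charts/kubernetes-external-secrets/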

As of 8.1.1, KES checks on startup that the CRD is available.
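So upgrading should surface this failure mode at startup rather than as silent log spam. A sketch of the upgrade, assuming the release and chart names below (adjust to your repository and release; pin a chart version that ships app 8.1.1 or later):

helm upgrade kubernetes-external-secrets external-secrets/kubernetes-external-secrets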