panic in releaser.go:189
huybrechts opened this issue · 6 comments
I tried this out on a Kubernetes 1.21.10 cluster (RKE, VMware vSphere).
I followed the examples, and the first pod starts nicely with the generated PVC, and the vSphere CSI driver dynamically provisioned a volume.
However, after removing the pod and starting a new one, a new volume is created instead of the old one being reused.
The first volume is retained in the 'Released' state.
The releaser logs show the error below. Am I missing something?
I0426 15:22:50.780443 1 leaderelection.go:243] attempting to acquire leader lease default/releaser-reclaimable-pv-releaser...
I0426 15:23:54.735339 1 leaderelection.go:253] successfully acquired lease default/releaser-reclaimable-pv-releaser
I0426 15:23:54.735447 1 leader.go:82] I am the leader now: 34b83774-3474-4921-bc81-cd3c9bbe6d31
I0426 15:23:54.735549 1 releaser.go:60] Releaser starting...
I0426 15:23:54.735780 1 controller.go:243] Starting reclaimable-pv-releaser controller
I0426 15:23:54.836498 1 controller.go:252] Started reclaimable-pv-releaser controller
E0426 15:29:08.814156 1 runtime.go:78] Observed a panic: "assignment to entry in nil map" (assignment to entry in nil map)
goroutine 79 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic(0x160a5a0, 0x198d150)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/runtime/runtime.go:74 +0x95
k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/runtime/runtime.go:48 +0x86
panic(0x160a5a0, 0x198d150)
/usr/local/go/src/runtime/panic.go:965 +0x1b9
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers/releaser.(*Releaser).pvAssociateHandler(0xc0003fd540, 0xc00025e500, 0x17fea71, 0x22)
/go/delivery/releaser/releaser.go:189 +0x87e
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers/releaser.(*Releaser).pvSyncHandler(0xc0003fd540, 0x0, 0x0, 0xc00012c0f0, 0x28, 0x1565340, 0xc00051a210)
/go/delivery/releaser/releaser.go:150 +0x4c5
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers.(*BasicController).ProcessNextWorkItem.func1(0x19e16f8, 0xc0004089a0, 0xc00017cc90, 0x1565340, 0xc00051a120, 0x0, 0x0)
/go/delivery/controller.go:312 +0x138
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers.(*BasicController).ProcessNextWorkItem(0xc0003fd540, 0x17e267b, 0x2, 0x19e16f8, 0xc0004089a0, 0xc00017cc90, 0x203001)
/go/delivery/controller.go:323 +0x1da
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers.(*BasicController).RunWorker.func1()
/go/delivery/controller.go:280 +0x99
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc00035c5c0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/wait/wait.go:155 +0x5f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc00035c5c0, 0x19a13a0, 0xc0001b6420, 0x1, 0xc00009aea0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/wait/wait.go:156 +0x9b
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc00035c5c0, 0x3b9aca00, 0x0, 0x1, 0xc00009aea0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/wait/wait.go:133 +0x98
k8s.io/apimachinery/pkg/util/wait.Until(0xc00035c5c0, 0x3b9aca00, 0xc00009aea0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/wait/wait.go:90 +0x4d
created by github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers/releaser.(*Releaser).Run.func1
/go/delivery/releaser/releaser.go:107 +0x22f
panic: assignment to entry in nil map [recovered]
panic: assignment to entry in nil map
goroutine 79 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/runtime/runtime.go:55 +0x109
panic(0x160a5a0, 0x198d150)
/usr/local/go/src/runtime/panic.go:965 +0x1b9
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers/releaser.(*Releaser).pvAssociateHandler(0xc0003fd540, 0xc00025e500, 0x17fea71, 0x22)
/go/delivery/releaser/releaser.go:189 +0x87e
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers/releaser.(*Releaser).pvSyncHandler(0xc0003fd540, 0x0, 0x0, 0xc00012c0f0, 0x28, 0x1565340, 0xc00051a210)
/go/delivery/releaser/releaser.go:150 +0x4c5
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers.(*BasicController).ProcessNextWorkItem.func1(0x19e16f8, 0xc0004089a0, 0xc00017cc90, 0x1565340, 0xc00051a120, 0x0, 0x0)
/go/delivery/controller.go:312 +0x138
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers.(*BasicController).ProcessNextWorkItem(0xc0003fd540, 0x17e267b, 0x2, 0x19e16f8, 0xc0004089a0, 0xc00017cc90, 0x203001)
/go/delivery/controller.go:323 +0x1da
github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers.(*BasicController).RunWorker.func1()
/go/delivery/controller.go:280 +0x99
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc00035c5c0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/wait/wait.go:155 +0x5f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc00035c5c0, 0x19a13a0, 0xc0001b6420, 0x1, 0xc00009aea0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/wait/wait.go:156 +0x9b
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc00035c5c0, 0x3b9aca00, 0x0, 0x1, 0xc00009aea0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/wait/wait.go:133 +0x98
k8s.io/apimachinery/pkg/util/wait.Until(0xc00035c5c0, 0x3b9aca00, 0xc00009aea0)
/go/pkg/mod/k8s.io/apimachinery@v0.21.2/pkg/util/wait/wait.go:90 +0x4d
created by github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers/releaser.(*Releaser).Run.func1
/go/delivery/releaser/releaser.go:107 +0x22f
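For context on the panic message itself: in Go, a map that is declared but never initialized is nil; reading from it is safe, but writing to it panics at runtime with exactly this error. A minimal standalone reproduction (not the releaser's actual code):

```go
package main

func main() {
	// A declared but never initialized map is nil.
	var labels map[string]string

	// Reading from a nil map is safe and returns the zero value.
	_ = labels["key"]

	// Writing to a nil map panics at runtime:
	// panic: assignment to entry in nil map
	labels["key"] = "value"
}
```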
Hey @huybrechts, thanks for the interest in this project.
Seems like DeepCopy in https://github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers/blob/main/releaser/releaser.go#L188 did not copy the labels for some reason; it might be an API incompatibility. I'm pretty sure the latest Kubernetes version I tested this on was 1.19, so I'll have to try upgrading the dependencies first.
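For what it's worth, DeepCopy preserves a nil Labels map as nil rather than allocating an empty one, so if the PV had no labels to begin with, any subsequent label assignment on the copy would panic exactly like this. A hypothetical sketch of that scenario (the label key is made up; the handler's real body may differ):

```go
package main

import (
	corev1 "k8s.io/api/core/v1"
)

func main() {
	// A PV whose ObjectMeta.Labels was never set, so it is nil.
	pv := &corev1.PersistentVolume{}

	// DeepCopy faithfully copies a nil Labels map as nil; it does not
	// allocate an empty map on the copy.
	pvCopy := pv.DeepCopy()

	// This write therefore panics with "assignment to entry in nil map".
	pvCopy.Labels["some-label"] = "some-value"
}
```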
@huybrechts I just released https://github.com/plumber-cd/kubernetes-dynamic-reclaimable-pvc-controllers/releases/tag/v0.1.0-alpha1
I haven't had a chance to test it; all I did was bump the Go and dependency versions. If you can give it a shot and let me know whether it fixed the issue, that'd be much appreciated.
That was quick, thanks! I'll test.
@dee-kryvenko Same error, sorry.
I see, it was worth a shot. I'll have to find some time to debug this.
@huybrechts, upon closer look, that was actually a simple nil map bug in my code, so it wasn't related to the Kubernetes version. This is now fixed in v0.1.0 - please let me know if there are still any issues.
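For anyone hitting a similar panic in their own controllers: the usual fix for this class of bug is to initialize the map before the first write. A minimal sketch of the pattern (the label key is made up, and this isn't necessarily the exact change that went into v0.1.0):

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	pv := &corev1.PersistentVolume{} // Labels is nil here
	pvCopy := pv.DeepCopy()

	// Guard against a nil map before assigning to it.
	if pvCopy.Labels == nil {
		pvCopy.Labels = map[string]string{}
	}
	pvCopy.Labels["example.com/some-label"] = "some-value"

	fmt.Println(pvCopy.Labels)
}
```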