linuxkit/kubernetes

e2e tests make apiserver crash on d4m

errordeveloper opened this issue · 1 comment

I noticed this while working on #35. I'm not sure whether it happens because I'm running the 1.9 e2e tests against a 1.8 apiserver; TBC.

E0105 16:31:58.397048       1 runtime.go:66] Observed a panic: "duplicate node port" (duplicate node port)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:72
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:65
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:51
/usr/local/go/src/runtime/asm_amd64.s:514
/usr/local/go/src/runtime/panic.go:489
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/registry/core/service/rest.go:598
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/registry/core/service/rest.go:341
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:910
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:1187
/usr/local/go/src/runtime/asm_amd64.s:2197
panic: duplicate node port [recovered]
	panic: duplicate node port

goroutine 3779 [running]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0xc42abd5f78, 0x1, 0x1)
	/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:58 +0x126
panic(0x2fd7c80, 0xc423a444e0)
	/usr/local/go/src/runtime/panic.go:489 +0x2cf
k8s.io/kubernetes/pkg/registry/core/service.(*REST).updateNodePorts(0xc420d34c30, 0xc42a2950e0, 0xc42a295860, 0xc42abd5e50, 0xc420d34c01, 0x810b380)
	/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/registry/core/service/rest.go:598 +0x262
k8s.io/kubernetes/pkg/registry/core/service.(*REST).Update(0xc420d34c30, 0x7f7690822d78, 0xc4256b1e00, 0xc42a315fc5, 0x17, 0x8113c80, 0xc429cb9480, 0x0, 0x0, 0xf02300, ...)
	/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/pkg/registry/core/service/rest.go:341 +0xb12
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.UpdateResource.func1.2(0xc400000018, 0x39db590, 0xc420d57778, 0x1)
	/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:910 +0x114
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.finishRequest.func1(0xc4239d5860, 0xc42ab0f5e0, 0xc4239d5800, 0xc4239d57a0)
	/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:1187 +0x99
created by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.finishRequest
	/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:1192 +0xd9

Looks like kubernetes/kubernetes#49258 fixed the panic and introduced test cases for it, which is why we only hit this on 1.8. As this repo ships 1.9 now, I'll close this.
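For reference, here is a minimal sketch of the pattern that fix describes, not the actual Kubernetes source: when collecting a Service's node ports during an update, a duplicate nodePort should surface as an error the apiserver can return to the client, instead of the `panic("duplicate node port")` seen in the trace above. The `ServicePort` struct and `collectNodePorts` helper below are simplified stand-ins for illustration.

```go
package main

import "fmt"

// ServicePort is a simplified stand-in for the Kubernetes ServicePort type.
type ServicePort struct {
	Name     string
	NodePort int32
}

// collectNodePorts returns the set of node ports used by the given service
// ports, or an error if the same node port appears more than once. The
// pre-fix code path panicked at this point, which is what crashed the
// apiserver during the e2e run.
func collectNodePorts(ports []ServicePort) (map[int32]bool, error) {
	seen := map[int32]bool{}
	for _, p := range ports {
		if p.NodePort == 0 {
			continue // unallocated; nothing to record
		}
		if seen[p.NodePort] {
			return nil, fmt.Errorf("duplicate nodePort: %d", p.NodePort)
		}
		seen[p.NodePort] = true
	}
	return seen, nil
}

func main() {
	// Two ports that end up with the same nodePort, roughly what the e2e
	// test provokes during a service update.
	ports := []ServicePort{
		{Name: "http", NodePort: 30080},
		{Name: "https", NodePort: 30080},
	}
	if _, err := collectNodePorts(ports); err != nil {
		fmt.Println("rejected update:", err) // rejected update: duplicate nodePort: 30080
	}
}
```

With the error returned instead of a panic, the bad update is rejected with a validation failure and the apiserver keeps running.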