Can't upgrade to k8s.io/api v0.28.0-alpha.4
tonsV2 opened this issue · 3 comments
Dependabot is trying to upgrade my project with the following dependencies:
k8s.io/api v0.28.0-alpha.4
k8s.io/apimachinery v0.28.0-alpha.4
k8s.io/client-go v0.28.0-alpha.4
However, when I try to compile, I get the following error:
#14 [build 8/8] RUN go build -o /app/im-manager -ldflags "-s -w" ./cmd/serve
#14 31.08 # sigs.k8s.io/kustomize/kyaml/openapi
#14 31.08 /go/pkg/mod/sigs.k8s.io/kustomize/kyaml@v0.13.9/openapi/openapi.go:656:33: cannot use doc (variable of type *"github.com/google/gnostic/openapiv2".Document) as *"github.com/google/gnostic-models/openapiv2".Document value in argument to swagger.FromGnostic
#14 ERROR: process "/bin/sh -c go build -o /app/im-manager -ldflags \"-s -w\" ./cmd/serve" did not complete successfully: exit code: 1
The project in question can be found here.
Please let me know if I should add more information or ask in a different place.
Maybe kubernetes/client-go#1075 would help resolve your issue.
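For anyone hitting the same build failure: the error indicates that kyaml v0.13.9 still imports github.com/google/gnostic, while the v0.28 k8s.io modules have moved to github.com/google/gnostic-models, so the two Document types no longer match. A possible workaround (a sketch, assuming a newer kyaml release that has made the same gnostic-models migration is available) is to bump kyaml explicitly:

```shell
# Sketch of a workaround; the exact kyaml version to use is an assumption,
# pick one that depends on github.com/google/gnostic-models.
go get sigs.k8s.io/kustomize/kyaml@latest
go mod tidy

# Rebuild to confirm the type mismatch is gone.
go build -o /app/im-manager -ldflags "-s -w" ./cmd/serve
```

Checking `go mod graph | grep gnostic` afterwards can confirm that only gnostic-models remains in the dependency graph.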
The Kubernetes project currently lacks enough contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue as fresh with /remove-lifecycle stale
- Close this issue with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle stale
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue as fresh with /remove-lifecycle rotten
- Close this issue with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle rotten