Introduce `managedBy` field
tenzen-y opened this issue · 21 comments
What would you like to be added:
Remove the `alpha.jobset.sigs.k8s.io/managed-by` label and introduce a `.spec.managedBy` field.
@mimowo @ahg-g @alculquicondor WDYT?
/kind feature
Why is this needed:
In #407, we introduced the new `managed-by` label based on the batch/job KEP: kubernetes/enhancements#4370.
However, we have since decided to introduce a `managedBy` field to batch/job instead: kubernetes/enhancements#4530.
So I think we should use a `managedBy` field rather than a dedicated label, matching batch/job.
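For illustration, here is a minimal sketch of what the field could look like, assuming it mirrors the `managedBy` field from the batch/job KEP. The package, field shape, and comments are assumptions, not the final JobSet API:

```go
package v1alpha2

// JobSetSpec is shown here only to illustrate the proposed addition;
// existing fields are elided.
type JobSetSpec struct {
	// ... existing JobSetSpec fields ...

	// ManagedBy indicates the entity that manages this JobSet. The built-in
	// JobSet controller reconciles JobSets that don't set this field, or
	// that set it to the value reserved for the built-in controller; any
	// other value tells it to skip the object so an external controller
	// (e.g. Kueue's MultiKueue) can take ownership.
	// +optional
	ManagedBy *string `json:"managedBy,omitempty"`
}
```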
Oh, you are referring to a field. Yeah, that sounds good; consistency with Job is good.
Should we revert that PR then?
+1 for replacing with a field.
I suppose the alpha annotation already made it to the release. I would suggest keeping both mechanisms for at least one release, to facilitate upgrades to Kueue+JobSet.
> +1 for replacing with a field.
> I suppose the alpha annotation already made it to the release. I would suggest keeping both mechanisms for at least one release, to facilitate upgrades to Kueue+JobSet.
That makes sense.
I tried a quick revert of the PR, but that failed, so reverting may be a bit of a manual process.
+1 for using a field, for ecosystem consistency, but also so that we fail fast when someone tries to use it on an old version (same argument as for the k8s Job).
/assign
@trasc have you already started this? I was hoping to have a team member of mine work on this as a good first issue to ramp up on JobSet; we just hadn't assigned it to him yet, so I'd like to assign it to them if that's okay.
@danielvegamyhre Not yet.
/unassign
Thanks!
/assign @jedwins1998
@danielvegamyhre: GitHub didn't allow me to assign the following users: jedwins1998.
Note that only kubernetes-sigs members with read permissions, repo collaborators and people who have commented on this issue/PR can be assigned. Additionally, issues/PRs can only have 10 assignees at the same time.
For more information please see the contributor guide
In response to this:
> Thanks!
> /assign @jedwins1998
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
/assign
What is our deprecation policy for the managed-by label?
> What is our deprecation policy for the managed-by label?
I just realized this change was included in the v0.3.0 release. Given that, I think we should support both the label and the field for a period before removing the label. Maybe for a couple of releases? And we can emit a warning that the label is deprecated at JobSet creation time. What do you think?
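If we did keep the label for a transition period, one way to surface such a warning is from the JobSet validating webhook. This is only a sketch assuming a controller-runtime `CustomValidator`; the type name, import paths, and message text are illustrative, not the actual JobSet webhook code:

```go
package webhooks

import (
	"context"
	"fmt"

	"k8s.io/apimachinery/pkg/runtime"
	"sigs.k8s.io/controller-runtime/pkg/webhook/admission"

	jobset "sigs.k8s.io/jobset/api/jobset/v1alpha2"
)

// jobSetDeprecationWebhook is a hypothetical validator used only to
// illustrate surfacing a deprecation warning.
type jobSetDeprecationWebhook struct{}

// ValidateCreate returns an admission warning (not an error) when a JobSet
// is created with the legacy managed-by label, steering users toward
// spec.managedBy during a transition period.
func (w *jobSetDeprecationWebhook) ValidateCreate(ctx context.Context, obj runtime.Object) (admission.Warnings, error) {
	js, ok := obj.(*jobset.JobSet)
	if !ok {
		return nil, fmt.Errorf("expected a JobSet but got %T", obj)
	}
	var warnings admission.Warnings
	if _, found := js.Labels["alpha.jobset.sigs.k8s.io/managed-by"]; found {
		warnings = append(warnings,
			"the alpha.jobset.sigs.k8s.io/managed-by label is deprecated; set spec.managedBy instead")
	}
	return warnings, nil
}
```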
I brought this up in kubernetes-sigs/kueue#1870 to figure out what we want to do.
Update: we can safely drop the label entirely, per the conversation here: kubernetes-sigs/kueue#1870 (comment).
Some additional notes for implementation:
- Add a `JobSetSpec` field analogous to the one in the `JobSpec` here (see the sketch after this list).
- Replace references to the label with the field in the JobSet controller, unit tests, and integration tests.
- Regenerate manifests (the JobSet CRD, RBACs, service accounts, etc., which are generated from the API definition and kubebuilder markers): `make manifests`
- Regenerate the Go and Python SDKs: `make generate`
- Ensure all unit, integration, and e2e tests pass:
  - unit: `make test`
  - integration: `make test-integration`
  - e2e: `make test-e2e-kind`
- Run `make verify` to run the linter, go vet, etc., and make sure api/, config/, and client-go/ are all updated together.
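As a rough sketch of the controller-side change, assuming the spec field from the earlier sketch (the default value and helper name here are assumptions, analogous to batch/job's reserved `kubernetes.io/job-controller` value):

```go
package controllers

import jobset "sigs.k8s.io/jobset/api/jobset/v1alpha2"

// defaultManagedBy is the value reserved for the built-in controller; the
// exact string is an assumption, not necessarily JobSet's actual default.
const defaultManagedBy = "jobset.sigs.k8s.io/jobset-controller"

// managedByExternalController reports whether another controller has claimed
// the JobSet via spec.managedBy, in which case the built-in controller
// should skip reconciling it (previously decided by the managed-by label).
func managedByExternalController(js *jobset.JobSet) bool {
	return js.Spec.ManagedBy != nil && *js.Spec.ManagedBy != defaultManagedBy
}
```

In the reconcile loop this check would replace the current label lookup, and the unit and integration tests would assert on `spec.managedBy` instead of the label.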
@jedwins1998 you can generalize this into an improved contributor guide like you were talking about as well
Created #487 as an initial implementation.
PR has been merged. Will this issue auto-close or is there something I need to do?
> PR has been merged. Will this issue auto-close or is there something I need to do?
It will auto-close if you add "Fixes #issueNumber" to the PR description. We can close this one manually. Thanks for your work on this!