Feature: Move Spark Operator out of Reference Implementation
Have you searched for this feature request?
- I searched but did not find similar requests
Problem Statement
The Spark operator consumes resources and does not deploy on Apple M-series machines.
Possible Solution
Can we move the Spark Operator into its own stack?
The challenge comes from https://github.com/cnoe-io/stacks/tree/main/ref-implementation/backstage-templates/entities/argo-workflows, where we use a Spark job to demonstrate an Argo Workflows Backstage template. We would need to move this template into the Spark operator repo (and potentially create another workflow example that does not need Spark). However, registering the template from the Spark repo is challenging because we use a ConfigMap:
https://github.com/cnoe-io/stacks/blob/main/ref-implementation/backstage/manifests/install.yaml#L169
to link to the template catalog.
So we would need to find a way to add this file to the catalog after deployment. A minimal sketch of one option is shown below.
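For illustration only, one way this could work is a standard Backstage catalog `Location` entity pointing at the template once it lives alongside the Spark operator; the target path below is hypothetical, and the entity would still need to be registered after deployment (for example via the catalog import flow or an extra entry under `catalog.locations` in the Backstage app-config):

```yaml
# Sketch of a Backstage catalog Location entity; the target URL is a
# hypothetical location for the template after it moves out of ref-implementation.
apiVersion: backstage.io/v1alpha1
kind: Location
metadata:
  name: spark-operator-templates
  description: Backstage templates shipped with the Spark operator stack
spec:
  type: url
  targets:
    # Hypothetical path; actual location would be decided when the template moves
    - https://github.com/cnoe-io/stacks/blob/main/spark-operator/backstage-templates/template.yaml
```

This keeps the reference implementation's ConfigMap untouched, but it does mean each stack that ships templates needs its own registration step after deployment.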
Alternatives Considered
No response
I fully agree that Spark should be removed from the core or reference implementation.