kubeflow/spark-operator

[BUG] spark-operator-spark serviceaccount lacks necessary permissions

peter-mcclonski opened this issue · 2 comments

Description

Execution of example jobs fails due to insufficient privileges for the spark-operator-spark service account in the default namespace.

  • ✋ I have searched the open/closed issues and my issue is not listed.

Reproduction Code [Required]

Steps to reproduce the behavior:

Expected behavior

Documentation for the examples should explain the need to specify `--set sparkJobNamespaces=default`, OR the default value of `sparkJobNamespaces` should be `[default]`.

Actual behavior

The job fails due to the configured service account lacking access to the default namespace.
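One way to confirm the missing permissions is to query RBAC directly. This is a hedged sketch; the service account and namespace names are taken from the description above, and the exact verbs/resources Spark needs may differ in your setup:

```shell
# Check whether the spark-operator-spark service account in the default
# namespace is allowed to create driver/executor pods there.
kubectl auth can-i create pods \
  --as=system:serviceaccount:default:spark-operator-spark \
  --namespace=default
# "no" here is consistent with the failure described in this issue.
```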

Terminal Output Screenshot(s)


Environment & Versions

  • Spark Operator App version: 1.4.5
  • Helm Chart Version: 1.2.14
  • Kubernetes Version: 1.29.3
  • Apache Spark version: 3.5.0

Additional context

The core issue here seems to be reflected in this section of the docs: https://github.com/kubeflow/spark-operator/blob/master/docs/quick-start-guide.md#about-the-spark-job-namespace

The bug is effectively an incorrect default in `values.yaml`: the chart supplies an empty list (`[]`) rather than `[""]`.
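As a workaround until the default is fixed, the namespace can be set explicitly at install time. A minimal sketch, assuming the chart is installed from the standard `spark-operator` Helm repo (release and namespace names here are illustrative):

```shell
# Install the operator with RBAC scoped to the "default" namespace,
# so the spark-operator-spark service account gets the needed Role/RoleBinding.
helm install spark-operator spark-operator/spark-operator \
  --namespace spark-operator \
  --create-namespace \
  --set "sparkJobNamespaces={default}"
```

Helm's `{a,b}` syntax is how list values are passed via `--set`; passing a bare `default` may be coerced differently depending on the chart's schema.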

I believe this was introduced by #1988