acryldata/datahub-helm

DataHub upgrade jobs share environment variables

MadsHT opened this issue · 2 comments

MadsHT commented

Describe the bug
In values.yaml we can set the resources available to each job, but giving the pod more resources does not increase the JVM heap size, so the application does not utilize the resources it is given (see issue #344).

So if I want to run the restore-indices job with more memory than the 256Mi / 512Mi it has by default, I can change the resources available to the pod in values.yaml, but values.yaml does not expose a way to add environment variables that make the JVM use the extra memory.

I looked at datahub-restore-indices-job-template.yml and found .Values.datahubUpgrade.extraEnvs. The problem is that these environment variables are shared across all the upgrade jobs, so increasing the resources of the restore-indices job and adding _JAVA_OPTIONS: -Xms1024m -Xmx1536m as an environment variable in values.yaml causes all the other jobs to try to use as much memory as the environment variable specifies, resulting in those jobs getting OOMKilled.

To Reproduce
Steps to reproduce the behavior:

  1. Increase the memory for restoreIndices to 1Gi
  2. Add
     - name: _JAVA_OPTIONS
       value: -Xms1024m -Xmx1536m
     to .Values.datahubUpgrade.extraEnvs in the values.yaml file.
  3. The nocode-migration-job, which keeps its default memory limit, will then also try to allocate a 1024m heap, and Kubernetes should OOMKill it.
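The steps above correspond to a values.yaml fragment like the following. This is a sketch: the resource and _JAVA_OPTIONS values come from the steps above, while the exact nesting of the restoreIndices resources block is assumed from the chart's layout at the time.

```yaml
datahubUpgrade:
  # Step 1: raise the memory limit for the restore-indices job only.
  restoreIndices:
    resources:
      limits:
        memory: 1Gi

  # Step 2: extraEnvs is shared by ALL upgrade jobs
  # (restore-indices, nocode-migration, ...), so this heap
  # setting also applies to jobs that kept their default limits.
  extraEnvs:
    - name: _JAVA_OPTIONS
      value: -Xms1024m -Xmx1536m
```

Because -Xms1024m asks for a 1024m initial heap, any upgrade job still running under its default memory limit will exceed that limit and be OOMKilled.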

I have not fully tested this; I observed the behavior as part of an upgrade.

Expected behavior
I expect the chart to allow me to add environment variables for each job.
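For example, the chart could expose per-job environment variables alongside the existing per-job resources. The restoreIndices.extraEnvs key below is hypothetical; it does not exist in the chart today and is only a sketch of the requested behavior.

```yaml
datahubUpgrade:
  restoreIndices:
    resources:
      limits:
        memory: 1Gi
    # Hypothetical per-job override: applies only to the
    # restore-indices job, not to the other upgrade jobs.
    extraEnvs:
      - name: _JAVA_OPTIONS
        value: -Xms1024m -Xmx1536m
```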


This issue is stale because it has been open for 30 days with no activity. If you believe this is still an issue on the latest DataHub release please leave a comment with the version that you tested it with. If this is a question/discussion please head to https://slack.datahubproject.io. For feature requests please use https://feature-requests.datahubproject.io

This issue was closed because it has been inactive for 30 days since being marked as stale.