databrickslabs/dbx

dbx execute command fails on Databricks Runtime 11.x

cristian-rincon opened this issue · 2 comments

Expected Behavior

dbx execute should run the specified task on a Databricks cluster.

Current Behavior

The command is failing with this error:

✅ Uploading package - done
⠦ Installing package on the cluster 📦[dbx][2023-07-13 17:11:12.437] Execution failed, please follow the given error
java.lang.IllegalStateException: jupyter client is not available because the python kernel is not defined. The kernel may be restarting or the repl may have been shut down.
        at com.databricks.backend.daemon.driver.JupyterDriverLocal.$anonfun$getJupyterKernelListener$2(JupyterDriverLocal.scala:307)
        at scala.Option.getOrElse(Option.scala:189)
        at com.databricks.backend.daemon.driver.JupyterDriverLocal.com$databricks$backend$daemon$driver$JupyterDriverLocal$$getJupyterKernelListener(JupyterDriverLocal.scala:306)
        at com.databricks.backend.daemon.driver.JupyterDriverLocal.executePython(JupyterDriverLocal.scala:562)
        at com.databricks.backend.daemon.driver.JupyterDriverLocal.repl(JupyterDriverLocal.scala:524)
        at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$24(DriverLocal.scala:879)
        at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
        at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$21(DriverLocal.scala:862)
        at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:412)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
        at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:158)
        at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:410)
        at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:407)
        at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:69)
        at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:455)
        at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:440)
        at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:69)
        at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:839)
        at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:660)
        at scala.util.Try$.apply(Try.scala:213)
        at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:652)
        at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:571)
        at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:606)
        at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:448)
        at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:389)
        at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:247)
        at java.lang.Thread.run(Thread.java:750)

Steps to Reproduce (for bugs)

  1. Run the command (a sketch of the referenced deployment file is shown below): python -m dbx execute --deployment-file=./deployment.yml.j2 --jinja-variables-file=./vars.dbx.yml --cluster-name=MYCLUSTER 'My Workflow' --task 'my-task'
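
For context, the command expects a dbx deployment file defining the workflow and task named on the command line. The following is only a hypothetical minimal sketch of what deployment.yml.j2 might look like: the workflow and task names match the command above, but the file path is a placeholder, and the Jinja variables supplied via vars.dbx.yml are omitted.

    # deployment.yml.j2 -- hypothetical minimal layout (python_file path is a
    # placeholder, not taken from the original report)
    environments:
      default:
        workflows:
          - name: "My Workflow"
            tasks:
              - task_key: "my-task"
                spark_python_task:
                  python_file: "file://my_package/my_task.py"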

Context

Your Environment

  • dbx version used: 0.8.10
  • Databricks Runtime version: 11.3 LTS ML (includes Apache Spark 3.3.2, Scala 2.12), 12.2 LTS ML (includes Apache Spark 3.3.2, Scala 2.12)

Hi @cristian-rincon,
could you please upgrade dbx to the latest version (0.8.17) and check if that works for you?
There was a similar-looking issue, which I resolved in one of the recent versions.
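
For reference, assuming dbx was installed from PyPI, the upgrade can be done with pip:

    pip install --upgrade "dbx==0.8.17"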

That worked. Thanks!