databrickslabs/dbx

Error when importing NamedJobsService

walid781 opened this issue · 4 comments

Expected Behavior

Current Behavior

Steps to Reproduce (for bugs)

When importing in a Databricks notebook (from dbx.api.services.jobs import NamedJobsService), I get this error message:
TypeError: issubclass() arg 1 must be a class

The same import works perfectly when running locally or in AWS Lambda.

Context

Running a notebook in Databricks.

  • dbx version used: 0.8.7
  • Databricks Runtime version: 12.1

Hi, I'm getting the same error using:

dbx version: 0.8.14
Databricks runtime: 11.3.x-cpu-ml-scala2.12

It's working again in the Databricks notebook, without any modification on my side. Maybe it was a temporary issue.

I am having the same issue.

The following is my environment:

  • cloud: Azure
  • dbx: 0.8.6
  • spark_version: 11.3.x-cpu-ml-scala2.12

Complete Trace

TypeError: issubclass() arg 1 must be a class
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<command--1> in <cell line: 12>()
     11 
     12 with open(filename, "rb") as f:
---> 13   exec(compile(f.read(), filename, 'exec'))
     14 

/tmp/tmp_ytej91r.py in <module>
      1 import os
      2 from databricks_cli.sdk import ApiClient
----> 3 from dbx.api.launch.runners.standard import StandardLauncher
      4 from dbx.models.cli.options import ExistingRunsOption
      5 from brz_smk_sku_demand_forecast.common import Task

/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    169             # Import the desired module. If you’re seeing this while debugging a failed import,
    170             # look at preceding stack frames for relevant error information.
--> 171             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    172 
    173             is_root_import = thread_local._nest_level == 1

/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/dbx/api/launch/runners/standard.py in <module>
      6 from dbx.api.launch.functions import wait_run, cancel_run
      7 from dbx.api.launch.runners.base import RunData
----> 8 from dbx.api.services.jobs import NamedJobsService
      9 from dbx.models.cli.options import ExistingRunsOption
     10 from dbx.models.workflow.v2dot0.parameters import StandardRunPayload as V2dot0StandardRunPayload

/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    169             # Import the desired module. If you’re seeing this while debugging a failed import,
    170             # look at preceding stack frames for relevant error information.
--> 171             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    172 
    173             is_root_import = thread_local._nest_level == 1

/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/dbx/api/services/jobs.py in <module>
      6 
      7 from dbx.api.adjuster.mixins.base import ApiClientMixin
----> 8 from dbx.api.services._base import WorkflowBaseService
      9 from dbx.models.workflow.common.flexible import FlexibleModel
     10 from dbx.models.workflow.v2dot0.workflow import Workflow as V2dot0Workflow

/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    169             # Import the desired module. If you’re seeing this while debugging a failed import,
    170             # look at preceding stack frames for relevant error information.
--> 171             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    172 
    173             is_root_import = thread_local._nest_level == 1

/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/dbx/api/services/_base.py in <module>
      3 
      4 from dbx.api.adjuster.mixins.base import ApiClientMixin
----> 5 from dbx.models.deployment import AnyWorkflow
      6 
      7 

/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    169             # Import the desired module. If you’re seeing this while debugging a failed import,
    170             # look at preceding stack frames for relevant error information.
--> 171             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    172 
    173             is_root_import = thread_local._nest_level == 1

/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/dbx/models/deployment.py in <module>
     12 from dbx.models.files.project import EnvironmentInfo
     13 from dbx.models.workflow.common.flexible import FlexibleModel
---> 14 from dbx.models.workflow.common.pipeline import Pipeline
     15 from dbx.models.workflow.common.workflow_types import WorkflowType
     16 from dbx.models.workflow.v2dot0.workflow import Workflow as V2dot0Workflow

/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    169             # Import the desired module. If you’re seeing this while debugging a failed import,
    170             # look at preceding stack frames for relevant error information.
--> 171             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    172 
    173             is_root_import = thread_local._nest_level == 1

/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/dbx/models/workflow/common/pipeline.py in <module>
     35 
     36 
---> 37 class Pipeline(AccessControlMixin):
     38     name: str
     39     pipeline_id: Optional[str]

/databricks/python/lib/python3.9/site-packages/pydantic/main.cpython-39-x86_64-linux-gnu.so in pydantic.main.ModelMetaclass.__new__()

/databricks/python/lib/python3.9/site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField.infer()

/databricks/python/lib/python3.9/site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField.__init__()

/databricks/python/lib/python3.9/site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField.prepare()

/databricks/python/lib/python3.9/site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField._type_analysis()

/usr/lib/python3.9/typing.py in __subclasscheck__(self, cls)
    833             return issubclass(cls.__origin__, self.__origin__)
    834         if not isinstance(cls, _GenericAlias):
--> 835             return issubclass(cls, self.__origin__)
    836         return super().__subclasscheck__(cls)
    837 

TypeError: issubclass() arg 1 must be a class

Apparently the problem is with pydantic. Coincidentally, the 11.3.x-cpu-ml-scala2.12 runtime ships the problematic pydantic version (1.9.2, the version used in the repro below).
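
For reference, the underlying TypeError is easy to trigger directly: issubclass() only accepts a real class as its first argument, and pydantic 1.9's field analysis ends up handing typing.py something that isn't one. A plain-Python illustration of the mechanism (not the exact dbx field involved):

from typing import Dict

# A subscripted generic like Dict[str, str] is a typing._GenericAlias,
# not a class, so the builtin check rejects it:
issubclass(Dict[str, str], dict)
# TypeError: issubclass() arg 1 must be a class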

Steps to reproduce the issue:

pyenv install 3.9.16
pyenv virtualenv 3.9.16 dbx-pydantic
pyenv activate dbx-pydantic
pip install pydantic==1.9.2
pip install dbx==0.8.6
python
>>> from dbx.api.services.jobs import NamedJobsService
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/dgarridoa/.pyenv/versions/dbx-pydantic/lib/python3.9/site-packages/dbx/api/services/jobs.py", line 8, in <module>
    from dbx.api.services._base import WorkflowBaseService
  File "/home/dgarridoa/.pyenv/versions/dbx-pydantic/lib/python3.9/site-packages/dbx/api/services/_base.py", line 5, in <module>
    from dbx.models.deployment import AnyWorkflow
  File "/home/dgarridoa/.pyenv/versions/dbx-pydantic/lib/python3.9/site-packages/dbx/models/deployment.py", line 14, in <module>
    from dbx.models.workflow.common.pipeline import Pipeline
  File "/home/dgarridoa/.pyenv/versions/dbx-pydantic/lib/python3.9/site-packages/dbx/models/workflow/common/pipeline.py", line 37, in <module>
    class Pipeline(AccessControlMixin):
  File "pydantic/main.py", line 205, in pydantic.main.ModelMetaclass.__new__
  File "pydantic/fields.py", line 491, in pydantic.fields.ModelField.infer
  File "pydantic/fields.py", line 421, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 537, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 641, in pydantic.fields.ModelField._type_analysis
  File "/home/dgarridoa/.pyenv/versions/3.9.16/lib/python3.9/typing.py", line 852, in __subclasscheck__
    return issubclass(cls, self.__origin__)
TypeError: issubclass() arg 1 must be a class

Solved by upgrading pydantic to 1.10 (1.10.8):

pip install -U pydantic
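
In a Databricks notebook the same fix can be applied per session before importing dbx (a sketch; %pip and dbutils.library.restartPython() are available on recent runtimes, and the exact pin below is an assumption, pick whatever matches your constraints):

%pip install "pydantic>=1.10,<2"

# Then, in a separate cell, restart the Python process so the
# upgraded pydantic is actually picked up by subsequent imports:
dbutils.library.restartPython()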