TypeError: issubclass() arg 1 must be a class
BenisonSam opened this issue · 3 comments
BenisonSam commented
Following is my code:
```python
from enum import Enum
from typing import Optional

from pydantic import Field, conint
from sparkdantic import SparkModel


class FooBar(SparkModel):
    count: int = Field(..., title='Count')
    size: Optional[float] = Field(None, title='Size')


class Gender(str, Enum):
    male = 'male'
    female = 'female'
    other = 'other'
    not_given = 'not_given'


class Main(SparkModel):
    foo_bar: FooBar
    Gender: Optional[Gender]
    snap: Optional[conint(lt=50, gt=30)] = Field(
        42, description='this is the value of snap', title='The Snap'
    )


print(Main.model_spark_schema())
```
When I execute it, I get the following error:
```
Traceback (most recent call last):
  File "model.py", line 27, in <module>
    print(Main.model_spark_schema())
  File "\env\lib\site-packages\sparkdantic\model.py", line 241, in model_spark_schema
    t, nullable = cls._type_to_spark(v.annotation)
  File "\env\lib\site-packages\sparkdantic\model.py", line 355, in _type_to_spark
    if issubclass(t, Enum):
TypeError: issubclass() arg 1 must be a class
```
Is there something I am missing or doing wrong? I have not been able to debug or fix this issue myself.
BenisonSam commented
Further analysis shows that this error occurs for `typing.Optional`. Could you please confirm and suggest a workaround?
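For reference, a minimal sketch of the check behind that analysis (assuming Pydantic v2, where each `FieldInfo` exposes the raw annotation; field names match the `Main` model above):

```python
# Print each field's raw annotation and whether it is a plain class.
for name, field in Main.model_fields.items():
    print(name, field.annotation, isinstance(field.annotation, type))

# foo_bar's annotation is a class; the two Optional fields are typing
# constructs (Union aliases), which is why Optional looks like the trigger.
```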
mitchelllisle commented
Hi there, it's actually happening because you're using `conint`. I'll work on fixing this. One thing to note, though: I will add better support for the `Annotated` type as opposed to Pydantic's implementation of `conint`, since they discourage people from using it anyway. See here for the warning they have in their code base.
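For the record, a minimal sketch of why `issubclass` chokes here (assuming Pydantic v2, where `conint()` returns an `Annotated` alias rather than a class):

```python
from enum import Enum

from pydantic import conint

t = conint(lt=50, gt=30)
print(isinstance(t, type))  # False: t is Annotated[int, ...], not a class

try:
    issubclass(t, Enum)
except TypeError as err:
    print(err)  # issubclass() arg 1 must be a class
```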
I will add support for this type of implementation:
```python
from typing import Annotated

from pydantic import Field
from sparkdantic import SparkModel


class Foo(SparkModel):
    bar: Annotated[int, Field(strict=True, gt=0)]
```
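Once that support lands, the `snap` field from the example above could be written in the same style, which Pydantic itself recommends over `conint` (a sketch of the equivalent constraint):

```python
from typing import Annotated, Optional

from pydantic import Field
from sparkdantic import SparkModel


class Main(SparkModel):
    # Same constraint as conint(lt=50, gt=30), in the Annotated form.
    snap: Optional[Annotated[int, Field(lt=50, gt=30)]] = Field(
        42, description='this is the value of snap', title='The Snap'
    )
```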
mitchelllisle commented