TypeError: OrthogonalRegularizer.__init__() got an unexpected keyword argument 'num_features'
Allaye opened this issue · 2 comments
Describe the bug
I encountered a TypeError while trying to load the pointnet_segmentation model from Hugging Face using the from_pretrained_keras function. The error message indicates that the __init__ method of OrthogonalRegularizer does not accept the num_features argument.
Reproduction
# Install the required packages
!pip install -U keras==2.12.0 huggingface_hub

# Import the necessary library
from huggingface_hub import from_pretrained_keras

# Attempt to load the PointNet segmentation model
model = from_pretrained_keras("keras-io/pointnet_segmentation")
Logs
TypeError Traceback (most recent call last)
<ipython-input-4-e8c7f05e3ba1> in <cell line: 4>()
2 # !pip install -U keras==2.12.0 huggingface_hub
3 from huggingface_hub import from_pretrained_keras
----> 4 model = from_pretrained_keras("keras-io/pointnet_segmentation")
5 #Wmodel.summary("keras/")
6 frames
/usr/local/lib/python3.10/dist-packages/tf_keras/src/regularizers.py in from_config(cls, config)
188 A regularizer instance.
189 """
--> 190 return cls(**config)
191
192 def get_config(self):
TypeError: OrthogonalRegularizer.__init__() got an unexpected keyword argument 'num_features'
System info
- huggingface_hub version: 0.26.2
- Platform: Linux-6.1.85+-x86_64-with-glibc2.35
- Python version: 3.10.12
- Running in iPython ?: Yes
- iPython shell: Shell
- Running in notebook ?: Yes
- Running in Google Colab ?: Yes
- Running in Google Colab Enterprise ?: No
- Token path ?: /root/.cache/huggingface/token
- Has saved token ?: False
- Configured git credential helpers:
- FastAI: 2.7.18
- Tensorflow: 2.17.1
- Torch: 2.5.1+cu121
- Jinja2: 3.1.4
- Graphviz: 0.20.3
- keras: 3.5.0
- Pydot: 3.0.2
- Pillow: 11.0.0
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: 1.26.4
- pydantic: 2.9.2
- aiohttp: 3.11.2
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /root/.cache/huggingface/hub
- HF_ASSETS_CACHE: /root/.cache/huggingface/assets
- HF_TOKEN_PATH: /root/.cache/huggingface/token
- HF_STORED_TOKENS_PATH: /root/.cache/huggingface/stored_tokens
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
Hi @Allaye, I'm sorry you're facing this issue. This error doesn't seem related to huggingface_hub itself but to an incompatibility between the model and your keras/tensorflow version: the failure happens while Keras deserializes the model from the downloaded files, not while huggingface_hub downloads them. What I would advise is to open an issue on the model's Discussions tab: https://huggingface.co/keras-io/pointnet_segmentation/discussions. Note that I'm not sure these models are actively maintained, given that they are ~3 years old and Keras has been heavily updated since then.
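In the meantime, one possible workaround is to re-register the custom regularizer before loading, since the error likely comes from Keras resolving the name OrthogonalRegularizer to its built-in class, whose constructor does not take num_features. The following is only a sketch: it assumes the model was saved with the OrthogonalRegularizer from the keras.io PointNet example, and the constructor arguments and config keys (num_features, l2reg) are assumptions based on that example, not verified against this checkpoint.

import tensorflow as tf
from tensorflow import keras
from huggingface_hub import from_pretrained_keras

# Assumed re-definition of the custom regularizer from the keras.io PointNet
# example; attribute names and defaults here are assumptions.
class OrthogonalRegularizer(keras.regularizers.Regularizer):
    def __init__(self, num_features, l2reg=0.001):
        self.num_features = num_features
        self.l2reg = l2reg
        self.identity = tf.eye(num_features)

    def __call__(self, x):
        # Penalize deviation of the learned feature transform from orthogonality.
        x = tf.reshape(x, (-1, self.num_features, self.num_features))
        xxt = tf.tensordot(x, x, axes=(2, 2))
        xxt = tf.reshape(xxt, (-1, self.num_features, self.num_features))
        return tf.reduce_sum(self.l2reg * tf.square(xxt - self.identity))

    def get_config(self):
        return {"num_features": self.num_features, "l2reg": self.l2reg}

# Register the custom class so deserialization uses it instead of the
# built-in keras.regularizers.OrthogonalRegularizer.
with keras.utils.custom_object_scope({"OrthogonalRegularizer": OrthogonalRegularizer}):
    model = from_pretrained_keras("keras-io/pointnet_segmentation")

This only works if the saved config keys match the constructor arguments above; the model card and the original keras.io example are the places to check the exact definition.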
I'll close this issue, but let me know if you have any further questions.