vikhyat/moondream

Error when using python code example

Zolilio opened this issue · 1 comment

I guess it's an error on my side, but when I try to run the Python code given on the Hugging Face page for the model (https://huggingface.co/vikhyatk/moondream2), it gives me this error:

Traceback (most recent call last):

  File ~\miniconda3\lib\site-packages\spyder_kernels\py3compat.py:356 in compat_exec
    exec(code, globals, locals)

  File c:\users\XXX\onedrive\bureau\m2\stage\python\code\deep_learning\image_interogator_moondream2.py:6
    model = AutoModelForCausalLM.from_pretrained(

  File ~\AppData\Roaming\Python\Python310\site-packages\transformers\models\auto\auto_factory.py:456 in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(

  File ~\AppData\Roaming\Python\Python310\site-packages\transformers\models\auto\configuration_auto.py:955 in from_pretrained
    return config_class.from_pretrained(pretrained_model_name_or_path, **kwargs)

  File ~\AppData\Roaming\Python\Python310\site-packages\transformers\configuration_utils.py:554 in from_pretrained
    return cls.from_dict(config_dict, **kwargs)

  File ~\AppData\Roaming\Python\Python310\site-packages\transformers\configuration_utils.py:725 in from_dict
    logger.info(f"Model config {config}")

  File ~\AppData\Roaming\Python\Python310\site-packages\transformers\configuration_utils.py:757 in __repr__
    return f"{self.__class__.__name__} {self.to_json_string()}"

  File ~\AppData\Roaming\Python\Python310\site-packages\transformers\configuration_utils.py:843 in to_json_string
    return json.dumps(config_dict, indent=2, sort_keys=True) + "\n"

  File ~\miniconda3\lib\json\__init__.py:238 in dumps
    **kw).encode(obj)

  File ~\miniconda3\lib\json\encoder.py:201 in encode
    chunks = list(chunks)

  File ~\miniconda3\lib\json\encoder.py:431 in _iterencode
    yield from _iterencode_dict(o, _current_indent_level)

  File ~\miniconda3\lib\json\encoder.py:405 in _iterencode_dict
    yield from chunks

  File ~\miniconda3\lib\json\encoder.py:438 in _iterencode
    o = _default(o)

  File ~\miniconda3\lib\json\encoder.py:179 in default
    raise TypeError(f'Object of type {o.__class__.__name__} '

TypeError: Object of type PhiConfig is not JSON serializable
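
For context, the failure happens while transformers tries to log the model config as JSON: json.dumps only handles plain Python types, so a nested config object that was never converted to a dict raises exactly this TypeError. A minimal sketch of the mechanism (the PhiConfig class below is just a hypothetical stand-in to reproduce the message):

import json

class PhiConfig:  # hypothetical stand-in for the nested text-model config
    pass

try:
    # json.dumps only knows basic Python types; an embedded config object
    # that hasn't been converted to a dict triggers the same error.
    json.dumps({"phi_config": PhiConfig()}, indent=2, sort_keys=True)
except TypeError as e:
    print(e)  # Object of type PhiConfig is not JSON serializable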

Here is the code from the page:

from transformers import AutoModelForCausalLM, AutoTokenizer
from PIL import Image

model_id = "vikhyatk/moondream2"
revision = "2024-05-20"
model = AutoModelForCausalLM.from_pretrained(
    model_id, trust_remote_code=True, revision=revision
)
tokenizer = AutoTokenizer.from_pretrained(model_id, revision=revision)

image = Image.open('<IMAGE_PATH>')
enc_image = model.encode_image(image)
print(model.answer_question(enc_image, "Describe this image.", tokenizer))

I am facing the same issue. I am using transformers == 4.31.0 and einops == 0.8.0.
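
In case the environment is part of the problem, here is a quick way to confirm which versions the active interpreter (e.g. the one Spyder/conda launches) actually resolves, assuming both packages import without error:

import transformers
import einops

# Print the versions the running interpreter picks up, which may differ
# from what pip/conda reports in another environment.
print("transformers:", transformers.__version__)
print("einops:", einops.__version__)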