Fix Up Inference Code with colpali_engine==0.3.0
DrChrisLevy opened this issue · 1 comment
Hey!
I'm new to the repo/model and just trying it out so sorry if I'm being a noob.
I am using colpali_engine==0.3.0.
I am trying to copy/paste the code from the inference example and simply run it.
It fails at this check in the example, since this exception is raised:
if not isinstance(processor, BaseVisualRetrieverProcessor):
    raise ValueError("Processor should be a BaseVisualRetrieverProcessor")

type(processor)
# <class 'transformers.models.paligemma.processing_paligemma.PaliGemmaProcessor'>
Then, when it builds the dataloader, it fails on processor.process_images(x) with:

AttributeError: 'PaliGemmaProcessor' object has no attribute 'process_images'
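For context, the failing call sits inside the dataloader's collate_fn. Roughly like this (a sketch from memory of the example script, so the batch size and variable names may differ):

from torch.utils.data import DataLoader

# The example wraps the images in a DataLoader and calls the processor
# inside collate_fn; that call is what raises the AttributeError when
# `processor` is a plain PaliGemmaProcessor.
dataloader = DataLoader(
    images,
    batch_size=4,
    shuffle=False,
    collate_fn=lambda x: processor.process_images(x),
)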
Maybe the fix is to use this instead (as done in the README)?
processor = cast(ColPaliProcessor, ColPaliProcessor.from_pretrained("google/paligemma-3b-mix-448"))
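For anyone else landing here, a minimal sketch of the workaround, assuming colpali_engine==0.3.0 exposes ColPali and ColPaliProcessor under colpali_engine.models (the checkpoint name vidore/colpali-v1.2 and the dummy image are placeholders, not from the original example):

import torch
from PIL import Image
from colpali_engine.models import ColPali, ColPaliProcessor

# Load the fine-tuned model (placeholder checkpoint name).
model = ColPali.from_pretrained(
    "vidore/colpali-v1.2",
    torch_dtype=torch.bfloat16,
    device_map="cuda:0",  # or "cpu"
).eval()

# Load the ColPali processor instead of the raw PaliGemmaProcessor,
# as done in the README.
processor = ColPaliProcessor.from_pretrained("google/paligemma-3b-mix-448")

# Dummy image just to exercise the call that was failing above.
images = [Image.new("RGB", (448, 448), color="white")]
batch_images = processor.process_images(images).to(model.device)

with torch.no_grad():
    image_embeddings = model(**batch_images)

With a ColPaliProcessor instance, the isinstance check passes and process_images is available for the collate_fn above.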
Hi @DrChrisLevy. You're right, there is a bug in the example script, so for the time being you should refer to the example script from the README. I'm opening a PR to fix this!
Thanks for your message! 👋🏼