vanna-ai/vanna-flask

Run Vanna with locally created ChromaDB

Closed this issue · 1 comment

Running the Vanna Flask code locally shows the following error:
Traceback (most recent call last):
File "E:\Ekkel AI task practice\Google Gemini Project\Google-Gemini-Crash-Course\sqlllm\vanna-flask\app.py", line 38, in
vn.train(ddl=ddl)
File "C:\Users\AL RAFIO\miniconda3\lib\site-packages\vanna\base\base.py", line 1160, in train
return self.add_ddl(ddl)
File "C:\Users\AL RAFIO\miniconda3\lib\site-packages\vanna\chromadb\chromadb_vector.py", line 81, in add_ddl
embeddings=self.generate_embedding(ddl),
File "C:\Users\AL RAFIO\miniconda3\lib\site-packages\vanna\chromadb\chromadb_vector.py", line 55, in generate_embedding
embedding = self.embedding_function([data])
File "C:\Users\AL RAFIO\miniconda3\lib\site-packages\chromadb\utils\embedding_functions.py", line 426, in call
self._init_model_and_tokenizer()
File "C:\Users\AL RAFIO\miniconda3\lib\site-packages\chromadb\utils\embedding_functions.py", line 414, in _init_model_and_tokenizer
self.model = self.ort.InferenceSession(
File "C:\Users\AL RAFIO\miniconda3\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\Users\AL RAFIO\miniconda3\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 452, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from C:\Users\AL RAFIO\.cache\chroma\onnx_models\all-MiniLM-L6-v2\onnx\model.onnx failed: Protobuf parsing failed.

But when I run it in Colab, it does not show any error.
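
The traceback points at the ONNX embedding model that Chroma's default embedding function caches locally; a Protobuf parse failure on a cached model usually means the download was interrupted or the file is otherwise corrupted, which would also explain why a fresh environment like Colab works. A minimal sketch of clearing that cache so Chroma re-downloads the model on the next run; the cache location is an assumption taken from the path in the traceback:

```python
import shutil
from pathlib import Path

# Assumed cache location, based on the traceback:
# <home>\.cache\chroma\onnx_models\all-MiniLM-L6-v2
onnx_cache = Path.home() / ".cache" / "chroma" / "onnx_models" / "all-MiniLM-L6-v2"

if onnx_cache.exists():
    # Deleting the folder forces Chroma's default embedding function
    # to re-download the all-MiniLM-L6-v2 ONNX model on next use.
    shutil.rmtree(onnx_cache)
    print(f"Removed cached ONNX model at {onnx_cache}")
else:
    print(f"No cached model found at {onnx_cache}")
```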

My issue was solved simply by following the Vanna documentation.
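
For reference, the pattern in the Vanna documentation is to compose a locally persisted ChromaDB vector store with an LLM backend and then call `vn.train(ddl=...)` (the same call that failed above). A minimal sketch, assuming the OpenAI chat backend and a local `./chroma` persistence path; swap in whichever LLM class and config keys you actually use:

```python
from vanna.chromadb.chromadb_vector import ChromaDB_VectorStore
from vanna.openai.openai_chat import OpenAI_Chat

# Compose a Vanna class from a local ChromaDB vector store and an LLM backend.
# OpenAI_Chat is only an example here; any supported LLM class can be used.
class MyVanna(ChromaDB_VectorStore, OpenAI_Chat):
    def __init__(self, config=None):
        ChromaDB_VectorStore.__init__(self, config=config)
        OpenAI_Chat.__init__(self, config=config)

vn = MyVanna(config={
    "api_key": "sk-...",   # your LLM API key
    "model": "gpt-4",      # chat model name (assumption; use your own)
    "path": "./chroma",    # where ChromaDB persists its collections locally
})

# Same call as in the traceback; with a healthy embedding model cache
# this stores the DDL in the local ChromaDB collection.
vn.train(ddl="CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
```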