Error processing text: Error: Embed model not set when using ollama.
Closed this issue · 1 comment
remisharrock commented
I'm submitting a ...
[ ] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[x] question about how to use this project
Summary
I modified the example vectordb.ts to use Ollama:
const ai = axAI('ollama', {
  model: 'llama3:latest',
  url: 'http://localhost:11434'
} as AxOllamaArgs);
but got this error:
PS C:\Users\remis\ax\src\examples> bun run .\vectordb.ts
98 |
99 | // Batch upsert embeddings
100 | await this.db.batchUpsert(embeddings);
101 | }
102 | } catch (error) {
103 | throw new Error(`Error processing text: ${error}`);
^
error: Error processing text: Error: Embed model not set
at C:\Users\remis\ax\src\docs\manager.ts:103:13
- Other information
Indeed, in Ollama I have only one model installed: 'llama3:latest'. Should I install an embedding model as explained here: https://ollama.com/blog/embedding-models ?
But then, how do I configure that model here:
export type AxOllamaArgs = {
  model: string;
  url?: string;
  apiKey?: string;
  config?: Readonly<Omit<AxOllamaAIConfig, 'model'>>;
  options?: Readonly<AxAIServiceOptions>;
};
thanks for the help!
dosco commented
I will set two default models for Ollama in the default config, but you are free to change them as below:

model: 'nous-hermes2',
embedModel: 'all-minilm'

const ai = axAI('ollama', { model: 'nous-hermes2', embedModel: 'mxbai-embed-large' });
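To make the reply concrete, here is a minimal sketch of an args object that sets both models. The model names and the embedModel field come from the reply above; the local type is only for illustration (the pasted AxOllamaArgs does not declare embedModel), and the url value is an assumption carried over from the original example.

```typescript
// Sketch only: a local type mirroring AxOllamaArgs plus the embedModel
// field described in the reply; not the library's actual declaration.
type OllamaArgsSketch = {
  model: string;        // chat/completion model
  embedModel?: string;  // embedding model, needed for vectordb.ts
  url?: string;         // Ollama server URL (assumed default: localhost)
};

// Model names taken from the maintainer's reply.
const args: OllamaArgsSketch = {
  model: 'nous-hermes2',
  embedModel: 'all-minilm',
  url: 'http://localhost:11434',
};
```

With something like this in place, the example call would become `const ai = axAI('ollama', args as AxOllamaArgs);` after pulling the embedding model locally with `ollama pull all-minilm`.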