Kav-K/GPTDiscord

Feature Request: Integration of Ollama Model

Opened this issue · 1 comment

Is your feature request related to a problem? Please describe.
Presently, GPTDiscord relies on OpenAI's GPT models, which incur costs through API usage. For users seeking a cost-effective solution, this can pose a challenge.

Describe the solution you'd like
I suggest integrating Ollama alongside the existing OpenAI GPT models in GPTDiscord. Ollama is a free, high-performing alternative that runs models on the local machine. This would give users the flexibility to choose between the OpenAI GPT models and Ollama based on their preferences and budget constraints.

Describe alternatives you've considered
An alternative could be sticking to the current setup with only the OpenAI GPT model. However, incorporating the Ollama model adds a valuable cost-effective option for users.

Additional context
I understand that introducing a non-GPT model like Ollama might seem like a deviation. I apologize for any confusion this might cause. However, the aim is to provide users with an additional, cost-effective option without compromising on performance. Ollama's integration aligns with the ethos of an open-source project, offering users flexibility and choice.

Link to Ollama for additional information
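To make the proposal concrete, here is a minimal sketch of how the bot could talk to a locally running Ollama instance. It assumes Ollama's default port (11434) and its `/api/generate` endpoint, with a locally pulled model such as `llama2`; only the standard library is used, and the request is built separately from being sent so the shape is easy to inspect:

```python
import json
import urllib.request

OLLAMA_BASE = "http://localhost:11434"  # Ollama's default local port

def build_generate_request(prompt: str, model: str = "llama2") -> urllib.request.Request:
    """Build (but do not send) a request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON response instead of a stream
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Actually sending it requires a running Ollama instance, e.g.:
# with urllib.request.urlopen(build_generate_request("Hello!")) as resp:
#     print(json.loads(resp.read())["response"])
```

The function name and defaults here are illustrative, not part of GPTDiscord or the Ollama client libraries.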

I think this request would be better served by supporting API endpoints that follow the OpenAI API format, for example LM Studio (https://lmstudio.ai/docs/local-server) or text-generation-webui (https://github.com/oobabooga/text-generation-webui). The implementation could come down to letting users set a custom endpoint URL.
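The custom-endpoint idea can be sketched like this: the bot always sends requests in the OpenAI chat-completions format, but the base URL is configurable, so the same code can point at api.openai.com or at a local OpenAI-compatible server (LM Studio defaults to http://localhost:1234/v1). A minimal standard-library sketch, with names and defaults that are illustrative rather than taken from GPTDiscord:

```python
import json
import urllib.request

def build_chat_request(
    messages: list,
    base_url: str = "https://api.openai.com/v1",  # or e.g. "http://localhost:1234/v1"
    model: str = "gpt-3.5-turbo",
    api_key: str = "sk-unused",  # local servers typically ignore the key
) -> urllib.request.Request:
    """Build an OpenAI-format /chat/completions request against any base URL."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# The same call, redirected at a local LM Studio server:
local_req = build_chat_request(
    [{"role": "user", "content": "Hello"}],
    base_url="http://localhost:1234/v1",
)
```

Because the request format never changes, only the base URL does, any server that speaks the OpenAI API (LM Studio, text-generation-webui with the OpenAI extension, and so on) should work without further changes to the bot.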

That said, it's understandable why the GPTDiscord maintainers and contributors might not want to implement this, since it would likely result in more (and unnecessary) issues that cost time and effort. So I guess the middle ground is to do it on your own: fork the repository and change the parts that direct the bot to your local model. As long as your LLM server follows the OpenAI API format, this should be reasonable and possible to do.