cyanff/anime.gf

Allow setting custom API endpoint

Closed this issue · 5 comments

Many local LLM servers, such as llama.cpp's built-in server and koboldcpp, offer an OpenAI-compatible API for the user's convenience. If we could change the endpoint URL used for OpenAI, it would support local inference right now.
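To make the point concrete, here's a minimal Python sketch (the `build_request` helper is illustrative, not anime.gf code) of why this works: an OpenAI-style chat-completions request is identical for any compatible backend, and only the base URL needs to change. llama.cpp's server listens on port 8080 by default.

```python
import json
import urllib.request

def build_request(base_url: str, api_key: str, model: str, prompt: str):
    # The payload and paths follow the OpenAI chat-completions format,
    # which compatible local servers accept unchanged.
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Same call, different base URL: hosted OpenAI vs. a local llama.cpp server.
cloud = build_request("https://api.openai.com/v1", "sk-...", "gpt-4", "hi")
local = build_request("http://localhost:8080/v1", "none", "local", "hi")
print(cloud.full_url)  # → https://api.openai.com/v1/chat/completions
print(local.full_url)  # → http://localhost:8080/v1/chat/completions
```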

o7 will add

here's the flow I'm thinking of.

user goes to chat settings ->
clicks on the provider <select> field ->
clicks "Custom" ->
instead of a model list, a URL field is shown to enter the endpoint

let me know if this makes sense
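The flow could be sketched like this (the `PROVIDERS` table and `settings_fields` helper are hypothetical, not anime.gf's actual code): selecting "Custom" swaps the model dropdown for a base-URL field.

```python
# Hypothetical provider table: "custom" has no fixed model list.
PROVIDERS = {
    "openai": {"models": ["gpt-4", "gpt-3.5-turbo"]},
    "custom": {"models": None},
}

def settings_fields(provider: str) -> list[str]:
    """Return which input fields the settings UI should show."""
    if PROVIDERS[provider]["models"] is None:
        return ["base_url"]  # user types the endpoint URL
    return ["model"]         # user picks from the provider's model list

print(settings_fields("openai"))  # → ['model']
print(settings_fields("custom"))  # → ['base_url']
```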

I'd make sure to specify that you're expecting an OpenAI-compatible endpoint. But yeah, that sounds fine. That way, the OpenAI handler stays the same, and handles models the same.

such as Ollama...

Ollama wraps llama.cpp. All OpenAI-compatible endpoints are the same. Why did you post this?

released with v0.0.2!
a bit of a naive implementation but I hope it works okay
https://github.com/cyanff/anime.gf/releases/tag/0.0.2
feel free to reopen or ping me if there are issues :>