pezzolabs/pezzo

Customizing OpenAI Models and Endpoint

ImYrS opened this issue · 7 comments

Proposal

  • Able to set model name or model list manually.
  • Able to set OpenAI Base URL manually.

Use-Case

For self-hosted deployments, some users need to point at a different OpenAI endpoint. For example, in China, api.openai.com is blocked.

In addition, many users rely on a project called one-api to handle models from different providers. That project converts those models to an OpenAI-compatible API, so users can test prompts with models that are not from OpenAI or Claude.

In summary, I hope these two features can be developed; I think they would be useful for many users.

Is this a feature you are interested in implementing yourself?

Maybe

Hi ImYrS, I understand the reasoning behind this change. Is this something you'd like to contribute to? This has to do with the proxy service.

Hi, glad to hear from you. But I'm sorry I don't have time to contribute at the moment.

My personal understanding is that this doesn't really require changing the proxy service; maybe just add some ENV vars for the API endpoint or similar, and allow a custom model name to be entered.
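As a rough sketch of that idea (the env var names `OPENAI_API_BASE_URL` and `OPENAI_MODELS` here are assumptions for illustration, not Pezzo's actual configuration), the resolution logic could look something like this:

```typescript
// Hypothetical env vars (assumptions, not Pezzo's real config):
//   OPENAI_API_BASE_URL — overrides the default OpenAI endpoint
//   OPENAI_MODELS       — comma-separated list of selectable model names
function resolveOpenAIConfig(env: Record<string, string | undefined>) {
  return {
    baseURL: env.OPENAI_API_BASE_URL ?? "https://api.openai.com/v1",
    models: (env.OPENAI_MODELS ?? "gpt-3.5-turbo,gpt-4")
      .split(",")
      .map((m) => m.trim())
      .filter((m) => m.length > 0),
  };
}
```

A self-hosted user behind one-api could then set `OPENAI_API_BASE_URL` to their gateway's URL and list their custom models in `OPENAI_MODELS`, with no change to the request path itself.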

Since I haven't read this project's code completely, my understanding may be wrong. Please point out any problems, thank you very much!