Support self-hosted models (non-OpenAI flavor)
cryoff opened this issue · 1 comment
cryoff commented
Proposal
It would be great to be able to use non-OpenAI APIs. For example, in a real-world setting I could take Llama2 or even FlanT5, host it somewhere, expose an API endpoint, set up the Pezzo proxy to forward requests, and submit custom headers (a rough sketch of what I mean is below).
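Something along these lines is what I have in mind from the caller's side; the proxy URL, model name, and header names here are just placeholders, not Pezzo's actual configuration:

```python
import requests

# Hypothetical proxy endpoint in front of a self-hosted model.
PROXY_URL = "http://localhost:3000/proxy/v1/chat/completions"

response = requests.post(
    PROXY_URL,
    headers={
        # Custom headers the proxy could use for routing / observability.
        "X-Upstream-Base-Url": "http://my-llama2-host:8080",
        "X-Api-Key": "my-self-hosted-key",
    },
    json={
        # OpenAI-style payload, but the upstream model is self-hosted.
        "model": "llama-2-13b-chat",
        "messages": [{"role": "user", "content": "Hello from a self-hosted model"}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json())
```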
Right now, similar functionality is implemented in LangChain using custom callbacks; that could be a viable approach here as well.
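As a minimal sketch of the callback idea: a custom LangChain callback handler could forward prompts and completions to a reporting endpoint. The reporting URL and payload shape below are purely hypothetical; the handler methods themselves are standard LangChain callback hooks.

```python
import requests
from langchain_core.callbacks import BaseCallbackHandler


class ReportingCallback(BaseCallbackHandler):
    """Forwards prompt/completion metadata to a (hypothetical) reporting endpoint."""

    def __init__(self, report_url: str):
        self.report_url = report_url
        self._prompts = []

    def on_llm_start(self, serialized, prompts, **kwargs):
        # Capture the outgoing prompts before the self-hosted model is called.
        self._prompts = prompts

    def on_llm_end(self, response, **kwargs):
        # Report prompts and generated completions once the model responds.
        completions = [g.text for gens in response.generations for g in gens]
        requests.post(
            self.report_url,
            json={"prompts": self._prompts, "completions": completions},
            timeout=10,
        )
```

The handler would be attached the usual way, e.g. `callbacks=[ReportingCallback("https://example.com/report")]` when constructing or invoking the LLM, so any self-hosted model wrapped by LangChain gets observed without touching the model-serving code.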
Use-Case
No response
Is this a feature you are interested in implementing yourself?
Maybe