Mobile-Artificial-Intelligence/maid

Feature Request: Custom OpenAI endpoints

Detla7t opened this issue · 3 comments

Hi, I recently downloaded your app and noticed that you already support OpenAI. What I was hoping is that you could open up access to the port options so that people can use OpenAI-compatible endpoints such as LM Studio, Oobabooga, and others. I think you would only need to let the user set a custom port, unless there is already a way that I missed. Thanks, and I hope this isn't too much trouble.

So I was messing around with the app a bit more and figured it out, at least for Oobabooga. What I had to do was make sure the scheme was http (not https), add /v1 at the end so it looks like http://<ip-address>:5000/v1, and then use a random API key. Not sure if you want me to close this issue or not, so for now I'll leave it open.
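For anyone else who lands here, this is roughly what that setup looks like from a client's point of view. It's a minimal Python sketch using the openai package (not Maid's actual code); the IP address, port, and model name are placeholders for whatever your local server reports:

```python
from openai import OpenAI

# Point the client at the local Oobabooga server instead of api.openai.com.
# The base URL must end in /v1, and the API key can be any non-empty string
# because the local server does not validate it.
client = OpenAI(
    base_url="http://192.168.1.10:5000/v1",  # placeholder IP and port
    api_key="not-needed",                    # dummy key, ignored by the server
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; use whatever model the server has loaded
    messages=[{"role": "user", "content": "Hello from an OpenAI-compatible client"}],
)
print(response.choices[0].message.content)
```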

Ahhh, ok, I think that's expected behavior. I can't make it work without the /v1 because that's all handled in LangChain Dart, which is upstream of Maid.

Not all providers that offer an OpenAI-compatible API include /v1, so I would keep it as part of the base URL.
E.g. Azure uses https://{YOUR_RESOURCE_NAME}.openai.azure.com/openai/deployments/{YOUR_DEPLOYMENT_NAME}/
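To illustrate the point (this is just a hedged sketch, not Maid's or LangChain Dart's actual code): if the app treats whatever the user typed as the complete base URL and only appends the endpoint path, both the /v1 style and the Azure style work without special casing:

```python
def chat_completions_url(base_url: str) -> str:
    """Build the chat completions URL from a user-supplied base URL.

    The base URL is taken as-is (with or without /v1), so plain
    OpenAI-compatible servers and Azure-style deployment URLs both work.
    """
    return base_url.rstrip("/") + "/chat/completions"

# Oobabooga / LM Studio style: the user includes /v1 themselves.
print(chat_completions_url("http://192.168.1.10:5000/v1"))
# -> http://192.168.1.10:5000/v1/chat/completions

# Azure style: no /v1 anywhere in the path (the api-version query
# parameter Azure also requires is omitted here for brevity).
print(chat_completions_url(
    "https://my-resource.openai.azure.com/openai/deployments/my-deployment"
))
# -> https://my-resource.openai.azure.com/openai/deployments/my-deployment/chat/completions
```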