LocalAI integration (local OpenAI drop-in replacement)
Closed this issue · 2 comments
netandreus commented
Is your feature request related to a problem? Please describe.
Currently I can only use the official OpenAI servers.
Describe the solution you'd like
I would like to use my own local OpenAI-compatible server (based on LocalAI) with the same API. Since the API is the same, you would only need to add these settings to the UI:

- `openAiBaseUrl` — base URL, like `http://localhost:8080/api`. Default: `https://api.openai.com/v1`
- `openIdModel` — allow typing in a custom model name. Default: `gpt-3.5-turbo`
andris9 commented
The next release (#378) adds the following updates:

- You can set the API base URL (`openAiAPIUrl` configuration key), both via the API and in the web UI
- You can set a custom model (`openAiModel` configuration key), but only through the API. If a custom value is set with the API, then the custom value appears in the model selection list.
```bash
curl -X 'POST' \
  'https://emailengine.example.com/v1/settings' \
  -H 'Authorization: Bearer <token>' \
  -H 'Content-Type: application/json' \
  -d '{
    "openAiModel": "gpt-3.5-turbo",
    "openAiAPIUrl": "https://api.openai.com"
  }'
```
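Since LocalAI exposes an OpenAI-compatible API, the same settings call can point EmailEngine at a local instance. A minimal sketch, assuming a LocalAI server listening on `localhost:8080` and an example model name (host, port, token, and model are placeholders you would adjust for your own deployment):

```shell
# Settings payload targeting a local LocalAI instance instead of api.openai.com.
# The URL and model name below are assumptions; substitute your own values.
SETTINGS='{
  "openAiModel": "ggml-gpt4all-j",
  "openAiAPIUrl": "http://localhost:8080"
}'

# Sanity-check that the payload is valid JSON before sending it
echo "$SETTINGS" | python3 -m json.tool > /dev/null && echo "payload OK"

# Push the settings to EmailEngine (hypothetical host and access token):
# curl -X 'POST' 'https://emailengine.example.com/v1/settings' \
#   -H 'Authorization: Bearer <token>' \
#   -H 'Content-Type: application/json' \
#   -d "$SETTINGS"
```

The actual upload is left commented out, since it depends on a running EmailEngine instance and a valid access token.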
netandreus commented
Thanks a lot!