An OpenAI API proxy that can be deployed to Docker or a cloud function. Simple proxy for the OpenAI API via a one-line docker command.
🎉 SSE is now supported, so content can be streamed back in real time.
You can deploy ./app.js to any environment that supports Node.js 14+, such as cloud functions and edge computing platforms.
- Copy app.js and package.json to a directory.
- Run yarn install to install the dependencies.
- Run node app.js to start the service.
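The manual deployment steps above can be sketched as the following commands (assuming yarn and Node.js 14+ are already installed):

```shell
# Copy app.js and package.json into a working directory, then:
yarn install   # install dependencies
node app.js    # start the proxy service
```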
```shell
docker run -p 9000:9000 easychen/ai.level06.com:latest
```
The proxy address is http://${IP}:9000.
- PORT: Service port.
- PROXY_KEY: Proxy access key used to restrict access.
- TIMEOUT: Request timeout, default is 5 seconds.
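For example, a Docker launch that sets all three variables might look like the following sketch (the key and timeout values are illustrative assumptions, not defaults from the project):

```shell
docker run -p 9000:9000 \
  -e PORT=9000 \
  -e PROXY_KEY=my_secret_key \
  -e TIMEOUT=30000 \
  easychen/ai.level06.com:latest
# PROXY_KEY and TIMEOUT values here are placeholders;
# the TIMEOUT unit (milliseconds) is an assumption.
```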
- Change the OpenAI request address (https://api.openai.com) to this proxy's address (without a trailing slash).
- If PROXY_KEY is set, append :<PROXY_KEY> to your OpenAI key. If it is not set, no modification is required.
- Only GET and POST method interfaces are supported; file-related interfaces are not supported.
- SSE is now supported, so stream-related options no longer need to be turned off.
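The key rule above can be sketched as a small helper (the function name is hypothetical and not part of the proxy; it only illustrates how the key string is composed):

```javascript
// Build the API key to send through the proxy: append ':<PROXY_KEY>'
// only when the proxy was started with PROXY_KEY set.
function buildProxyApiKey(openaiKey, proxyKey) {
  return proxyKey ? `${openaiKey}:${proxyKey}` : openaiKey;
}

console.log(buildProxyApiKey('sk-xxxx', 'secret')); // -> sk-xxxx:secret
console.log(buildProxyApiKey('sk-xxxx', ''));       // -> sk-xxxx
```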
Take https://www.npmjs.com/package/chatgpt as an example:
```javascript
chatApi = new gpt.ChatGPTAPI({
  apiKey: 'sk.....:<proxy_key here>',
  apiBaseUrl: 'http://localhost:9000', // pass the proxy address
});
```
- The SSE implementation references related code from the chatgpt-api project.