This project is a simple Dockerized Nginx setup that acts as a reverse proxy for the OpenAI API. Nginx is preconfigured to forward requests to the OpenAI API.
- Works with any client that lets you configure the API server address, since it acts as a reverse proxy
- Docker
- Docker Compose
- Clone the repository:

  ```bash
  git clone https://github.com/phuongdo/openai-proxy.git
  cd openai-pr
  ```

- Start the container:

  ```bash
  docker-compose up -d
  ```

- Follow the logs:

  ```bash
  docker-compose logs -f
  ```

- Stop the container:

  ```bash
  docker-compose down
  ```
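The repository ships with its own `docker-compose.yml`; as a rough, illustrative sketch of what such a file typically contains (the service name, image, and port mapping below are assumptions, not the repository's actual values):

```yaml
# Illustrative sketch only -- the repository's real docker-compose.yml may differ.
version: "3"
services:
  proxy:                      # hypothetical service name
    image: nginx:alpine       # assumed base image
    ports:
      - "9090:80"             # expose the proxy on localhost:9090
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro   # mount the preconfigured nginx.conf
```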
Set your client's API server address to http://localhost:9090
```bash
curl http://localhost:9090/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'
```
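Because the proxy speaks the same HTTP API, the same request can be made from any language. A minimal sketch in Python using only the standard library (the helper name `build_chat_request` is illustrative, and the proxy is assumed to be running at http://localhost:9090):

```python
import json
import urllib.request

PROXY_BASE = "http://localhost:9090"  # assumed proxy address from this README


def build_chat_request(prompt, model="gpt-3.5-turbo", temperature=0.7):
    """Build a POST request for the proxy's chat completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return urllib.request.Request(
        PROXY_BASE + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("Say this is a test!")
print(req.full_url)  # http://localhost:9090/v1/chat/completions
```

Actually sending the request (for example with `urllib.request.urlopen(req)`) requires the container to be running.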
Caching is configured in nginx.conf. You can modify this file to change the cache settings or add additional URIs.
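For orientation, cache-related directives in an nginx.conf like this one typically look like the sketch below; the zone name, cache sizes, and validity times are illustrative assumptions, not the repository's actual values:

```nginx
# Illustrative sketch -- check the repository's nginx.conf for the real settings.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=openai_cache:10m max_size=100m;

server {
    listen 80;

    location /v1/ {
        proxy_pass https://api.openai.com;       # forward requests to the OpenAI API
        proxy_set_header Host api.openai.com;    # rewrite Host for the upstream
        proxy_ssl_server_name on;                # send SNI to the upstream

        proxy_cache openai_cache;                # use the cache zone declared above
        proxy_cache_valid 200 10m;               # cache successful responses for 10 minutes
    }
}
```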
Contributions are welcome! Please submit a pull request or open an issue if you encounter any problems or have suggestions for improvements.