Flask-based Chatbot
This repository includes a simple Python Flask app that streams responses from OpenAI to an HTML/JS frontend using NDJSON over a ReadableStream.
The repository is designed for use with Docker containers, both for local development and deployment. 🐳
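To make the streaming flow concrete, here is a minimal sketch of an NDJSON-streaming Flask route. It is only a sketch, not the repo's actual app.py: the /chat route, the "message" field, and the model name are illustrative assumptions, and it presumes the openai Python SDK (v1+) with OPENAI_API_KEY available in the environment.

```python
import json
import os

import openai
from flask import Flask, Response, request, stream_with_context

app = Flask(__name__)
client = openai.OpenAI(api_key=os.environ["OPENAI_API_KEY"])


@app.post("/chat")  # illustrative route name, not necessarily the repo's
def chat():
    user_message = request.json["message"]

    def generate():
        # stream=True yields partial chunks as the model produces them
        stream = client.chat.completions.create(
            model="gpt-3.5-turbo",  # placeholder model name
            messages=[{"role": "user", "content": user_message}],
            stream=True,
        )
        for chunk in stream:
            delta = chunk.choices[0].delta.content or ""
            # One JSON object per line: this is the NDJSON the frontend parses
            yield json.dumps({"content": delta}) + "\n"

    return Response(stream_with_context(generate()), mimetype="application/x-ndjson")
```

On the frontend, response.body gives a ReadableStream whose chunks can be split on newlines and parsed one JSON object at a time.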
Start the app without Docker
- Use conda to create a new environment:

  conda create -n myenv python=3.11

- Install the required packages:

  pip install -r requirements-dev.txt

- Start the Flask app:

  gunicorn app:app

  If you are starting it in a production environment, use this instead:

  RUNNING_IN_PRODUCTION=1 gunicorn app:app

  Then press Ctrl-Z followed by bg && disown to move the process to the background. (The sketch after this list shows how the app might read this flag.)

- Click 'http://0.0.0.0:50505' in the terminal, which should open a new tab in the browser. You may need to navigate to 'http://localhost:50505' if that URL doesn't work.
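A RUNNING_IN_PRODUCTION flag like the one above is commonly used to decide whether to load a local .env file. The snippet below is a hedged sketch of that convention (it may not match this repo's app.py exactly) and assumes python-dotenv is installed:

```python
import os

from dotenv import load_dotenv  # provided by python-dotenv

# Assumption: in local development the secrets live in .env, while in
# production they are injected as real environment variables, so the
# .env file is only loaded when RUNNING_IN_PRODUCTION is not set.
if not os.getenv("RUNNING_IN_PRODUCTION"):
    load_dotenv()
```

Note that gunicorn itself does not interpret the variable; setting it on the command line simply makes it visible to the app via os.environ.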
Start the app with Docker
In addition to the Dockerfile that's used in production, this repo includes a docker-compose.yaml for local development, which creates a volume for the app code. That allows you to make changes to the code and see them instantly.
- Install Docker Desktop. If you opened this inside GitHub Codespaces or a Dev Container in VS Code, installation is not needed. ⚠️ If you're on an Apple M1/M2, you won't be able to run docker commands inside a Dev Container; either use Codespaces or do not open the Dev Container.

- Make sure that the .env file exists and that the correct OPENAI_API_KEY is set. (The sketch after this list shows one way the key might be checked at startup.)

- Start the services with this command:

  docker compose up --build

- Click 'http://0.0.0.0:50505' in the terminal, which should open a new tab in the browser. You may need to navigate to 'http://localhost:50505' if that URL doesn't work.
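If the containers start but requests to OpenAI fail, a missing key in .env is the usual culprit. A small, hypothetical fail-fast check (the helper name is illustrative; the repo may handle this differently) could look like:

```python
import os


def require_openai_key() -> str:
    """Return the OpenAI API key or fail with an actionable error message."""
    key = os.getenv("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set. Add it to the .env file before "
            "running 'docker compose up --build'."
        )
    return key
```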