FastLLMAPI is an API-first frontend to an Ollama backend, built on FastAPI.
Ollama must be installed and running on the destination host.
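Before proceeding, it can help to verify that Ollama is reachable on its default port (11434). A minimal stdlib-only sketch, assuming Ollama's standard `/api/tags` model-list endpoint:

```python
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def ollama_health_url(base: str = OLLAMA_URL) -> str:
    """Build the URL for Ollama's model-list endpoint, used here as a health check."""
    return base.rstrip("/") + "/api/tags"

def ollama_is_running(base: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers on the given host."""
    try:
        with urllib.request.urlopen(ollama_health_url(base), timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, timeout, DNS failure, ...
        return False
```

If this returns `False`, start Ollama before continuing.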
$ git clone https://github.com/carmelo0x63/FastLLMAPI.git
$ cd FastLLMAPI
$ python3 -m venv .
$ source bin/activate
$ python3 -m pip install --upgrade pip setuptools wheel
$ python3 -m pip install --upgrade fastapi Flask requests
In two separate terminal windows, start the backend and the GUI:
$ python3 main_app.py
$ python3 gui_app.py
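Once both processes are up, the backend can also be queried directly, without the GUI. The port (8000, FastAPI's conventional default) and the `/generate` route below are assumptions for illustration; the actual routes are defined in main_app.py. A minimal stdlib-only client sketch:

```python
import json
import urllib.request

API_URL = "http://localhost:8000"  # assumed backend port; check main_app.py

def build_prompt_request(prompt: str, base: str = API_URL) -> urllib.request.Request:
    """Build a JSON POST request for a hypothetical /generate route."""
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        base.rstrip("/") + "/generate",  # hypothetical route name
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(prompt: str) -> str:
    """Send the prompt to the backend and return the raw response body."""
    with urllib.request.urlopen(build_prompt_request(prompt)) as resp:
        return resp.read().decode("utf-8")
```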
The graphical interface is then available on port 5000 of the host running the applications:
`http://<ip_address>:5000/`