
A Flask app using a REST API plus a master script to store conversations simply across multiple runs, for prompting purposes.


flaskchat

This is flaskchat, which uses multiple LLM models to write stories, hold philosophical discussions, or do anything else you set the prompts for.

I use it to explore strategies for multi-model setups, chain-of-thought, tree-of-thought, and other alternate reasoning approaches.

Table of Contents

  • Features
  • Getting Started
  • Prerequisites
  • Installation and Usage
  • License

Features

Two parts: flaskapi_gpt.py (the Flask REST API) and main.py, the master script (or one of the variants that use a different storage backend).

Uses the OpenAI and Cohere APIs, which offer capable models at reasonable API prices.
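
A rough sketch of how the two parts fit together, assuming the Flask side exposes a JSON chat endpoint. The route name, payload, and port below are illustrative; the real routes live in flaskapi_gpt.py.

  # sketch of the API side; the master script (main.py) posts prompts to it
  from flask import Flask, request, jsonify

  app = Flask(__name__)
  history = []  # simplest backend: an in-memory list, like the list-storage variant

  @app.route("/chat", methods=["POST"])  # hypothetical route name
  def chat():
      prompt = (request.get_json(silent=True) or {}).get("prompt", "")
      history.append({"role": "user", "content": prompt})
      # the real app would call OpenAI or Cohere here with the accumulated history
      reply = f"echo: {prompt}"
      history.append({"role": "assistant", "content": reply})
      return jsonify({"reply": reply, "turns": len(history)})

  if __name__ == "__main__":
      app.run(port=5000)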

Getting Started

python main.py # for list storage
python main-mongoback.py # for mongodb or ferretdb backend
python main-redisback.py # for redis backend

In a separate window:

python flaskapi_gpt.py # for main application 
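
The master script drives the API for you, but you can also poke it directly; for example, a curl along these lines, assuming the hypothetical /chat route from the sketch above (check flaskapi_gpt.py for the actual route and port):

  curl -X POST http://localhost:5000/chat \
       -H "Content-Type: application/json" \
       -d '{"prompt": "Start a story about a lighthouse keeper."}'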

Prerequisites

  • python >= 3.9
  • Cohere and OpenAI API keys
  • cohere (python client)
  • openai (python client)
  • redis server, if using the Redis backend
  • mongodb or ferretdb server, if using the Mongo backend
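
Both clients pick their keys up most easily from environment variables; a minimal sketch, assuming OPENAI_API_KEY and COHERE_API_KEY are exported (model names are placeholders and exact calls vary by SDK version; the real call sites are in flaskapi_gpt.py):

  import os

  import cohere
  from openai import OpenAI  # openai python client >= 1.0

  openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
  cohere_client = cohere.Client(os.environ["COHERE_API_KEY"])

  # one test turn against each provider
  oa = openai_client.chat.completions.create(
      model="gpt-4o-mini",  # placeholder model name
      messages=[{"role": "user", "content": "Say hello."}],
  )
  co = cohere_client.chat(message="Say hello.")

  print(oa.choices[0].message.content)
  print(co.text)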

Installation and Usage

  1. Clone the repository:
   git clone https://github.com/jtatman/flaskchat.git
  2. Create a virtual environment.

Use micromamba, why not?

  "${SHELL}" <(curl -L micro.mamba.pm/install.sh)
  micromamba create -n flaskchat python=3.9 

or venv

  python -m venv .venv
  source .venv/bin/activate
  3. Install requirements:
  pip install -r requirements.txt
  4. The Mongo and Redis backends assume servers running on localhost or in Docker. If yours run elsewhere, adjust the connection settings in the respective files (see the connection sketch at the end of this section).

  5. Docker or podman

  • To run mongo version:
  docker compose -f docker-compose-mongo.yml build # build 
  docker compose -f docker-compose-mongo.yml up    # bring up in screen or a spare terminal
  docker compose -f docker-compose-mongo.yml up -d # bring up in daemon (detached) mode
  • To run redis version:
  docker compose -f docker-compose-redis.yml build  # build
  docker compose -f docker-compose-redis.yml up     # bring up in screen or a spare terminal
  docker compose -f docker-compose-redis.yml up -d  # bring up in daemon (detached) mode
  • Podman:
  pip3 install podman-compose
  podman-compose -f docker-compose-mongo.yml up -d
  # or
  podman-compose -f docker-compose-redis.yml up -d
  • Kubernetes:

Use Kompose to convert the compose YAMLs and proceed as needed. Use k0s or k3s for prototyping. Export pods to wherever. Have fun.
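
For example, something along these lines (Kompose writes plain Kubernetes manifests from the compose file; the output may need touching up before it applies cleanly):

  kompose convert -f docker-compose-mongo.yml   # or docker-compose-redis.yml
  kubectl apply -f .                            # point at the generated manifests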
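
As mentioned in step 4, if Mongo or Redis live somewhere other than localhost, the connection lines in the backend scripts are what to change; a sketch of what they typically look like (the key, database, and collection names here are made up; the real ones are in main-redisback.py and main-mongoback.py):

  import json

  import redis
  from pymongo import MongoClient

  # Redis backend: adjust host/port to match your server
  r = redis.Redis(host="localhost", port=6379, decode_responses=True)
  r.rpush("conversation", json.dumps({"role": "user", "content": "hello"}))

  # Mongo/FerretDB backend: adjust the connection URI to match your server
  mongo = MongoClient("mongodb://localhost:27017")
  mongo.flaskchat.conversations.insert_one({"role": "user", "content": "hello"})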

License

MIT