
---
title: langchain-streamlit-demo
emoji: 🦜
colorFrom: green
colorTo: red
sdk: docker
app_port: 7860
pinned: true
tags:
  - langchain
  - streamlit
  - docker
---

# langchain-streamlit-demo


This project shows how to build a simple chatbot UI with Streamlit and LangChain.

This README was originally written by Claude 2, an LLM from Anthropic.

## Features

- Chat interface for talking to an AI assistant
- Supports models from:
  - OpenAI
    - `gpt-3.5-turbo`
    - `gpt-4`
  - Anthropic
    - `claude-instant-v1`
    - `claude-2`
  - Anyscale Endpoints
    - `meta-llama/Llama-2-7b-chat-hf`
    - `meta-llama/Llama-2-13b-chat-hf`
    - `meta-llama/Llama-2-70b-chat-hf`
    - `codellama/CodeLlama-34b-Instruct-hf`
    - `mistralai/Mistral-7B-Instruct-v0.1`
  - Azure OpenAI Service
    - [configurable]
- Streaming output of assistant responses
- Leverages LangChain for dialogue and memory management
- Integrates with LangSmith for tracing conversations
- Allows giving feedback on the assistant's responses
- Tries reading API keys and default values from environment variables
- Customizable parameters in the sidebar
- Includes various forms of document chat:
  - Question/Answer pair generation
  - Summarization
  - Standard retrieval chains
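Since the app tries to read API keys and defaults from environment variables, a `.env` file along these lines can pre-populate them. This is an illustrative sketch only; the variable names below are assumptions, and `.env-example` in the repo lists the actual names:

```
# Sketch only -- see .env-example for the real variable names
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
LANGCHAIN_API_KEY=
LANGCHAIN_PROJECT=langchain-streamlit-demo
```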

## Deployment

`langchain-streamlit-demo` is deployed as a Docker image based on the `python:3.11-slim-bookworm` image. CI/CD workflows in `.github/workflows` handle building and publishing the image as well as pushing it to Hugging Face.
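A minimal sketch of such an image, assuming the app's entry point and requirements layout (the repo's actual Dockerfile is authoritative; the file names here are assumptions):

```dockerfile
# Sketch only -- assumes app.py and requirements.txt at the repo root
FROM python:3.11-slim-bookworm

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

EXPOSE 7860
CMD ["streamlit", "run", "app.py", "--server.port", "7860", "--server.address", "0.0.0.0"]
```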

### Run on HuggingFace Spaces

Open HuggingFace Space

### With Docker (pull from Docker Hub)

1. Optional: Create a `.env` file based on `.env-example`
2. Run in terminal:

   ```shell
   docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest
   ```

   or, to pass in the environment file:

   ```shell
   docker run -p 7860:7860 --env-file .env joshuasundance/langchain-streamlit-demo:latest
   ```

3. Open http://localhost:7860 in your browser

### Docker Compose (build locally)

1. Clone the repo and navigate to the cloned repo directory
2. Optional: Create a `.env` file based on `.env-example`
3. Run in terminal:

   ```shell
   docker compose up
   ```

4. Open http://localhost:7860 in your browser
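For reference, the compose service might look roughly like this sketch (the repo's actual `docker-compose.yml` is authoritative; the service name and build context here are assumptions):

```yaml
# Sketch only -- see the repo's docker-compose.yml for the real definition
services:
  langchain-streamlit-demo:
    build: .
    ports:
      - "7860:7860"
    env_file:
      - .env
```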

### Kubernetes

1. Clone the repo and navigate to the cloned repo directory
2. Create a `.env` file based on `.env-example`
3. Run the deploy script: `/bin/bash ./kubernetes/deploy.sh`
4. Get the IP address for your new service: `kubectl get service langchain-streamlit-demo`
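If the cluster does not assign an external IP (common on local clusters), port-forwarding is one way to reach the app; this sketch assumes the service exposes port 7860:

```shell
# Forward local port 7860 to the service (assumes the service port is 7860)
kubectl port-forward service/langchain-streamlit-demo 7860:7860
# Then open http://localhost:7860 in your browser
```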

## Links