fastchat
There are 16 repositories under the fastchat topic.
chatchat-space/Langchain-Chatchat
Langchain-Chatchat (formerly Langchain-ChatGLM): a local-knowledge-based RAG and Agent application built with Langchain and LLMs such as ChatGLM, Qwen, and Llama.
smalltong02/keras-llm-robot
A web UI project for learning about large language models. It includes features such as chat, quantization, fine-tuning, prompt-engineering templates, and multimodality.
shell-nlp/gpt_server
gpt_server is an open-source framework for production-grade deployment of LLMs, embedding models, rerankers, ASR, TTS, text-to-image, image editing, and text-to-video.
feiyun0112/Local-LLM-Server
A quick way to build a private large language model (LLM) server that provides OpenAI-compatible interfaces.
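An "OpenAI-compatible interface" means the private server accepts the same request shape as OpenAI's `/v1/chat/completions` endpoint, so existing clients work by pointing them at a different base URL. A minimal sketch using only the standard library; the base URL, model name, and placeholder API key are assumptions about a typical local deployment, not part of this project:

```python
import json
from urllib import request

# Assumed local deployment address -- adjust to where your server listens.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt, model="vicuna-7b-v1.5"):
    """Build an OpenAI-style chat-completions request for a local server.

    Returns the prepared urllib Request and the JSON payload so callers
    can inspect or send it with urllib.request.urlopen(req).
    """
    payload = {
        "model": model,  # hypothetical model name served locally
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Local OpenAI-compatible servers typically accept any key.
            "Authorization": "Bearer EMPTY",
        },
    )
    return req, payload
```

Because the wire format matches OpenAI's, official SDKs can also target such a server by overriding their base URL instead of hand-building requests like this.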
Qiyuan-Ge/OpenAssistant
LLM as Agent
ivangabriele/docker-fastchat
[Work In Progress] Server/Cloud-ready FastChat Docker images.
AliaXueting/fastchat-Vicuna-Langchain-Modif_KnowledgeBase
FastChat integrated with Langchain to create a private knowledge base.
Kurtyjlee/Complex-QA
Complex Question Answering: evaluating Vicuna-13b's biases across different domains.
Shristirajpoot/Chatgpt-pro
Chatgpt-pro is an open-source platform for training, serving, and evaluating large language model (LLM) chatbots.
ynotopec/llm-k8s
Helm charts for deploying LLMs on Kubernetes.
SDSU-Research-CI/fastchat-demo
A demo using FastChat and the OpenAI API for LLM summarization.
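Summarizing with a chat-completion API usually means splitting a long document into chunks that fit the model's context window and sending each chunk with a summarization prompt. A minimal sketch of those two helper steps; the chunk size and prompt wording are illustrative assumptions, not taken from the demo itself:

```python
def chunk_text(text, max_chars=3000):
    """Split a long document into fixed-size chunks.

    A character-based split is a crude stand-in for token counting;
    real pipelines would split on sentence or token boundaries.
    """
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarization_messages(chunk):
    """Build OpenAI-style chat messages asking the model to summarize one chunk."""
    return [
        {"role": "system", "content": "You are a concise summarizer."},
        {"role": "user", "content": "Summarize the following text:\n\n" + chunk},
    ]
```

Each message list would then be posted to the chat-completions endpoint, and the per-chunk summaries can be concatenated and summarized once more for a final result (the common "map-reduce" summarization pattern).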
Zsbyqx20/VicunaTalk
A speech-to-speech talking bot (in development)
aakashkavuru101/LLM-Testing-app-v1
Builds on FastChat (the foundation of LMArena.ai) to create a tool for testing LLMs locally; if local testing fails, it can connect to Ollama instead.
Noora-Alhajeri/bahar
Bahar is a chatbot web application powered by JAIS, an Arabic-centric large language model developed by MBZUAI and Inception AI. Built during the Emirati AI Boot Camp 2024, the project showcases the integration of LLMs into real-world applications, featuring a user-friendly interface and text generation in both Arabic and English.