cnafan's Stars
f/awesome-chatgpt-prompts
This repo includes curated ChatGPT prompts to help you use ChatGPT more effectively.
lobehub/lobe-chat
🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports multiple AI providers (OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), knowledge bases (file upload / knowledge management / RAG), multi-modal capabilities (vision / TTS), and a plugin system. One-click FREE deployment of your private ChatGPT / Claude application.
vercel/vercel
Develop. Preview. Ship.
cpacker/MemGPT
Create LLM agents with long-term memory and custom tools 📚🦙
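MemGPT's core idea is an agent whose bounded working context pages older messages out to archival storage and retrieves them later. A minimal toy sketch of that idea (hypothetical names, not MemGPT's actual API or paging mechanism, which is driven by an LLM):

```python
# Toy long-term memory for an agent (illustrative, not MemGPT's API):
# a bounded working context plus an archival store searched by word overlap.

class MemoryAgent:
    def __init__(self, context_limit=3):
        self.context = []            # working context (recent messages)
        self.archive = []            # long-term archival storage
        self.context_limit = context_limit

    def observe(self, message):
        """Add a message; evict the oldest into the archive when over the limit."""
        self.context.append(message)
        while len(self.context) > self.context_limit:
            self.archive.append(self.context.pop(0))

    def recall(self, query, top_k=1):
        """Retrieve archived messages ranked by word overlap with the query."""
        q = set(query.lower().split())
        scored = sorted(
            self.archive,
            key=lambda m: len(q & set(m.lower().split())),
            reverse=True,
        )
        return scored[:top_k]

agent = MemoryAgent(context_limit=2)
for msg in ["my name is Ada", "I live in Paris", "I like graphs", "what's new?"]:
    agent.observe(msg)
print(agent.context)                          # two most recent messages
print(agent.recall("where does the user live"))
```

In the real system the retrieval is done by the LLM issuing memory-access tool calls rather than a keyword heuristic.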
abetlen/llama-cpp-python
Python bindings for llama.cpp
huggingface/chat-ui
Open source codebase powering the HuggingChat app
PaddlePaddle/Paddle-Lite
PaddlePaddle High-Performance Deep Learning Inference Engine for Mobile and Edge
HerbertHe/iptv-sources
Auto-updating IPTV sources
dagrejs/dagre
Directed graph layout for JavaScript
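Layered (hierarchical) layout of the kind dagre performs starts by assigning each node a rank, then ordering nodes within each rank. A toy sketch of just the ranking-and-placement step, using longest-path ranks (this is a simplification, not dagre's algorithm or API):

```python
# Toy layered DAG layout: rank nodes by longest path from a source,
# then spread the nodes of each rank along the x-axis.

def layout(edges):
    """Return {node: (x, y)} for a DAG given as a list of (src, dst) edges."""
    nodes = {n for e in edges for n in e}
    preds = {n: [s for s, d in edges if d == n] for n in nodes}

    rank = {}
    def get_rank(n):                  # longest-path rank (input assumed acyclic)
        if n not in rank:
            rank[n] = 1 + max((get_rank(p) for p in preds[n]), default=-1)
        return rank[n]
    for n in nodes:
        get_rank(n)

    layers = {}
    for n in sorted(nodes):           # stable order within each layer
        layers.setdefault(rank[n], []).append(n)
    return {n: (i, r)
            for r, layer in sorted(layers.items())
            for i, n in enumerate(layer)}

pos = layout([("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")])
print(pos)  # a at rank 0, b and c side by side at rank 1, d at rank 2
```

dagre additionally minimizes edge crossings between layers and assigns real coordinates; this sketch skips both.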
microsoft/LLMLingua
LLMLingua speeds up LLM inference and sharpens the model's perception of key information by compressing the prompt and KV-cache, achieving up to 20x compression with minimal performance loss.
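The intuition behind prompt compression is that highly predictable tokens carry little information and can be dropped. A toy sketch of that intuition (LLMLingua actually uses a small language model's token-level perplexity, not this unigram frequency heuristic):

```python
# Toy prompt compression: score each token by its self-information under a
# unigram model of the prompt itself, then keep only the most informative
# fraction. Illustrative only -- not LLMLingua's method or API.
from collections import Counter
import math

def compress(prompt, keep_ratio=0.6):
    tokens = prompt.split()
    freq = Counter(t.lower() for t in tokens)
    total = len(tokens)
    # Self-information -log p(token): rare tokens score high, common ones low.
    info = [-math.log(freq[t.lower()] / total) for t in tokens]
    k = max(1, int(len(tokens) * keep_ratio))
    keep = set(sorted(range(len(tokens)), key=lambda i: info[i],
                      reverse=True)[:k])
    return " ".join(t for i, t in enumerate(tokens) if i in keep)

print(compress("the cat sat on the mat and the cat slept", keep_ratio=0.5))
# repeated fillers like "the" and "cat" are dropped first
```

A real compressor also preserves grammatical coherence and instruction tokens, which this frequency heuristic ignores.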
alibaba/butterfly
🦋 Butterfly, a JavaScript/React/Vue2 diagramming library focused on flow layouts.
varunshenoy/GraphGPT
Extrapolating knowledge graphs from unstructured text using GPT-3 🕵️‍♂️
LLM-Red-Team/kimi-free-api
🚀 Free reverse-engineered API for the KIMI AI long-context model (specialty: reading and summarizing long documents). Supports high-speed streaming output, agent conversations, web search, long-document interpretation, image OCR, multi-turn conversations, zero-configuration deployment, multiple tokens, and automatic cleanup of conversation traces.
rahulnyk/knowledge_graph
Convert any text into a knowledge graph, usable for graph-augmented generation or knowledge-graph-based QnA.
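The text-to-graph step boils down to extracting (subject, relation, object) triples and collecting them into an adjacency structure. A toy sketch of that pipeline (the repo prompts an LLM to propose concept pairs; this regex version only recognizes "X is a Y" / "X has a Z" patterns and is purely illustrative):

```python
# Toy text-to-knowledge-graph extraction: pull simple triples with a regex,
# then build an adjacency list. Not the repo's LLM-based method.
import re

def extract_triples(text):
    triples = []
    for sentence in re.split(r"[.!?]", text):
        m = re.match(r"\s*(\w+) (is|has) (?:a|an) (\w+)", sentence)
        if m:
            triples.append((m.group(1), m.group(2), m.group(3)))
    return triples

def to_graph(triples):
    """Adjacency list: node -> list of (relation, node)."""
    graph = {}
    for s, rel, o in triples:
        graph.setdefault(s, []).append((rel, o))
    return graph

text = "Paris is a city. France has a capital. Paris is an example."
triples = extract_triples(text)
print(triples)
print(to_graph(triples))
```

Swapping the regex for an LLM call that emits the same triple format is what turns this toy into the repo's approach.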
abertsch72/unlimiformer
Public repo for the NeurIPS 2023 paper "Unlimiformer: Long-Range Transformers with Unlimited Length Input"
amosjyng/langchain-visualizer
Visualization and debugging tool for LangChain workflows
metauto-ai/GPTSwarm
🐝 GPTSwarm: LLM agents as (Optimizable) Graphs
yhLeeee/Awesome-LLMs-in-Graph-tasks
A curated collection of research papers exploring the utilization of LLMs for graph-related tasks.
thunlp/InfLLM
The code of our paper "InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory"
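InfLLM's training-free memory splits distant context into blocks and, at each step, retrieves only the blocks most relevant to the current query. A toy sketch of that retrieval idea (InfLLM matches attention keys against query states; this version uses bag-of-words cosine similarity over tokens instead, purely for illustration):

```python
# Toy block-retrieval memory: chunk a long context into fixed-size blocks and
# return the blocks most similar to the query. Not InfLLM's implementation.
import math
from collections import Counter

def cosine(a, b):
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_blocks(context_tokens, query_tokens, block_size=4, top_k=1):
    blocks = [context_tokens[i:i + block_size]
              for i in range(0, len(context_tokens), block_size)]
    ranked = sorted(blocks, key=lambda b: cosine(b, query_tokens), reverse=True)
    return ranked[:top_k]

ctx = "the meeting is on friday the budget was approved last week".split()
print(retrieve_blocks(ctx, "when is the meeting".split()))
```

Because only the retrieved blocks enter the attention window, the effective context can grow far beyond the model's trained length.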
liyucheng09/Selective_Context
Compress your input to ChatGPT or other LLMs so they can process 2x more content while saving 40% of memory and GPU time.
princeton-nlp/AutoCompressors
[EMNLP 2023] Adapting Language Models to Compress Long Contexts
3DAgentWorld/Toolkit-for-Prompt-Compression
Toolkit for Prompt Compression
Open-Swarm-Net/GPT-Swarm
GPT-Swarm is an open-source project that harnesses swarm intelligence to enhance state-of-the-art language models. By leveraging collective problem-solving and distributed decision-making, it provides a robust, adaptive, and scalable framework for tackling complex tasks across domains.
google-research-datasets/KELM-corpus
zhpmatrix/PaperReading
Brief notes on papers read each day
getao/icae
The repo for the In-context Autoencoder (ICAE)
coolbeevip/langchain_plantuml
read-agent/read-agent.github.io
naver-ai/carecall-memory
Keep Me Updated! Memory Management in Long-term Conversations (Findings of EMNLP 2022)
DRSY/KV_Compression
[EMNLP 2023] Context Compression for Auto-regressive Transformers with Sentinel Tokens