/graphrag-server

Adds a 🚀 streaming web server to GraphRAG that is compatible with the OpenAI SDK, with clickable entity reference links 🔗, suggested follow-up questions, support for local embedding models, and many bug fixes.


GraphRAG customized by KylinMountain

  • Added a web server that supports truly immediate streaming output.
  • Fixed an error when using a local embedding service such as LM Studio.
  • Fixed an index error that occurred after prompt tuning.
  • Fixed the extraction strategy not being loaded when entity extraction is configured to use NLTK.
  • Added a suggested-question (advice question) API.
  • Added reference links for the entities, relationships, data sources, and reports cited in the output, so you can click through to them directly.
  • Works with any desktop or web application UI that is compatible with the OpenAI SDK (a minimal client sketch follows this list).
  • Docker deployment supported; the latest image is kylinmountain/graphrag-server:0.3.1.
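Because the server exposes an OpenAI-compatible chat completions endpoint, any OpenAI SDK client can stream from it. A minimal sketch, assuming the default port 20213 from the webserver settings below; the base_url, api_key, and model values are placeholders rather than names defined by this project:

# Minimal streaming client sketch against a locally running graphrag-server.
# Assumes an OpenAI-compatible /v1 route on the default port 20213; adjust
# base_url and model to whatever your deployment actually expects.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:20213/v1",  # assumed host/port from the webserver settings
    api_key="not-needed-locally",          # placeholder; a local server may ignore it
)

stream = client.chat.completions.create(
    model="graphrag-global-search",  # hypothetical model name; use the one your server reports
    messages=[{"role": "user", "content": "What are the main themes in my documents?"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)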


How to install

You can install with Docker, or pull this repository and run it from source.

Pull the source code

  • Clone the repo
git clone https://github.com/KylinMountain/graphrag.git
cd graphrag
  • Create a virtual environment
conda create -n graphrag python=3.10
conda activate graphrag
  • Install Poetry
curl -sSL https://install.python-poetry.org | python3 -
  • Install dependencies
poetry install
pip install -r webserver/requirements.txt

or

pip install -r requirements.txt
  • Initialize GraphRAG
poetry run poe index --init --root .
# or
python -m graphrag.index --init --root . 
  • Create the input folder
mkdir input
  • Configure settings.yaml following the official GraphRAG Configuration documentation
  • Configure the webserver

You may need to adjust the following settings, but the defaults work for a local run; a small sketch for sanity-checking these paths follows this list.

    server_host: str = "http://localhost"  # host the webserver is reachable on
    server_port: int = 20213               # webserver port
    data: str = "./output"                 # directory containing the GraphRAG index output
    lancedb_uri: str = "./lancedb"         # location of the LanceDB vector store
  • Start the web server
python -m webserver.main
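Before querying, you can optionally check that the data and lancedb_uri paths configured above actually contain something to serve. A rough sketch, assuming the usual GraphRAG 0.3.x behaviour of writing parquet artifacts somewhere under ./output:

# Sanity-check sketch (not part of the project): verifies that the paths the
# webserver settings point at exist and hold index artifacts.
from pathlib import Path

data_dir = Path("./output")      # matches the `data` setting above
lancedb_dir = Path("./lancedb")  # matches the `lancedb_uri` setting above

parquet_files = sorted(data_dir.rglob("*.parquet"))
if parquet_files:
    print(f"Found {len(parquet_files)} parquet artifacts, e.g. {parquet_files[-1]}")
else:
    print(f"No parquet artifacts under {data_dir} - run the indexer first.")

print(f"LanceDB store present: {lancedb_dir.exists()}")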

For more reference configurations, see the WeChat official account article and the Bilibili video.

Install with Docker

  • Pull the Docker image
docker pull kylinmountain/graphrag-server:0.3.1

Start: before starting, you can create the output, input, and prompts directories and the settings.yaml file so they can be mounted into the container.

docker run -v ./output:/app/output \
           -v ./input:/app/input \
           -v ./prompts:/app/prompts \
           -v ./settings.yaml:/app/settings.yaml \
           -v ./lancedb:/app/lancedb -p 20213:20213 kylinmountain/graphrag-server:0.3.1

  • Index
docker run kylinmountain/graphrag-server:0.3.1 python -m graphrag.index --root .

GraphRAG

👉 Use the GraphRAG Accelerator solution
👉 Microsoft Research Blog Post
👉 Read the docs
👉 GraphRAG Arxiv


Overview

The GraphRAG project is a data pipeline and transformation suite that is designed to extract meaningful, structured data from unstructured text using the power of LLMs.

To learn more about GraphRAG and how it can be used to enhance your LLM's ability to reason about your private data, please visit the Microsoft Research Blog Post.

Quickstart

To get started with the GraphRAG system, we recommend trying the Solution Accelerator package. This provides a user-friendly end-to-end experience with Azure resources.

Repository Guidance

This repository presents a methodology for using knowledge graph memory structures to enhance LLM outputs. Please note that the provided code serves as a demonstration and is not an officially supported Microsoft offering.

⚠️ Warning: GraphRAG indexing can be an expensive operation, please read all of the documentation to understand the process and costs involved, and start small.

Diving Deeper

Prompt Tuning

Using GraphRAG with your data out of the box may not yield the best possible results. We strongly recommend fine-tuning your prompts following the Prompt Tuning Guide in our documentation.
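As a rough illustration, prompt tuning can be launched from the project root as shown below; this is only a sketch, and the module path and flags reflect GraphRAG 0.3.x, so confirm them with python -m graphrag.prompt_tune --help on your installed version:

# Sketch: run GraphRAG prompt tuning for the current project from Python.
# Flag names reflect GraphRAG 0.3.x; verify with --help for your version.
import subprocess
import sys

subprocess.run(
    [sys.executable, "-m", "graphrag.prompt_tune", "--root", ".", "--config", "settings.yaml"],
    check=True,  # raise if the tuner exits with a non-zero status
)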

Responsible AI FAQ

See RAI_TRANSPARENCY.md

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party's policies.

Privacy

Microsoft Privacy Statement