A Streamlit web app that generates interactive knowledge graphs from plain text using Ollama AI models. You can upload a `.txt` file or paste text, and the app will create a graph where concepts are visualized as connected nodes.
- Two input methods: Upload a `.txt` file or paste text directly.
- Ollama model integration: Select from available local models (e.g., Gemma, Mistral, LLaMA).
- Automatic graph storage: Generated graphs are saved and can be reloaded anytime.
- Interactive visualization: Zoom, drag, and explore relationships between concepts.
- Optimized for speed: Uses hashed filenames to prevent regenerating the same graph.
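The hashed-filename caching described above can be sketched as follows. The actual scheme in `app.py` is not shown in this README, so the digest length and helper names here are assumptions:

```python
import hashlib
from pathlib import Path

def graph_path(text: str, data_dir: str = "Data") -> Path:
    # Identical input text yields the same digest, hence the same cached file
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()[:16]
    return Path(data_dir) / f"{digest}.html"

def needs_generation(text: str, data_dir: str = "Data") -> bool:
    # Regenerate only when no cached graph exists for this exact text
    return not graph_path(text, data_dir).exists()
```

With this approach, submitting the same text twice resolves to the same `Data/<hash>.html` file, so the second request skips the model call entirely.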
```
.
├── app.py             # Main Streamlit app
├── Data               # Folder containing generated .html graph files
├── n_sc.png           # Image file (documentation)
├── README.md          # Project documentation and instructions
├── requirements.txt   # Python dependencies for the project
├── sc.png             # Another image file (documentation)
└── src
    ├── config
    │   └── folder_con.py       # Configuration for folder creation
    ├── graph
    │   ├── generate_kgraph.py  # Generates knowledge graphs
    │   └── visulization.py     # Visualizes knowledge graphs (PyVis)
    ├── model
    │   └── model_info.py       # Information about installed Ollama models / context length
    └── utils
        ├── file_op.py          # Utility functions for file operations (read/write)
        └── text_clean.py       # Utility functions for text preprocessing/cleaning
```
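A module like `src/model/model_info.py` can discover installed models through Ollama's REST API, which lists them at `/api/tags`. The helper names below are illustrative, not the repository's actual code:

```python
import json
import urllib.request

def parse_models(payload: dict) -> list:
    # Extract model names from an Ollama /api/tags response payload
    return [m["name"] for m in payload.get("models", [])]

def installed_models(host: str = "http://localhost:11434") -> list:
    # Ollama's REST API reports locally installed models at /api/tags
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        return parse_models(json.load(resp))
```

Calling `installed_models()` while `ollama serve` is running returns tags such as `gemma3:4b` or `mistral`, which the app can present in its model-selection dropdown.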
- Clone the repository

  ```shell
  git clone https://github.com/ganeshnikhil/Kgraph.git
  cd Kgraph
  ```

- Create a virtual environment

  ```shell
  python -m venv venv
  source venv/bin/activate   # On Linux/Mac
  venv\Scripts\activate      # On Windows
  ```

- Install dependencies

  ```shell
  pip install -r requirements.txt
  ```

- Install Ollama (if not already installed): follow the instructions at https://ollama.ai/download

- Start the Ollama server

  ```shell
  ollama serve
  ```

- Download a model (example with Gemma)

  ```shell
  ollama pull gemma3:4b
  ```

- Run the Streamlit app

  ```shell
  streamlit run app.py
  ```
- Clone the repository

  ```shell
  git clone https://github.com/ganeshnikhil/Kgraph.git
  cd Kgraph
  ```

- Build the Docker images

  ```shell
  docker-compose build
  ```

- Start the services (Ollama + Streamlit)

  ```shell
  docker-compose up -d
  ```

- Pull Ollama models inside the container

  ```shell
  docker exec -it ollama ollama pull <model-name>
  ```

- Access the Streamlit app: open your browser and go to http://localhost:8501
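The commands above imply a compose file with two services. The repository's actual `docker-compose.yml` is not shown here, so the image names and build context below are assumptions; the container name matches the `docker exec -it ollama` command and the ports are the defaults for each service:

```yaml
services:
  ollama:
    image: ollama/ollama        # official Ollama image, serves on port 11434
    container_name: ollama      # matches `docker exec -it ollama ...` above
    ports:
      - "11434:11434"
  streamlit:
    build: .                    # builds the Streamlit app from this repository
    ports:
      - "8501:8501"             # Streamlit's default port
    depends_on:
      - ollama
```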
- Upload a `.txt` file or paste your text in the sidebar.
- Select an Ollama model to use for generating the graph.
- Click "Generate Knowledge Graph".
- View, zoom, and explore the interactive visualization.
- Load previously generated graphs from the sidebar.
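The generation step behind the "Generate Knowledge Graph" button (handled in `src/graph/generate_kgraph.py`) could call Ollama's `/api/generate` endpoint along these lines. The prompt wording and function names are assumptions, not the repository's actual code:

```python
import json
import urllib.request

def build_prompt(text: str) -> str:
    # Ask the model to return concept relationships as JSON triples
    return ("Extract knowledge-graph triples from the text below. "
            "Return a JSON list of [subject, relation, object] items.\n\n" + text)

def extract_triples(text: str, model: str = "gemma3:4b",
                    host: str = "http://localhost:11434") -> list:
    # POST to Ollama's /api/generate endpoint; stream=False returns one JSON object
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps({"model": model, "prompt": build_prompt(text),
                         "stream": False}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(json.load(resp)["response"])
```

The triples returned this way map directly onto graph nodes (subjects and objects) and edges (relations) for visualization.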
- All generated graphs are saved in the `Data/` directory as `.html` files.
- Large input text may take more time to process depending on the model.
- If no model is found, the app will fall back to Gemma.
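The Gemma fallback can be sketched as below; the exact selection logic and default tag are assumptions based on the model used in the setup instructions:

```python
def choose_model(available: list, preferred: str = None) -> str:
    # Use the preferred model when installed, otherwise fall back to Gemma
    if preferred and preferred in available:
        return preferred
    for name in available:
        if name.startswith("gemma"):
            return name
    # Nothing suitable installed; default to the Gemma tag from the README
    return "gemma3:4b"
```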
This project is licensed under the MIT License — see the LICENSE file for details.