
Ollama Monitor


A lightweight system tray application to monitor Ollama AI models with real-time status updates.

Features · Installation · Usage · Contributing · License

✨ Features

  • 🔄 Real-time monitoring of Ollama model status
  • 🔔 Clean and minimal notifications
  • 🚀 Windows startup configuration
  • ⚙️ Customizable API connection settings
  • 🌐 Comprehensive proxy support with authentication
  • 🎯 Color-coded status indicators (see the icon sketch after this list):
    • 🟢 Green: Model active and running
    • 🔵 Blue: No model running
    • 🔴 Red: Ollama service not running
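
For the curious, here is a minimal sketch of how a color-coded tray icon like this can be drawn with pystray (the tray library credited in the Acknowledgments) and Pillow. The function names and drawing details are illustrative assumptions, not the project's actual code.

    # Illustrative sketch only -- not the project's actual implementation.
    from PIL import Image, ImageDraw
    import pystray

    def make_icon(color):
        # Draw a simple filled circle in the given status color.
        image = Image.new("RGBA", (64, 64), (0, 0, 0, 0))
        ImageDraw.Draw(image).ellipse((8, 8, 56, 56), fill=color)
        return image

    def on_exit(icon, item):
        icon.stop()

    tray = pystray.Icon(
        "OllamaMonitor",
        icon=make_icon("green"),  # green = model active, blue = idle, red = Ollama down
        title="Ollama Monitor",
        menu=pystray.Menu(pystray.MenuItem("Exit", on_exit)),
    )
    tray.run()  # blocks; a real app would update tray.icon from a background thread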

📋 Requirements

  • 💻 Windows 10/11
  • 🐍 Python 3.8+
  • 🤖 Ollama installed and configured

🚀 Installation

You need to have Ollama installed and configured 😊

Option 1: Download Executable

  1. Go to Releases
  2. Download the latest OllamaMonitor.exe
  3. Run the executable

Option 2: Run from Source

  1. Clone the repository
     git clone https://github.com/ysfemreAlbyrk/ollama-monitor.git
  2. Install dependencies
     pip install -r requirements.txt
  3. Run directly
     python ollama_monitor.py

Option 3: Build from Source

  1. Clone the repository and install dependencies (steps 1-2 above)

  2. Build executable

python build.py

The executable will be created in the dist directory.

📖 Usage

  1. Make sure Ollama is installed and running on your system
  2. Launch OllamaMonitor
  3. The app will appear in your system tray with a color-coded status icon (see the status-check sketch after this list):
    • 🟢 Green: Model active and running
    • 🔵 Blue: No model loaded
    • 🔴 Red: Ollama not running
  4. Right-click the tray icon to:
    • View current model status
    • Open settings
    • Exit the application
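
The green/blue/red states in step 3 map naturally onto the Ollama HTTP API. The sketch below shows one way they could be detected, assuming the standard GET /api/ps endpoint that lists currently loaded models; it is illustrative only, not necessarily how the app is implemented.

    # Illustrative status check -- assumes Ollama's GET /api/ps endpoint.
    import requests

    def get_status(api_url="http://localhost:11434"):
        try:
            response = requests.get(f"{api_url}/api/ps", timeout=2)
            response.raise_for_status()
        except requests.RequestException:
            return "red"                      # Ollama service not reachable
        models = response.json().get("models", [])
        return "green" if models else "blue"  # green = model loaded, blue = idle

    print(get_status())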

I can't see the system tray icon 🤔

If the system tray icon is not visible, you may need to enable it in Windows settings:

  1. Right-click the taskbar and select Taskbar Settings
  2. Click on Select which icons appear on the taskbar
  3. Find OllamaMonitor.exe and make sure it's turned on

Settings

To configure the application:

  1. Right-click the tray icon
  2. Select "Settings"
  3. You can customize:
    • API URL (supports proxy with authentication)
    • Run at Windows startup (see the registry sketch after this list)
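
One common way to implement the startup option on Windows is a value under the current user's Run registry key. The sketch below illustrates that general technique with the standard winreg module; the key name and the use of sys.executable are assumptions for illustration, not necessarily what Ollama Monitor does internally.

    # General Windows startup technique, shown for illustration only.
    import sys
    import winreg

    RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

    def set_startup(enabled, app_name="OllamaMonitor"):
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, RUN_KEY, 0, winreg.KEY_SET_VALUE) as key:
            if enabled:
                # For a frozen build, sys.executable is the path to OllamaMonitor.exe.
                winreg.SetValueEx(key, app_name, 0, winreg.REG_SZ, sys.executable)
            else:
                try:
                    winreg.DeleteValue(key, app_name)
                except FileNotFoundError:
                    pass  # entry was already removed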

Example API URLs:

  • Direct connection: http://localhost:11434
  • With proxy: http://proxy.example.com:8080
  • With proxy auth: http://username:password@proxy.example.com:8080
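
With the requests library, a proxy URL with embedded credentials like the last example can be supplied through the proxies parameter. The snippet below is a generic illustration of that mechanism (GET /api/tags lists locally installed models); it is not a description of the app's internal request handling.

    # Generic example of calling the Ollama API through an authenticated proxy.
    import requests

    api_url = "http://localhost:11434"
    proxy_url = "http://username:password@proxy.example.com:8080"

    response = requests.get(
        f"{api_url}/api/tags",  # lists locally installed models
        proxies={"http": proxy_url, "https": proxy_url},
        timeout=5,
    )
    print(response.json())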

Logs

Application logs are stored in:

%APPDATA%\OllamaMonitor\logs\ollama_monitor_YYYYMMDD.log
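
As a rough sketch, a dated log file like this can be produced with the standard logging module; the setup below is an assumption for illustration, not the app's actual code.

    # Illustrative logging setup that produces the dated path shown above.
    import logging
    import os
    from datetime import datetime

    log_dir = os.path.join(os.environ["APPDATA"], "OllamaMonitor", "logs")
    os.makedirs(log_dir, exist_ok=True)
    log_file = os.path.join(log_dir, f"ollama_monitor_{datetime.now():%Y%m%d}.log")

    logging.basicConfig(filename=log_file, level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")
    logging.info("Ollama Monitor started")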

If you encounter any issues, please attach the relevant log file to your GitHub issue.

🔧 Development

Version Management

The version information is centrally managed in __version__.py. To update the version:

  1. Update the version in __version__.py
  2. Run python build.py to rebuild the executable
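
A single-source version file typically looks something like the sketch below; the exact contents of __version__.py in this project may differ.

    # Hypothetical contents of __version__.py (the real file may differ).
    __version__ = "1.0.0"
    __app_name__ = "Ollama Monitor"

    # Other modules, such as build.py, can then import the same value:
    #     from __version__ import __version__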

📜 Changelog

See CHANGELOG.md for a list of changes.

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📜 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • 🚀 Ollama for the amazing AI model runtime
  • 🖥️ pystray for the system tray implementation

Made with ❤️ by Yusuf Emre ALBAYRAK