/comfyui-ollama-chat

Fork that adds chat nodes (single-round chat).

Primary language: Python. License: Apache-2.0.

ComfyUI Ollama

Custom ComfyUI Nodes for interacting with Ollama using the ollama python client.

Integrate the power of LLMs into ComfyUI workflows easily or just experiment with GPT.

To use these nodes, you need a running Ollama server reachable from the host that runs ComfyUI.
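Before wiring up nodes, it can help to verify that the server is actually reachable. The helper below is a hypothetical sketch (not part of this repo); it assumes the default Ollama port, 11434.

```python
# Quick reachability check for an Ollama server (illustrative helper,
# not part of this repo). 11434 is Ollama's default port.
import urllib.request
import urllib.error

def ollama_reachable(base_url: str = "http://127.0.0.1:11434", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            # Ollama's root endpoint replies with HTTP 200 ("Ollama is running").
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print(ollama_reachable())
```

If this prints False, check that Ollama is started and that no firewall blocks the port between the two hosts.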

Installation

  1. Install ComfyUI
  2. Run git clone in the custom_nodes folder inside your ComfyUI installation, or download the repository as a zip and unzip its contents to custom_nodes/comfyui-ollama.
  3. Start/restart ComfyUI
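Steps 1–3 above roughly correspond to the following commands (paths assumed; substitute this repository's actual URL for the placeholder):

```shell
# Run from your ComfyUI installation root (path assumed).
cd custom_nodes
git clone <this-repo-url> comfyui-ollama   # substitute this repository's URL
# then start (or restart) ComfyUI so it picks up the new custom nodes
```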

Or

use ComfyUI Manager's "Install via Git URL" option.


Nodes

OllamaVision

A node for querying input images.


The model must be one with vision capabilities, for example llava: https://ollama.com/library/llava.
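A sketch of the kind of request a vision query sends: the ollama python client's generate() accepts an images list (base64 strings or raw bytes) alongside the prompt. The helper name and model name below are illustrative, not the node's actual code.

```python
# Build keyword arguments for a vision query via ollama.generate()
# (illustrative helper; not this node's actual code).
import base64

def build_vision_request(model: str, prompt: str, image_bytes: bytes) -> dict:
    """Keyword arguments for ollama.generate() with one input image."""
    return {
        "model": model,
        "prompt": prompt,
        # The client accepts base64-encoded image data in `images`.
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    }

req = build_vision_request("llava", "Describe this image.", b"\x89PNG...")
# Actual call (requires a running server and the `ollama` package):
#   import ollama
#   response = ollama.generate(**req)
#   print(response["response"])
```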

OllamaGenerate

A node for querying an LLM with a given prompt.

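Under the hood this maps to a plain generate call. A minimal sketch with the ollama python client (the model name is an example; the actual call needs a running server):

```python
# Minimal sketch of a text-only generate query (illustrative, not the
# node's actual code).
def build_generate_request(model: str, prompt: str) -> dict:
    """Keyword arguments for ollama.generate()."""
    return {"model": model, "prompt": prompt}

req = build_generate_request("llama3", "Write a haiku about latent space.")
# Actual call (requires a running server and the `ollama` package):
#   import ollama
#   print(ollama.generate(**req)["response"])
```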

OllamaGenerateAdvance

A node for querying an LLM with a given prompt, fine-tuning parameters, and the ability to preserve context for chained generation.

Check the Ollama API docs for information on the parameters.

More params info

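A sketch of what fine-tuned generation with context chaining looks like through the ollama python client. Parameter names follow the Ollama API docs; the model name and option values are examples, not recommendations:

```python
# Illustrative helper (not this node's actual code): generate() keyword
# arguments with sampling options and an optional context for chaining.
def build_advanced_request(model: str, prompt: str, context=None) -> dict:
    return {
        "model": model,
        "prompt": prompt,
        "options": {
            "temperature": 0.7,  # randomness of sampling
            "top_k": 40,         # limit candidate tokens
            "top_p": 0.9,        # nucleus sampling threshold
            "num_predict": 256,  # max tokens to generate
        },
        # Passing a previous response's `context` preserves conversation
        # state across chained generate calls.
        "context": context,
    }

first = build_advanced_request("llama3", "Name a famous painter.")
# With a running server and the `ollama` package:
#   import ollama
#   r1 = ollama.generate(**first)
#   follow_up = build_advanced_request("llama3", "What is their best-known work?",
#                                      context=r1["context"])
#   r2 = ollama.generate(**follow_up)
```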

Usage Example

Consider the following workflow: run vision on an image, then perform additional text processing with the desired LLM. In the OllamaGenerate node, set the prompt as input.

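In code terms, the workflow chains the vision node's caption into the text node's prompt. The helper and model names below are illustrative, not the nodes' real code:

```python
# Sketch of the two-step workflow: a vision call produces a caption, which
# becomes the prompt input of a text-only generate call (names illustrative).
def chain_vision_to_text(caption: str) -> dict:
    """Turn a vision caption into the second (text-only) generate request."""
    prompt = f"Improve and expand this image description:\n{caption}"
    return {"model": "llama3", "prompt": prompt}

req = chain_vision_to_text("a foggy harbor at dawn")
# With a running server and the `ollama` package:
#   import ollama
#   caption = ollama.generate(model="llava", prompt="Describe this image.",
#                             images=[image_b64])["response"]
#   result = ollama.generate(**chain_vision_to_text(caption))
```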

The custom Text Nodes in the examples can be found here: https://github.com/pythongosssss/ComfyUI-Custom-Scripts