When using ComfyUI, most of the models we use are open source, while some closed-source models with excellent results cannot be used in ComfyUI at all. To solve this problem, we developed the Comflowy extension. We hope to integrate these high-quality closed-source models into ComfyUI so that users can chain various closed-source models together through ComfyUI.
- Comflowy LLM Node: This node calls an LLM, so you can use it to implement features similar to a Prompt Generator. Unlike other LLM nodes on the market, it obtains results by calling an API, which means you don't need to install Ollama to run LLM models and don't need to worry about whether your computer is powerful enough to run them. It's also free. (A rough illustration of what such an API call looks like follows the node list below.)
  - You can use our online version of Comflowy to run workflows containing this node.
  - You can also download the workflow file and import it into ComfyUI for use.
- Comflowy Omost Node: The Omost extension helps you write prompts, but running it locally requires a fairly powerful computer. Based on our understanding of Omost, we implemented a similar node. It differs slightly in that we don't run Omost's official model; instead, we achieve the effect through prompt engineering, which makes it run faster.
  - Online version workflow.
  - Local version workflow file.
- Comflowy Flux Node: This node generates images with Flux Pro. Flux Pro is not an open-source model, so in most cases you cannot use it in ComfyUI. To solve this problem, we developed this node, which lets you generate images with Flux Pro directly in ComfyUI. Please note that this is a commercial model, so each use will deduct credits from your account.
  - Online version App.
- Comflowy Ideogram Node: This node generates images with Ideogram. Like Flux Pro, Ideogram is not an open-source model, so in most cases you cannot use it in ComfyUI. To solve this problem, we developed this node, which lets you generate images with Ideogram directly in ComfyUI. Please note that this is a commercial model, so each use will deduct credits from your account.
  - Online version App.
- Comflowy Clarity Upscale Node: This node upscales images and is positioned as a replacement for Magnific. The well-known indie developer levelsio has praised this model.
  - Online version App.
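As a rough illustration of how an API-backed LLM node works: it simply sends your text to a hosted endpoint and reads back the generated prompt, so no local model weights are needed. The sketch below shows that idea in plain Python. The endpoint URL is the one mentioned in the note further down; the payload, auth header, and response field are assumptions for illustration only, not Comflowy's documented API contract.

```python
# Illustrative sketch only: the payload, auth header, and response field
# below are assumptions, not Comflowy's documented API contract.
import requests

API_URL = "https://app.comflowy.com/api/open/v0/prompt"
API_KEY = "your-comflowy-api-key"  # obtained in Step 2 below


def generate_prompt(idea: str) -> str:
    """Ask a hosted LLM to expand a short idea into a detailed image prompt."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},  # assumed auth scheme
        json={"prompt": idea},                           # assumed payload shape
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("text", "")                   # assumed response field


print(generate_prompt("a cozy cabin in a snowy forest at dusk"))
```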
| Node | Price |
|---|---|
| LLM | Free |
| Omost | Free |
| Flux | Flux-1.1-pro costs approximately 400 credits per image. Flux-pro costs 550 credits per image. |
| Ideogram | Ideogram-v2-turbo costs approximately 800 credits per image. Ideogram-v2 costs 500 credits per image. |
| Clarity Upscale | Approximately 500 credits per image, varying with your inputs. |
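For example, at these rates, generating 10 images with Flux-1.1-pro consumes roughly 4,000 credits, while 10 Ideogram-v2 images consume about 5,000.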
Note
It should be noted that the Comflowy extension may occasionally fail to work due to network problems. If you encounter an error like `Failed to get response from LLM model with https://app.comflowy.com/api/open/v0/prompt`, check your network status.
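If you want to quickly confirm that the endpoint is reachable from your machine, a small probe like the one below can help. It only tests HTTPS connectivity and is not a valid API request, so any HTTP status code counts as "reachable".

```python
# Quick reachability probe for the Comflowy API endpoint.
# Any HTTP status code means the host answered; only a network-level
# failure (DNS, proxy, firewall, timeout) raises an exception here.
import requests

URL = "https://app.comflowy.com/api/open/v0/prompt"

try:
    resp = requests.get(URL, timeout=10)
    print(f"Reached {URL} (HTTP {resp.status_code}) -- network looks OK")
except requests.exceptions.RequestException as exc:
    print(f"Could not reach {URL}: {exc}")
    print("Check your proxy/VPN/firewall settings and try again.")
```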
Step 1: Install Comflowy ComfyUI Extension
- Method 1: Install using ComfyUI Manager (recommended).
- Method 2: Git installation. Open a cmd window in the ComfyUI custom nodes directory (e.g., `ComfyUI\custom_nodes`) and run the following command:
  `git clone https://github.com/6174/comflowy-nodes.git`
- Method 3: Download the zip file. Download and unzip the zip file, then copy the resulting folder to the `ComfyUI\custom_nodes\` directory.
Step 2: Obtain Comflowy API Key
Next, you need to obtain the Comflowy API Key. Click the avatar in the bottom left corner (Figure ①), then click Settings (Figure ②), and finally find the API Key (Figure ③) and copy it. Note: for security, please do not disclose your API Key to others.
Step 3: Enter Comflowy API Key
Lastly, you need to enter the API Key into the Comflowy Set API Key node. Once the key has been set, you can delete this node and use the other Comflowy nodes. If you don't set the API Key with this node, the other Comflowy nodes will not work.
- V0.2: Added Flux node, Ideogram node.
- V0.1: Support for LLM node, Omost node, Http node.
- Thanks to SiliconFlow for providing free LLM services.
- Thanks to the author of Omost and the author of the ComfyUI-Omost extension.
- Thanks to all who contributed to this open source project: