hordelib is a wrapper around ComfyUI, primarily to enable the AI Horde to run inference pipelines designed visually in the ComfyUI GUI.
The developers of hordelib can be found in the AI Horde Discord server: https://discord.gg/3DxrhksKzn
hordelib has been the default inference backend library of the AI Horde since hordelib v1.0.0.
The goal here is to be able to design inference pipelines in the excellent ComfyUI and then call those inference pipelines programmatically, whilst providing features that maintain compatibility with the existing horde implementation.
If installing from PyPI, use a requirements file of the form:
--extra-index-url https://download.pytorch.org/whl/cu118
hordelib
...your other dependencies...
On Linux you will need to install the Nvidia CUDA Toolkit. Linux installers are provided by Nvidia at https://developer.nvidia.com/cuda-downloads
Note that if you only have 16GB of RAM and a default /tmp on tmpfs, you will likely need to increase the size of your temporary space to install the CUDA Toolkit, or it may fail to extract the archive. One way to do that, just before installing the CUDA Toolkit, is:
sudo mount -o remount,size=16G /tmp
If you only have 16GB of RAM you will also need swap space. So if you typically run without swap, add some. You won't be able to run this library without it.
Horde payloads can be processed simply with (for example):
import os
import hordelib
hordelib.initialise()
from hordelib.horde import HordeLib
from hordelib.shared_model_manager import SharedModelManager
# Wherever your models are
os.environ["AIWORKER_CACHE_HOME"] = "f:/ai/models"
generate = HordeLib()
SharedModelManager.loadModelManagers(compvis=True)
SharedModelManager.manager.load("Deliberate")
data = {
"sampler_name": "k_dpmpp_2m",
"cfg_scale": 7.5,
"denoising_strength": 1.0,
"seed": 123456789,
"height": 512,
"width": 512,
"karras": True,
"tiling": False,
"hires_fix": False,
"clip_skip": 1,
"control_type": None,
"image_is_control": False,
"return_control_map": False,
"prompt": "an ancient llamia monster",
"ddim_steps": 25,
"n_iter": 1,
"model": "Deliberate",
}
pil_image = generate.basic_inference(data)
pil_image.save("test.png")
Note that hordelib.initialise() will erase all command line arguments from argv, so make sure you parse them before you call it.
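For example, a minimal sketch of parsing your own arguments first (the --model option here is just a hypothetical placeholder):

import argparse

import hordelib

# Parse our own command line options before initialise() clears sys.argv.
parser = argparse.ArgumentParser()
parser.add_argument("--model", default="Deliberate")  # hypothetical option
args = parser.parse_args()

hordelib.initialise()  # from here on, argv has been wiped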
See tests/run_*.py for more standalone examples.
If you don't want hordelib to set up and control the logging configuration, initialise with:
import hordelib
hordelib.initialise(setup_logging=False)
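In that case you are responsible for configuring logging yourself; a minimal sketch using the standard library (hordelib's own logging integration may differ):

import logging

import hordelib

# Configure our own handlers; hordelib leaves the logging configuration
# alone when setup_logging=False.
logging.basicConfig(level=logging.INFO)

hordelib.initialise(setup_logging=False)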
hordelib depends on a large number of open source projects, and most of these dependencies are automatically downloaded and installed when you install hordelib. Due to the nature and purpose of hordelib, some dependencies are bundled directly inside hordelib itself.
A powerful and modular stable diffusion GUI with a graph/nodes interface. Licensed under the terms of the GNU General Public License v3.0.
The entire purpose of hordelib is to access the power of ComfyUI.
Custom nodes for ComfyUI providing Controlnet preprocessing capability. Licensed under the terms of the Apache License 2.0.
Custom nodes for ComfyUI providing face restoration.
Requirements:
- git (install git)
- tox (pip install tox)
- Set the environment variable AIWORKER_CACHE_HOME to point to a model directory.
Note the model directory must currently be in the original AI Horde directory structure:
<AIWORKER_CACHE_HOME>\
nataili\
clip\
codeformer\
compvis\
Deliberate.ckpt
...etc...
controlnet\
embeds\
esrgan\
gfpgan\
safety_checker\
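If you are creating this layout from scratch, here is a small illustrative sketch (not part of hordelib) that builds the subdirectories listed above under whatever path you point AIWORKER_CACHE_HOME at:

import os
from pathlib import Path

# Create the nataili/<subdir> folders expected by the model managers.
cache_home = Path(os.environ["AIWORKER_CACHE_HOME"])
for subdir in ("clip", "codeformer", "compvis", "controlnet",
               "embeds", "esrgan", "gfpgan", "safety_checker"):
    (cache_home / "nataili" / subdir).mkdir(parents=True, exist_ok=True)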
Simply execute: tox (or tox -q for less noisy output)
This will take a while the first time as it installs all the dependencies.
If the tests run successfully, images will be produced in the images/ folder.
Run a specific test file with tox -- -k <filename>, for example tox -- -k test_initialisation
Run tox list to see all groups of tests which are involved in either the development, build or CI process. Tests which have the word 'fix' in them will automatically apply changes when run, such as to linting or formatting. You can run these with:
tox -e [test_suite_name_here]
hordelib/pipeline_designs/
Contains ComfyUI pipelines in a format that can be opened by the ComfyUI web app. These are saved directly from the web app.
hordelib/pipelines/
Contains the above pipeline JSON files converted to the format required by the backend pipeline processor. These are converted from the web app, see Converting ComfyUI pipelines below.
hordelib/nodes/
These are the custom ComfyUI nodes we use for hordelib-specific processing.
In this example we install the dependencies in the OS default environment. When using the git version of hordelib, from the project root:
pip install -r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu118 --upgrade
Ensure ComfyUI is installed and patched; one way is to run the tests:
tox
From then on to run ComfyUI:
cd ComfyUI
python main.py
Then open a browser at: http://127.0.0.1:8188/
Use the standard ComfyUI web app. Use the "title" attribute to name the nodes; these names become parameter names in hordelib. For example, a KSampler with the "title" of "sampler2" would provide the parameters sampler2.seed, sampler2.cfg, etc. Load the pipeline hordelib/pipeline_designs/pipeline_stable_diffusion.json in the ComfyUI web app for an example.
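As a hedged illustration of that naming scheme, a payload fragment for a pipeline containing a KSampler titled "sampler2" might look like the dict below; beyond seed and cfg, the exact input names are assumptions and depend on the node:

# Hypothetical parameter fragment keyed by "<node title>.<input name>".
params = {
    "sampler2.seed": 123456789,
    "sampler2.cfg": 7.5,
    "sampler2.steps": 25,  # assumed input name; check the node in the web app
}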
Save any new pipeline in hordelib/pipeline_designs using the naming convention "pipeline_<name>.json".
Convert the JSON for the model (see Converting ComfyUI pipelines below) and save the resulting JSON in hordelib/pipelines using the same filename as the previous JSON file.
That is all. This can then be called from hordelib using the run_image_pipeline() method in hordelib.comfy.Comfy().
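A hedged sketch of what such a call might look like; the run_image_pipeline() signature shown (pipeline name plus a parameter dict) is an assumption and should be checked against hordelib.comfy.Comfy:

from hordelib.comfy import Comfy

comfy = Comfy()
# Assumed signature: a pipeline name (matching pipeline_<name>.json) and a
# dict of "<node title>.<input>" parameters; check the class for the real API.
images = comfy.run_image_pipeline("stable_diffusion", {"sampler.seed": 123456789})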
In addition to the design file saved from the UI, we need to save the pipeline file in the backend format. This file is created automatically in the hordelib project root, named comfy-prompt.json, if you run a pipeline through the ComfyUI version embedded in hordelib. Running ComfyUI with tox -e comfyui automatically patches ComfyUI so this JSON file is saved.
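The final step is then just a copy under the naming convention; for example, assuming the design was saved as pipeline_stable_diffusion.json:

import shutil
from pathlib import Path

# Example only: copy the generated backend JSON into hordelib/pipelines/
# under the same name as the design file.
root = Path(".")  # the hordelib project root
shutil.copy(root / "comfy-prompt.json",
            root / "hordelib" / "pipelines" / "pipeline_stable_diffusion.json")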
The main config files for the project are pyproject.toml, tox.ini and requirements.txt. PyPI publishing is automated entirely from the GitHub website.
- Create a PR from main to releases
- Label the PR with "release:patch" (0.0.1) or "release:minor" (0.1.0)
- Merge the PR with a standard merge commit (not squash)
Here's an example:
Start in a new empty directory. Create requirements.txt:
--extra-index-url https://download.pytorch.org/whl/cu118
hordelib
Create the directory images/ and copy the test_db0.jpg into it.
Copy run_controlnet.py from the hordelib/tests/ directory.
Build a venv:
python -m venv venv
.\venv\Scripts\activate
pip install -r requirements.txt
Run the test we copied:
python run_controlnet.py
The `images/` directory should have our test images.
We use a ComfyUI version pinned to a specific commit; see hordelib/consts.py:COMFYUI_VERSION. To test if the latest version works and upgrade to it, from the project root simply:
- cd ComfyUI (change CWD to the embedded ComfyUI)
- git checkout master (switch to the master branch)
- git pull (get the latest ComfyUI code)
- git rev-parse HEAD (update the hash in hordelib.consts:COMFYUI_VERSION)
- cd .. (get back to the hordelib project root)
- tox (see if everything still works)
Now ComfyUI is pinned to a new version.
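To confirm which commit is currently pinned, you can read the constant directly (assuming it is importable as referenced above):

from hordelib.consts import COMFYUI_VERSION

print(COMFYUI_VERSION)  # the pinned ComfyUI commit hash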
We patch the ComfyUI source code to:
- Modify the model manager to allow us to dynamically move models between VRAM, RAM and disk cache.
- Make ComfyUI output some handy JSON we need for development purposes.
To create a patch file:
- Make the required changes to a clean install of ComfyUI and then run git diff > yourfile.patch, then move the patch file to wherever you want to save it.
Note that the patch file really needs to be in UTF-8 format, and some common terminals, e.g. PowerShell, won't do this by default. In PowerShell, create a patch file with: git diff | Set-Content -Encoding utf8 -Path yourfile.patch
Patches can be applied with the apply_patch() method of the hordelib.install_comfyui.Installer class.
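A hedged usage sketch; the constructor and the apply_patch() argument shown (a path to the patch file) are assumptions, not documented here:

from hordelib.install_comfyui import Installer

# Assumed usage: construct the Installer and pass the patch file path.
installer = Installer()
installer.apply_patch("yourfile.patch")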