The Stable Diffusion WebUI container requires GPU access. To grant that in a
container you must install `nvidia-container-runtime`:

```shell
curl -s -L https://nvidia.github.io/nvidia-container-runtime/gpgkey | sudo apt-key add -
curl -s -L https://nvidia.github.io/nvidia-container-runtime/$(. /etc/os-release;echo $ID$VERSION_ID)/nvidia-container-runtime.list | sudo tee /etc/apt/sources.list.d/nvidia-container-runtime.list
sudo apt-get update
sudo apt-get install nvidia-container-runtime
```
Then restart the Docker service so it picks up the new runtime (e.g. `sudo systemctl restart docker`).
When running this container as its own unit, build and execute it with:

```shell
docker build --target sd_standalone -t sd_web_ui -f sd-web-ui.dockerfile .
docker run --gpus all -P sd_web_ui
```

The build will download all the large files into the Docker image. Note that
`-P` publishes the container's exposed port to an ephemeral host port; `docker ps`
will show the mapping.
The compose configuration for `sd_web_ui` mounts the host path `/var/sd_web_ui`
into the container and configures the service to fetch the large files from there.
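For orientation, the relevant service definition might look roughly like the
sketch below. Only the service name, dockerfile name, host path, and port come
from this document; the container-side mount path and the GPU stanza are
illustrative assumptions, and the repo's actual compose file may differ.

```yaml
services:
  sd_web_ui:
    build:
      context: .
      dockerfile: sd-web-ui.dockerfile   # dockerfile name from the build command above
    ports:
      - "7860:7860"
    volumes:
      - /var/sd_web_ui:/var/sd_web_ui    # container-side path is an assumption
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```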
You will need to manually create the directories `models/Stable-diffusion`,
`embeddings`, and `outputs` in that directory. Then download the Stable
Diffusion checkpoint into the `models/Stable-diffusion` directory.
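The directory layout can be created in one command. This sketch uses an
`SD_ROOT` variable (a convenience introduced here, not part of the compose
configuration) so it can be rehearsed in a scratch directory; on the real host,
set `SD_ROOT=/var/sd_web_ui`, which will likely require `sudo`.

```shell
# SD_ROOT is the host directory mounted into the container. It defaults to a
# throwaway temp directory here so the command can be tried safely; set
# SD_ROOT=/var/sd_web_ui (as root) to create the real layout.
SD_ROOT="${SD_ROOT:-$(mktemp -d)}"
mkdir -p "$SD_ROOT/models/Stable-diffusion" "$SD_ROOT/embeddings" "$SD_ROOT/outputs"
```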
```shell
wget -qO- https://huggingface.co/CompVis/stable-diffusion-v-1-4-original/resolve/main/sd-v1-4.ckpt > /var/sd_web_ui/models/Stable-diffusion/sd-v1-4.ckpt
```
The container may then be spun up by using `docker compose` to `build` and `up`
the `sd_web_ui` service. The container will expose port `7860` on the host:

```shell
sudo docker compose build sd_web_ui && sudo docker compose up sd_web_ui
```

Once the service is up, the UI should be reachable at `http://localhost:7860`.
Read the information about configuring host storage above, and then create a
`.env` file with `SHITBOT_HOST` set to the `host:port` combo for the shit bot
machine. Then run `docker compose up` to start both the SD and Sidecar services.
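A minimal `.env` might look like this; the address and port are placeholders,
so substitute the real shit bot machine's values.

```
# .env — placeholder values; point this at your actual shit bot machine
SHITBOT_HOST=10.0.0.5:8080
```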