Merge-Stable-Diffusion-models-without-distortion

An adaptation, for Stable Diffusion, of the merging method described in the paper "Git Re-Basin: Merging Models modulo Permutation Symmetries" (https://arxiv.org/abs/2209.04836)


I wrote the permutation spec for Stable Diffusion needed to merge models with the git-re-basin method outlined here - https://github.com/samuela/git-re-basin. This builds on a third-party PyTorch implementation of that method here - https://github.com/themrzmaster/git-re-basin-pytorch.
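The essence of the weight-matching step can be sketched as a linear assignment problem between hidden units (a minimal toy assuming numpy and scipy; `match_rows` is an illustrative name, not a function from this repo, whose actual permutation spec covers every layer of the model):

```python
# Toy illustration of the weight-matching idea behind git-re-basin:
# find the permutation of model B's hidden units that best aligns them
# with model A's, so that averaging no longer mixes mismatched units.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_rows(weight_a, weight_b):
    """Find the permutation of B's rows (hidden units) best aligned with A's."""
    similarity = weight_a @ weight_b.T               # pairwise row similarity
    _, col_ind = linear_sum_assignment(similarity, maximize=True)
    return col_ind                                   # col_ind[i]: B-row matched to A-row i

# B is A with its rows shuffled; matching recovers the alignment.
A = np.diag([1.0, 2.0, 3.0, 4.0])
perm = np.array([2, 0, 3, 1])
B = A[perm]
recovered = match_rows(A, B)
# B[recovered] lines up with A again, ready for averaging.
```

The real method iterates this matching layer by layer (permuting a layer's outputs also permutes the next layer's inputs) until the assignments stop changing, which is why the merge takes many passes.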

To merge, you may need to install PyTorch 1.11.0 or lower (at one point 1.12.0 did not work, though the latest versions of PyTorch may have resolved the issue).

Download the code folder, open a command prompt in that directory, move the desired models into the same folder, and run "python SD_rebasin_merge.py --model_a nameofmodela.ckpt --model_b nameofmodelb.ckpt"

If the models are not in the same directory, use pathofmodela.ckpt and pathofmodelb.ckpt (full paths) instead

Notes for SDXL by DammK

python SD_rebasin_merge.py --model_a _211-Replicant-V3.0_fp16.safetensors --model_b _220-realgarV20V21V21Yuri_v21.safetensors
  • SDXL takes hours to merge, at roughly 6 minutes per permutation! The default output name is merged.safetensors.
weight_matching in fp32:  33%|██████████████████▎                                    | 1/3 [12:07<24:15, 727.52s/it] 
Applying weighted_sum to special_keys: 100%|████████████████████████████████████████| 6/6 [00:00<00:00, 6009.03it/s] 
Main loop: 100%|████████████████████████████████████████████████████████████████| 10/10 [3:47:06<00:00, 1362.64s/it]

Saving...
Done!
  • The final result (as you can derive from the paper) is based on plain averaging, i.e. $(A*0.5+B*0.5)$. However, similar to TIES and AutoMBW, it looks better than plain averaging.
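The averaging step itself is simple; a hypothetical sketch (assumed helper name, and loading/saving of checkpoints omitted) that works on any mapping of key to tensor supporting scalar arithmetic, such as a PyTorch state dict:

```python
# Hypothetical sketch of the plain weighted-sum step, (A*alpha + B*(1-alpha))
# per key. In the rebasin pipeline this runs only after B's weights have
# been permuted into A's basis.
def weighted_sum_merge(state_a, state_b, alpha=0.5):
    """Per-key alpha * A + (1 - alpha) * B; keys missing from B copied from A."""
    merged = {}
    for key, value_a in state_a.items():
        if key in state_b:
            merged[key] = alpha * value_a + (1 - alpha) * state_b[key]
        else:
            merged[key] = value_a
    return merged
```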

  • Both comparisons are "avg / TIES-SOUP / avg(avg+TIES-SOUP) / rebasin(avg+TIES-SOUP)"

(Comparison grids: xyz_grid-0841-740330577-8064-1623-3-48-20240428123657.jpg, xyz_grid-0842-740330577-8064-1623-3-48-20240428125432.jpg)