kohya-ss/sd-scripts

Improvement to lora.py and flux lora.py

Opened this issue · 2 comments

I used this paper to implement the basic methodology into the lora.py network: https://github.com/DAMO-NLP-SG/Inf-CLIP

With network dim 32, SDXL now maintains a speed of 2.7 sec/it at a batch size of 40 in under 24 GB on a 4090. My FLUX implementation needs some help: I only managed a batch size of 3 with no split at dim 32, using Adafactor for both. Please take a look and let me know if I can help in any way.
lora (2).txt
lora_flux.txt
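For reference, the core idea in Inf-CLIP is to avoid materializing the full B x B similarity matrix that contrastive loss requires, computing it in tiles instead so memory grows with the tile size rather than the batch size. This is a minimal sketch of that tiling idea (forward pass only; the names `clip_loss_naive`/`clip_loss_chunked` are my own, not from the attached files, and the real Inf-CLIP implementation also tiles the backward pass):

```python
import torch
import torch.nn.functional as F

def clip_loss_naive(img, txt, temperature=0.07):
    # Standard symmetric InfoNCE loss: materializes the full B x B
    # logit matrix, so memory grows quadratically with batch size.
    logits = img @ txt.t() / temperature
    targets = torch.arange(img.size(0))
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))

def clip_loss_chunked(img, txt, temperature=0.07, chunk=8):
    # Same loss computed in row blocks: only a chunk x B tile of the
    # logit matrix exists at any time, bounding peak memory.
    B = img.size(0)
    loss_i2t, loss_t2i = 0.0, 0.0
    for start in range(0, B, chunk):
        end = min(start + chunk, B)
        tgt = torch.arange(start, end)  # global column indices for this tile
        rows = img[start:end] @ txt.t() / temperature  # chunk x B tile
        loss_i2t = loss_i2t + F.cross_entropy(rows, tgt, reduction="sum")
        cols = txt[start:end] @ img.t() / temperature
        loss_t2i = loss_t2i + F.cross_entropy(cols, tgt, reduction="sum")
    return 0.5 * (loss_i2t + loss_t2i) / B
```

Both functions return the same value; only the peak activation memory differs.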

I'm sure this can be adapted more appropriately for FLUX, but I lack the expertise and familiarity with all your moving parts. Let me know if there's anything I can send over.

It trains at a batch size of 60, but I need a bigger dataset.

From what I understand, the paper is about contrastive loss, so I don't think it can be applied to training image generation models, which use MSE loss.
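To illustrate the point above: contrastive loss couples every pair of samples in the batch, which is why Inf-CLIP's tiling is needed, but MSE loss decomposes per sample, so plain micro-batching already reproduces the full-batch loss exactly with no special machinery. A small sketch (the helper names are hypothetical, not from sd-scripts):

```python
import torch

def mse_full_batch(model, x, target):
    # Diffusion-style training loss: per-sample MSE with no
    # cross-sample terms, unlike contrastive loss.
    return torch.nn.functional.mse_loss(model(x), target)

def mse_microbatched(model, x, target, micro=4):
    # Because the loss decomposes over samples, summing per-micro-batch
    # losses and normalizing at the end gives the identical result.
    total = 0.0
    for s in range(0, x.size(0), micro):
        xb, tb = x[s:s + micro], target[s:s + micro]
        total = total + torch.nn.functional.mse_loss(
            model(xb), tb, reduction="sum")
    return total / target.numel()
```

Since the two are equal, gradient accumulation over micro-batches already recovers the full-batch gradient for MSE training; there is no quadratic memory term to tile away.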