hako-mikan/sd-webui-lora-block-weight

Optimizing LoRA Configurations: Exploring Block Pruning for Enhanced Learning


Hey everyone, I've been experimenting with various LoRA configurations ever since I found this extension. In some cases, enabling just 1 of the 17 blocks is enough to activate the feature the LoRA is supposed to add, which is quite peculiar. Does this imply that all the other blocks are essentially noise and could be removed? And is there a way to improve the training process so it performs this kind of pruning automatically?
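
For concreteness, here's the kind of pruning I have in mind after identifying the active block with an `lbw=` sweep. This is just a rough sketch, not a tested tool: it assumes a kohya-style LoRA saved as `.safetensors` whose UNet keys follow the diffusers naming scheme (`lora_unet_down_blocks_*`, `lora_unet_mid_block_*`, `lora_unet_up_blocks_*`); the file paths and the block prefix to keep are placeholders, so adjust them to whichever block your sweep showed carries the effect.

```python
# Sketch: keep only one UNet block's LoRA weights and drop the rest.
# Assumes kohya-style key names; paths and KEEP_PREFIX are placeholders.
from safetensors.torch import load_file, save_file

SRC = "my_lora.safetensors"          # hypothetical input path
DST = "my_lora_pruned.safetensors"   # hypothetical output path
KEEP_PREFIX = "lora_unet_mid_block"  # e.g. keep only the MID block

state = load_file(SRC)
pruned = {
    k: v for k, v in state.items()
    # keep the chosen UNet block plus the text-encoder weights
    if k.startswith(KEEP_PREFIX) or k.startswith("lora_te")
}
print(f"kept {len(pruned)} of {len(state)} tensors")
save_file(pruned, DST)
```

In other words, one could sweep the 17 positions of the weight string one at a time, find which block(s) actually carry the effect, and then strip everything else from the file, rather than relying on training to learn sparse blocks on its own.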