Training LoRA: please give more info
Opened this issue · 1 comment
Maelstrom2014 commented
Hi, thanks for the excellent solution and resolution!
Please give more information about training a LoRA:
- How much VRAM do I need?
- How long does training take? How many epochs, and how many pictures do I need?
- What resolution/aspect ratios do the pictures in the dataset need to be?
catcathh commented
Thanks for your interest.
- When training a LoRA, VRAM usage depends heavily on the image resolution. For instance, at a resolution of 4096x4096, the VRAM requirement can reach approximately 60 GB on an A100.
- For the cat example I provided, training took around 5,000 iterations. To optimize the process, you may first train the model at a lower base resolution (such as 1024x1024) and then fine-tune it at higher resolutions.
- Using ~10 images with varying resolutions and aspect ratios should be sufficient for effective training. I provide the data used for the LoRA at https://github.com/catcathh/UltraPixel/tree/main/figures/example_lora_cat.
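The two-stage schedule above (a low-resolution base stage followed by high-resolution fine-tuning, totaling ~5,000 iterations) could be organized as a sketch like the following. This is purely illustrative — `make_schedule`, its parameters, and the 80/20 split between stages are hypothetical and not part of the actual UltraPixel training code:

```python
def make_schedule(base_res=1024, high_res=4096, total_iters=5000, base_frac=0.8):
    """Split a LoRA run into a low-res base stage and a high-res
    fine-tuning stage. The fraction assigned to each stage is an
    assumption, not a value from the UltraPixel repo."""
    base_iters = int(total_iters * base_frac)
    return [
        # Stage 1: train at the lower base resolution (e.g. 1024x1024)
        {"resolution": (base_res, base_res), "iterations": base_iters},
        # Stage 2: fine-tune at the higher target resolution
        {"resolution": (high_res, high_res), "iterations": total_iters - base_iters},
    ]

schedule = make_schedule()
for stage in schedule:
    print(stage["resolution"], stage["iterations"])
```

Each stage dict would then be fed to whatever training loop you use; only the idea of splitting the iteration budget across resolutions comes from the comment above.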