p1atdev/LECO

RTX3080 - MultiGPU?

Moonlight63 opened this issue · 1 comment

First, I love this project. I have seen what others are making with it, and it seems really powerful for fine-tuning exactly the prompt you want.
The only way I have been able to get this to run locally on my 3080 10G is by lowering the resolution in the prompt to 352 with a batch size of 1. It looks like others have gotten it to work, but is everyone just using Colab? I'd like to run it locally. I have two 3080s, so theoretically I should have enough VRAM between them, but it looks like this isn't set up to train on both. I tried to add this myself using torch's DataParallel, but unfortunately I have no idea what I am doing when it comes to coding ML stuff yet. Any chance of getting multi-GPU to work, or any advice on lowering VRAM usage?
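For reference, the `DataParallel` approach mentioned above is usually a one-line wrap around the model, sketched below with a stand-in `nn.Sequential` instead of LECO's actual network (which this sketch does not reproduce). One caveat worth knowing: `DataParallel` replicates the whole model on every GPU and splits the *batch* across them, so with a batch size of 1 it saves no VRAM at all; pooling memory across two cards for a single sample would need model sharding (e.g. FSDP or manual device placement) instead.

```python
# Minimal sketch of torch.nn.DataParallel, with a toy stand-in model
# (assumption: the real training script exposes its network as a plain nn.Module).
import torch
import torch.nn as nn

# Stand-in for the real network being trained
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))

if torch.cuda.device_count() > 1:
    # Replicates the module on each GPU and scatters the input batch
    # across them -- a throughput optimization, NOT a VRAM pool.
    model = nn.DataParallel(model).cuda()

x = torch.randn(2, 8)   # batch of 2; DataParallel would split this across GPUs
out = model(x)
print(tuple(out.shape))  # (2, 4)
```

Because of the replication, lowering VRAM on a single card usually comes from other knobs (smaller resolution as above, mixed precision, gradient checkpointing) rather than from data parallelism.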

erocle commented

I don't code at all for this type of stuff, so I can't help you there, but I have a single 8GB 3070 and train at up to 768x locally just fine?