DDPM inference abnormal memory size
YoannRandon opened this issue · 2 comments
YoannRandon commented
Hello, I trained a Palette model using the tutorials at "joligen.com/doc/".
Training has now finished and I want to run inference to check my model's performance ("visdom" didn't update under the "nohup" command; I don't know why).
So I used the command described in the tutorial:
but I end up with the following error:
I think 29 GB is abnormally large; any thoughts on why the required memory is this high?
YoannRandon commented
beniz commented
Hi, try with a smaller image to start with. Inference does not use amp/tf32 and/or torch.compile at the moment (training does, with the options --with_amp and --with_tf32), but it could be added; in practice it saves half the memory.
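For reference, a minimal sketch of what amp-style inference would look like, assuming a PyTorch model (the `torch.nn.Linear` module here is a hypothetical stand-in for the Palette denoising network, not JoliGEN code). Running the forward pass under `torch.autocast` stores activations in a half-precision dtype, which is where the roughly 2x memory saving comes from:

```python
import torch

# Hypothetical stand-in for the trained Palette/DDPM network.
model = torch.nn.Linear(8, 8)
x = torch.randn(1, 8)

# inference_mode disables autograd bookkeeping; autocast runs eligible ops
# in a reduced-precision dtype (bfloat16 on CPU, float16 on CUDA).
with torch.inference_mode(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)

print(y.dtype)  # torch.bfloat16
```

On a CUDA device the same pattern applies with `device_type="cuda"` and `dtype=torch.float16`; the model weights stay in float32 and autocast casts per-op.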