Model 2 subbatch_size not working
smilenaderi opened this issue · 3 comments
smilenaderi commented
Hi, I'm trying to run inference on a sequence of length 548 on an A10 GPU with model 2.
I tried a subbatch size of 1, but GPU memory is still insufficient and the run fails with an out-of-memory error.
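For context, a `--subbatch_size`-style option is usually implemented as chunked evaluation: the input is split along one dimension and processed piece by piece, so activations for the whole batch are never materialized at once. Below is a minimal, hypothetical PyTorch sketch of that idea (the helper name and shapes are illustrative, not the repository's actual code):

```python
import torch

def run_in_subbatches(module, x, subbatch_size, dim=0):
    """Apply `module` to `x` in chunks along `dim` to cap peak activation memory.

    Illustrates the general trade-off behind a --subbatch_size flag:
    more compute passes in exchange for a smaller memory footprint.
    """
    chunks = [module(chunk) for chunk in torch.split(x, subbatch_size, dim=dim)]
    return torch.cat(chunks, dim=dim)

if __name__ == "__main__":
    # Toy per-row MLP over a (num_rows, seq_len, channels) tensor.
    mlp = torch.nn.Sequential(
        torch.nn.Linear(64, 256),
        torch.nn.ReLU(),
        torch.nn.Linear(256, 64),
    )
    x = torch.randn(512, 548, 64)  # e.g. a 548-residue sequence
    with torch.no_grad():
        y = run_in_subbatches(mlp, x, subbatch_size=1, dim=0)
    print(y.shape)  # torch.Size([512, 548, 64])
```

If some part of model 2 does not route its computation through this kind of chunking, peak memory would be unaffected by the flag, which would be consistent with the behavior reported here.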
wewewexiao2008 commented
@RuiWang1998 Same problem here. --subbatch_size does not seem to work with model 2, but it works fine with model 1.
KYQiu21 commented
Have you solved this? A ~300-AA protein with subbatch_size 128 still hit an out-of-memory error on an A100 (40 GB).
smilenaderi commented
No, I'm just working with model 1.