It's weird when I use distributed code
Feywell opened this issue · 0 comments
Feywell commented
I tried to change the code to use torch.distributed, but it gets stuck (no error is raised) at
SEAN/models/networks/normalization.py
Line 119 in 04c7536
I found that it completes if I replace that line with
torch.randn(x.shape[0], x.shape[3], x.shape[2], 1).cuda()
. It is weird.