xmu-xiaoma666/External-Attention-pytorch

The DANet attention module raises an error when I change the H and W of the input feature map

henbucuoshanghai opened this issue

import torch
from model.attention.DANet import DAModule

if __name__ == '__main__':
    # works: the input's spatial size (7x7) matches H=7, W=7
    input = torch.randn(3, 256, 7, 7)
    danet = DAModule(d_model=256, kernel_size=3, H=7, W=7)
    print(danet(input).shape)

The code above works with input = torch.randn(3, 256, 7, 7), but when I change the input to torch.randn(3, 256, 128, 128), it fails.
Why?
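
A likely explanation, inferred only from the constructor signature shown above and not confirmed by the repo author: DAModule takes H and W as constructor arguments, which suggests the spatial size is baked into the module's layers at construction time, so a module built with H=7, W=7 cannot accept a 128x128 feature map. A minimal sketch of the presumed fix, constructing the module with H and W matching the new input:

import torch
from model.attention.DANet import DAModule

if __name__ == '__main__':
    # Sketch, assuming the H/W constructor arguments must match the
    # input's spatial size (they appear fixed once the module is built).
    input = torch.randn(3, 256, 128, 128)
    danet = DAModule(d_model=256, kernel_size=3, H=128, W=128)
    print(danet(input).shape)  # expected: torch.Size([3, 256, 128, 128])

Note that under this assumption each DAModule instance is tied to one spatial size: a module constructed for 128x128 inputs would likewise reject 7x7 inputs.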