xmu-xiaoma666/External-Attention-pytorch

Wrong BAM module in your code

Luo-Z13 opened this issue · 0 comments

I noticed that the forward code in BAM is:

```python
def forward(self, x):
    b, c, _, _ = x.size()
    sa_out = self.sa(x)
    ca_out = self.ca(x)
    weight = self.sigmoid(sa_out*ca_out)
    weight = self.sigmoid(sa_out+ca_out) # here
    out = (1+weight)*x
    return out
```
The line marked `# here` should use `*` instead of `+`.
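To make the proposal concrete, here is a minimal, self-contained sketch of the forward pass with `*` in place of `+`. The `sa_branch`/`ca_branch` callables are hypothetical stand-ins for the module's `self.sa` and `self.ca` attention branches, which I haven't reproduced here:

```python
import torch

def bam_forward(x, sa_branch, ca_branch):
    # sa_branch / ca_branch stand in for BAM's self.sa and self.ca
    sa_out = sa_branch(x)                     # spatial attention map
    ca_out = ca_branch(x)                     # channel attention map
    weight = torch.sigmoid(sa_out * ca_out)   # proposed: '*' instead of '+'
    return (1 + weight) * x                   # residual refinement, as in the repo

# quick shape check with identity branches as placeholders
x = torch.randn(2, 16, 8, 8)
y = bam_forward(x, lambda t: t, lambda t: t)
assert y.shape == x.shape
```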