Softmax is missing in the non-local module
HIT-cwh commented
First of all, thank you for sharing your valuable code.
As shown in Fig. 2(c) of the paper, a softmax is applied to the matrix product to produce the dot-product attention weights, but this step appears to be missing in the NonLocalBlockND of the released code. Moreover, computing this attention matrix may be time-consuming, since the matrix is large (e.g., 300x300 or 150x150).
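For reference, here is a minimal sketch of where the softmax would go in an embedded-Gaussian non-local block. The class and layer names (theta/phi/g/W) follow the paper's notation rather than the repository's exact NonLocalBlockND, so treat this as an illustration, not the released implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NonLocalBlock2D(nn.Module):
    """Sketch of a 2D non-local block (embedded Gaussian variant).

    Hypothetical names following the paper's notation; not the
    repository's NonLocalBlockND.
    """

    def __init__(self, in_channels, inter_channels=None):
        super().__init__()
        self.inter_channels = inter_channels or in_channels // 2
        self.theta = nn.Conv2d(in_channels, self.inter_channels, kernel_size=1)
        self.phi = nn.Conv2d(in_channels, self.inter_channels, kernel_size=1)
        self.g = nn.Conv2d(in_channels, self.inter_channels, kernel_size=1)
        self.W = nn.Conv2d(self.inter_channels, in_channels, kernel_size=1)

    def forward(self, x):
        b, _, h, w = x.size()
        # flatten spatial dims: N = h * w
        theta = self.theta(x).view(b, self.inter_channels, -1)           # (b, c', N)
        phi = self.phi(x).view(b, self.inter_channels, -1)               # (b, c', N)
        g = self.g(x).view(b, self.inter_channels, -1).permute(0, 2, 1)  # (b, N, c')

        # pairwise similarity matrix: (b, N, N) -- this is the matrix
        # whose size can reach e.g. 300x300 or 150x150
        f = torch.matmul(theta.permute(0, 2, 1), phi)

        # the step from Fig. 2(c): softmax over the last dim turns the
        # raw products into normalized attention weights
        attn = F.softmax(f, dim=-1)

        y = torch.matmul(attn, g)                                        # (b, N, c')
        y = y.permute(0, 2, 1).contiguous().view(b, self.inter_channels, h, w)
        return x + self.W(y)                                             # residual connection
```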
Looking forward to your reply. Thanks.