SHI-Labs/Neighborhood-Attention-Transformer

Tiny Bug in nattencuda.py

z-jiaming opened this issue · 1 comment

Great Work!

Well, I found a small bug in nattencuda.py.

pad_r = max(0, self.window_size - W)

It should use self.kernel_size for the padding when the feature size is smaller than the kernel size.
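
For context, here is a minimal sketch of the padding fix. The helper name pad_to_kernel_size, the (B, H, W, C) layout, and the use of torch.nn.functional.pad are assumptions for illustration, not the repository's exact code:

```python
import torch
import torch.nn.functional as F

def pad_to_kernel_size(x: torch.Tensor, kernel_size: int) -> torch.Tensor:
    """Pad a (B, H, W, C) feature map on the right/bottom so both spatial
    dims are at least `kernel_size` (hypothetical helper, not the repo's API)."""
    _, H, W, _ = x.shape
    pad_r = max(0, kernel_size - W)  # use kernel_size, not window_size
    pad_b = max(0, kernel_size - H)
    if pad_r or pad_b:
        # F.pad pads the last dims first: (C_left, C_right, W_left, W_right, H_top, H_bottom)
        x = F.pad(x, (0, 0, 0, pad_r, 0, pad_b))
    return x

# Example: a 5x5 feature map padded up for a 7x7 neighborhood.
x = torch.randn(1, 5, 5, 32)
print(pad_to_kernel_size(x, kernel_size=7).shape)  # torch.Size([1, 7, 7, 32])
```

Padding against kernel_size rather than window_size guarantees the attention kernel always has a full neighborhood to operate on, even for very small inputs.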

Thank you for your interest, and for bringing this to our attention.