csrhddlam/axial-deeplab

About local constraints

YLFF opened this issue · 2 comments

YLFF commented

I'm sorry if I'm misunderstanding the code, but does this repo implement the local constraints part for larger input sizes?
I assume that the AxialAttention module can only accept an input feature map whose size equals kernel_size * kernel_size, am I right?

Yes. This repo does not implement local axial attention, so the module only supports feature maps whose size is kernel_size * kernel_size.
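
For anyone else hitting this, here is a minimal sketch of why the constraint arises. This is not the repo's actual AxialAttention code; the class and parameter names are made up for illustration. The idea is that the relative position embedding is a learned table indexed by a grid precomputed from kernel_size, so the attended axis must have exactly kernel_size positions unless local (windowed) attention is added:

```python
# Toy 1D axial attention: the positional index grid is fixed at
# kernel_size x kernel_size, so the axis length must equal kernel_size.
import torch
import torch.nn as nn

class ToyAxialAttention1D(nn.Module):
    def __init__(self, dim, kernel_size):
        super().__init__()
        self.kernel_size = kernel_size
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        # Relative position table: one embedding per offset in [-(k-1), k-1].
        self.rel_pos = nn.Parameter(torch.randn(2 * kernel_size - 1, dim))
        # Precomputed index grid, fixed to kernel_size x kernel_size.
        idx = torch.arange(kernel_size)
        self.register_buffer('rel_idx', idx[None, :] - idx[:, None] + kernel_size - 1)

    def forward(self, x):
        # x: (batch, length, dim). length must equal kernel_size, otherwise
        # the (length x length) attention map cannot be combined with the
        # (kernel_size x kernel_size) positional term.
        b, n, d = x.shape
        assert n == self.kernel_size, 'axis length must match kernel_size'
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = q @ k.transpose(-2, -1) / d ** 0.5       # (b, n, n)
        pos = self.rel_pos[self.rel_idx]                # (n, n, dim)
        attn = attn + (q.unsqueeze(2) * pos).sum(-1)    # add positional bias
        return attn.softmax(dim=-1) @ v

layer = ToyAxialAttention1D(dim=8, kernel_size=16)
out = layer(torch.randn(2, 16, 8))       # works: axis length == kernel_size
# layer(torch.randn(2, 32, 8))           # fails: would need local attention
```

With local constraints as described in the paper, each position would only attend to a window of kernel_size neighbors along the axis, so arbitrary feature map sizes could be handled; that part is what this repo does not implement.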

YLFF commented

Yes, thank you for answering!