What does A32/A64 mean?
Vincent-luo opened this issue · 2 comments
Vincent-luo commented
When using VPD for semantic segmentation, does A32/A64 mean that only the size-32/64 attention maps are used? In the source code, attn16, attn32, and attn64 are all used. How can I reproduce the results of VPD_A32 and VPD_A64?
wl-zhao commented
Thanks for your interest in our work! A32 means attn16+attn32 are used, and A64 means attn16+attn32+attn64 are used. A64 is the default; you can pass max_attn_size=32 to the UNetWrapper to enable A32.
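A minimal sketch of the selection logic described above, assuming the wrapper filters attention maps by their spatial size (the function and variable names here are hypothetical, not the actual UNetWrapper API):

```python
def select_attn_maps(attn_maps, max_attn_size=64):
    """Keep only attention maps whose spatial size is at most max_attn_size.

    attn_maps: dict mapping spatial size (16, 32, 64) to the corresponding
    attention tensors. max_attn_size=64 (the default) keeps all three groups
    (A64); max_attn_size=32 keeps only attn16 and attn32 (A32).
    """
    return {size: m for size, m in attn_maps.items() if size <= max_attn_size}

# Hypothetical usage: the strings stand in for the real attention tensors.
maps = {16: "attn16", 32: "attn32", 64: "attn64"}
print(sorted(select_attn_maps(maps, max_attn_size=32)))  # A32 -> [16, 32]
print(sorted(select_attn_maps(maps)))                    # A64 -> [16, 32, 64]
```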
Vincent-luo commented
Thanks for your reply! I'll try it.