tengshaofeng/ResidualAttentionNetwork-pytorch

An Input Size Question

onlyonewater opened this issue · 1 comment

Hi @tengshaofeng, thanks! But I have a question: in attention_module.py, the input size of the class AttentionModule_stage0 is 112×112, but in the class AttentionModule_stage1 the input size is 56×56. Is there a max-pool layer used in between? I don't think this is mentioned in the paper.
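For context, going from 112×112 to 56×56 only requires one stride-2 layer between the stages; either a max-pool or a stride-2 convolution/residual unit would halve the resolution. A minimal sketch of the output-size arithmetic (the kernel/stride/padding values below are illustrative assumptions, not taken from the repository):

```python
def spatial_out_size(size, kernel=3, stride=2, padding=1):
    """Spatial size after a conv or pool layer:
    floor((size + 2*padding - kernel) / stride) + 1."""
    return (size + 2 * padding - kernel) // stride + 1

# A single stride-2 layer halves 112x112 down to 56x56,
# matching the gap between AttentionModule_stage0 and AttentionModule_stage1.
print(spatial_out_size(112))  # 56

# A 2x2 max-pool with stride 2 and no padding gives the same result.
print(spatial_out_size(112, kernel=2, stride=2, padding=0))  # 56
```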

Sorry, I've figured out the answer to this question; please close this issue. Thanks!