hujie-frank/SENet

Can the squeeze-and-excitation modules be removed during the SENet forward pass?

chaipangpang opened this issue · 1 comment

Hi, I have a question about the squeeze-and-excitation architecture in the forward pass. Once an SENet has been trained, the squeeze-and-excitation modules have already influenced the weights of the ResNet backbone, so is it still necessary to keep the squeeze-and-excitation architecture during the forward pass?

The channel-wise weights produced by each SE module depend on the input feature maps. The SE modules therefore cannot be squashed away by any modification of the backbone weights of a well-trained SE model.
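To make this concrete, here is a minimal PyTorch-style sketch of an SE block (not the repository's Caffe implementation; the class and parameter names are illustrative). It shows that the per-channel scale is recomputed from each input feature map, so there is no fixed reweighting of the backbone that can reproduce it:

```python
# Minimal SE block sketch: the excitation weights are a function of the
# current input, so they cannot be folded into the convolution weights.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool2d(1)           # global average pooling
        self.excite = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        s = self.squeeze(x).view(b, c)        # depends on the current input x
        s = self.excite(s).view(b, c, 1, 1)   # channel-wise weights in (0, 1)
        return x * s                          # rescale the input feature maps

# Two different inputs produce two different sets of channel weights,
# so the SE block cannot be replaced by a constant scaling of the backbone.
se = SEBlock(64)
x1, x2 = torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)
print(torch.allclose(se(x1) / x1, se(x2) / x2))  # False (almost surely)
```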