ZJCV/NetworkSlimming

Question about pruning at the residual connection of ResNet

Opened this issue · 5 comments

Dear author, this repo is a wonderful work. I have recently been troubled by the residual connection when pruning with network slimming. I read a little of your implementation, and it seems there is no specific manipulation to align the bottleneck input and the residual output, or, in some cases, the downsampled input and the residual output. In the end, we have to add them together in ResNet, so alignment seems necessary.
Hope to get your answer.

Hi @Yindong-Zhang, in this repo, channel alignment is necessary when pruning the model.

For ResNet, each bottleneck keeps the same input/output channel count within every stage (except the input of the first bottleneck), so before pruning one stage, I count each bottleneck's surviving output channels and choose the biggest count as the stage's output channel number.
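One way to read that alignment step is the following rough sketch (plain Python; the function and variable names are illustrative, not the repo's actual code): threshold each bottleneck's BN scaling factors against the global slimming threshold, then share the mask of the bottleneck that keeps the most channels across the whole stage.

```python
def stage_shared_mask(bn_gammas, threshold):
    """Hypothetical sketch of stage-wide channel alignment.

    bn_gammas: one list of BN scaling factors (gammas) per bottleneck
               in the stage, taken from each bottleneck's last BN layer.
    threshold: the global network-slimming pruning threshold.
    Returns a single keep-mask shared by every bottleneck in the stage.
    """
    # Mask of channels each bottleneck would keep on its own.
    masks = [[abs(g) > threshold for g in gammas] for gammas in bn_gammas]
    counts = [sum(m) for m in masks]
    # Share the mask of the bottleneck keeping the most channels, so
    # every residual addition in the stage sees matching channel indices.
    return masks[counts.index(max(counts))]
```

For example, with per-bottleneck gammas `[[0.5, 0.01, 0.3], [0.02, 0.4, 0.03]]` and threshold `0.1`, the first bottleneck keeps 2 channels and the second keeps 1, so the stage-wide mask becomes `[True, False, True]`. Note that this can drop a channel a smaller bottleneck wanted to keep, which is the approximation inherent in choosing the biggest count.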

What happens in the case where the bottlenecks' surviving output channels don't overlap with each other? Choosing the biggest one seems to be an approximation.

Yes, because I want to fit the architecture of ResNet. If you have another solution, I look forward to hearing it.

Thanks for your answer. Actually I have no better solution, but comparative experiments may serve as supporting evidence for your strategy. Hope to see that happen.

En, actually I have done some experiments with this repo. As for the comparison between this repo and the original paper, maybe yes, maybe no. Again, this implementation of ResNet pruning is just because I want to fit the architecture.