ygean/SE_DenseNet

More variant architectures of se_densenet need to be tried and tested

John1231983 opened this issue · 8 comments

Hi, thanks for sharing your experiment results. I checked and found that you may have some redundant code in the `_DenseLayer` that adds the SE block after the convolution. You added it both inside the loop (for) and after the first convolution. Why do you add the SE block in `_DenseLayer` again? Thanks
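For context, the SE (Squeeze-and-Excitation) block being discussed computes per-channel gates and rescales the feature map. Below is a minimal NumPy sketch of that computation; the actual repository implements it as PyTorch modules with learned weights, so the random weights and the function name here are illustrative placeholders only:

```python
import numpy as np

def se_block(x, reduction=16):
    """Squeeze-and-Excitation sketch on a feature map of shape (C, H, W).

    Hypothetical NumPy version: weights are random placeholders here,
    but are learned parameters in the real (PyTorch) implementation.
    """
    c = x.shape[0]
    # Squeeze: global average pooling over spatial dims -> (C,)
    s = x.mean(axis=(1, 2))
    # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid
    w1 = np.random.randn(c // reduction, c) * 0.01
    w2 = np.random.randn(c, c // reduction) * 0.01
    z = np.maximum(w1 @ s, 0.0)               # ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ z)))    # per-channel weights in (0, 1)
    # Scale: reweight each channel of the input
    return x * gate[:, None, None]
```

Applying this block twice in a row (once in the loop and once after the first convolution) just gates the channels a second time, which is the redundancy being pointed out.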

ygean commented

@John1231983 Hi, I found the redundant code you pointed out; I think it's my mistake. To test se_densenet, I have written some scripts today to run it on the CIFAR-10 dataset, and I will check whether removing the redundant code affects the train/test results. Let's wait a few days to see the result, and I will update the README within a week.

Good. And consider removing the SE block after the transition block as well. We often use the SE block after the dense block only; the transition block only helps to reduce the feature size.
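To see why the transition block is mostly a size-reduction step, here is a shape-level NumPy sketch of a DenseNet transition (1x1 conv compressing channels, then 2x2 average pooling). Weights are random placeholders and the function name is hypothetical; the point is only what happens to the feature-map dimensions:

```python
import numpy as np

def transition(x, theta=0.5):
    """DenseNet transition sketch: 1x1 conv (channel compression by
    factor theta) followed by 2x2 average pooling with stride 2.

    Illustrative only: weights are random, not learned.
    """
    c, h, w = x.shape
    out_c = int(c * theta)
    # 1x1 convolution == a per-pixel linear map over channels
    w1 = np.random.randn(out_c, c) * 0.01
    y = np.einsum('oc,chw->ohw', w1, x)
    # 2x2 average pooling with stride 2 halves each spatial dim
    y = y.reshape(out_c, h // 2, 2, w // 2, 2).mean(axis=(2, 4))
    return y
```

Since the transition only compresses channels and downsamples, an extra SE gate there arguably adds parameters without reweighting any newly extracted features.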

ygean commented

@John1231983 Yes, thank you for your suggestions. I will take them and run more comparative experiments.

ygean commented

@John1231983 Hi, John. I updated my test results just now, please check them. Thank you very much.

ygean commented

@John1231983 It's worth testing. I will update with new results after the job is done, please keep watching. Thanks.

ygean commented

The new test results have been updated.
I will release the training and test code in a few days.

Good. That is what I expected. You can also try the following:

  1. SE block in the loop only after the dense block (remove the SE block in the transition)
  2. SE block in the loop only after the transition block (remove the SE block in the dense block)

Actually, we do not know where the SE block will be most beneficial in the DenseNet architecture. In my opinion, the first case may give the better result.
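The two proposed placements can be sketched as alternative stage orderings. This is a hypothetical helper that just enumerates stage names (the real model would wire up actual PyTorch modules in this order):

```python
def se_densenet_variant(case, num_blocks=3):
    """Enumerate the stage order for the two suggested variants.

    case 1: SE only after each dense block (no SE after transitions)
    case 2: SE only after each transition block (no SE after dense blocks)

    Stage names are illustrative placeholders, not real module names.
    """
    stages = []
    for i in range(num_blocks):
        stages.append(f"dense_block_{i}")
        if case == 1:
            stages.append(f"se_{i}")
        if i < num_blocks - 1:           # no transition after the last block
            stages.append(f"transition_{i}")
            if case == 2:
                stages.append(f"se_{i}")
    return stages
```

Comparing the two orderings side by side makes it easy to see that case 2 also has one fewer SE block overall, since there is no transition (and hence no SE) after the final dense block.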