Difference between the RFBNet in RFB_Net_vgg.py and Fig. 5?
LightToYang opened this issue · 9 comments
Hello, thank you for releasing the code.
In Fig. 5, there are two branches from the fc7 layer output: one is the input of the RFB module and the other is the input of the RFB (stride 2) module.
# apply vgg up to fc7
for k in range(23, len(self.base)):
    x = self.base[k](x)

# apply extra layers and cache source layer outputs
for k, v in enumerate(self.extras):
    x = v(x)
    if k < self.indicator or k % 2 == 0:
        sources.append(x)
But in RFB_Net_vgg.py, the output x of the fc7 layer is the input of the RFB layer, and the output of the RFB layer is then the input of the RFB (stride 2) layer, so there are not two branches. Should the architecture of RFBNet instead be like the picture below?
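To make the difference concrete, here is the dataflow being compared, sketched as comments (my reading of the figure and the code, not taken verbatim from the repo):

# Fig. 5 as drawn (two branches from fc7):
#     fc7 --> RFB            --> sources
#     fc7 --> RFB (stride 2) --> sources
#
# RFB_Net_vgg.py as released (one sequential chain):
#     fc7 --> RFB --> sources
#                 \-> RFB (stride 2) --> sources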
@LightToYang Yes, you are right! Thanks a lot for pointing out this mistake. Actually, all my experiments use the architecture in your picture; I just overlooked this detail when drawing Fig. 5, sorry for the confusion~
Are you sure that RFB-s is applied to conv4_3? I cannot find the operation; there is only extra['300'] = [1024,512,256], corresponding to RFB. What's more, in your paper's Fig. 4 the RFB consists of rate = [1, 3, 5], while the code uses [2, 3, 5].
@zHanami RFB-s is added as self.Norm in the RFBNet class.
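For anyone else looking for it, here is a minimal sketch of where self.Norm sits in the forward pass, assuming it follows the usual SSD layout (a reconstruction, not a verbatim excerpt):

# apply vgg up to conv4_3 (through its ReLU)
for k in range(23):
    x = self.base[k](x)
s = self.Norm(x)       # self.Norm is the RFB-s module here, not SSD's L2Norm
sources.append(s)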
@ruinmessi Thanks, I mistook it for the L2Norm in SSD. But can you address the rate discrepancy between your paper and code that I mentioned?
@zHanami Thanks, I checked the code and you are right: the rate in the code is different from the paper.
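For reference, the rate here is the dilation of the 3x3 convolution inside each RFB branch. A hypothetical side-by-side in PyTorch (the channel counts are illustrative, not from the repo):

import torch.nn as nn

# rate = 1 as in the paper's Fig. 4 vs. rate = 2 as in the released code;
# padding is set equal to the dilation so the output spatial size is unchanged
branch_paper = nn.Conv2d(256, 256, kernel_size=3, padding=1, dilation=1)
branch_code = nn.Conv2d(256, 256, kernel_size=3, padding=2, dilation=2)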
BTW, there is no fc layer in DeepLab-v1 or in your code, if I am not mistaken.
Why is it 23 in for k in range(23, len(self.base)):?
Hello, take a closer look at the VGG definition: the 23 is there because we want the output of the conv4_3 layer.
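One way to verify this yourself, assuming the standard SSD-style vgg() builder this repo uses (in that layout conv4_3 sits at index 21 and its ReLU at index 22):

# enumerate the base network and inspect the indices
for i, layer in enumerate(self.base):
    print(i, layer)
# indices 0-22 cover conv1_1 through conv4_3 and its ReLU,
# so range(23, len(self.base)) resumes at pool4 and runs up to fc7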
@LightToYang
Did you solve your problem? I agree with you. The code should be changed to something like the following:
# apply extra layers and cache source layer outputs
for k, v in enumerate(self.extras):
    if k == 0:
        fc7 = x      # save the fc7 output before the first RFB module
    x = v(x)
    if k < self.indicator or k % 2 == 0:
        features.append(x)
    if k == 0:
        x = fc7      # restore fc7 so RFB (stride 2) also branches from it, as in Fig. 5
I got 80.12% after that.