Error in code or paper?
qwert1337 opened this issue · 1 comment
Hi, nice work first of all!
I stumbled across the ReLU activations in the code:
```python
x = x + self.down3(self.relu(x_))
x_ = x_ + F.interpolate(
    self.compression3(self.relu(layers[2])),
    size=[height_output, width_output],
    mode='bilinear')
```
DDRNet/segmentation/DDRNet_23_slim.py, line 312 at ba659f9
In the paper, Fig. 3 shows no activations after the blocks, only after the bilateral fusion. Now I'm wondering which is correct?
Hi, please follow the code. The pre-trained models work well with the current code. In fact, one ReLU won't affect performance much.
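For anyone weighing the two designs, here is a minimal, self-contained PyTorch sketch contrasting the two ReLU placements. The class name, channel sizes, and shapes are hypothetical stand-ins; only the fusion pattern mirrors the snippet quoted above (`compression3` and `down3` play the same roles), so treat it as an illustration, not the repository's actual module.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BilateralFusionSketch(nn.Module):
    """Toy bilateral fusion between a low-resolution branch (x) and a
    high-resolution branch (x_). `relu_before_fusion=True` mimics the
    released code; `False` mimics Fig. 3 of the paper. All names and
    sizes here are illustrative assumptions."""

    def __init__(self, c_low=64, c_high=32, relu_before_fusion=True):
        super().__init__()
        self.relu = nn.ReLU()
        # 1x1 conv squeezing low-res channels before upsampling
        # (stand-in for compression3)
        self.compression3 = nn.Conv2d(c_low, c_high, kernel_size=1)
        # strided 3x3 conv pushing high-res features into the low-res
        # branch (stand-in for down3)
        self.down3 = nn.Conv2d(c_high, c_low, kernel_size=3,
                               stride=2, padding=1)
        self.relu_before_fusion = relu_before_fusion

    def forward(self, x, x_):
        h, w = x_.shape[2], x_.shape[3]
        if self.relu_before_fusion:
            # As in the released code: ReLU on the block outputs
            # *before* the cross-branch convolutions.
            low, high = self.relu(x), self.relu(x_)
        else:
            # As drawn in the paper: raw block outputs enter the fusion.
            low, high = x, x_
        out_low = x + self.down3(high)
        out_high = x_ + F.interpolate(
            self.compression3(low), size=[h, w], mode='bilinear')
        if not self.relu_before_fusion:
            # Paper variant: activation only after the bilateral fusion.
            out_low, out_high = self.relu(out_low), self.relu(out_high)
        return out_low, out_high


x = torch.randn(1, 64, 16, 16)   # low-resolution branch
x_ = torch.randn(1, 32, 32, 32)  # high-resolution branch
for flag in (True, False):
    out_low, out_high = BilateralFusionSketch(relu_before_fusion=flag)(x, x_)
    print(flag, out_low.shape, out_high.shape)
```

Either way, the fusion itself is the same additive exchange between branches; the only difference is whether the activation precedes or follows it, which is consistent with the reply above that one ReLU barely changes performance.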