CReLU and BatchNorm
yulizhou opened this issue · 3 comments
yulizhou commented
Hi, I'm reading the paper and curious about your implementation.
The CReLU layer seems to be defined but never used; instead, the code re-implements it inline during layer construction.
Also, the paper includes a batch norm layer, but it isn't implemented here.
What was the consideration behind this implementation? Better performance?
Thanks
lxg2015 commented
Either way of implementing CReLU is fine. With the BN layer, FaceBoxes should get better results; I just forgot to add the batch norm layer, thanks
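For reference, CReLU itself is simple regardless of where it is defined: it concatenates the positive and negative parts of the input along the channel axis, doubling the channel count (which is why FaceBoxes can halve the convolution's output channels). A minimal numpy sketch, not the repo's actual PyTorch code:

```python
import numpy as np

def crelu(x, axis=1):
    # CReLU(x) = concat(ReLU(x), ReLU(-x)) along the channel axis.
    # Output has twice as many channels as the input.
    return np.concatenate([np.maximum(x, 0), np.maximum(-x, 0)], axis=axis)

x = np.array([[1.0, -2.0, 0.5]])   # shape (1, 3)
y = crelu(x)                        # shape (1, 6)
```

In the full FaceBoxes block the order would be Conv → BatchNorm → CReLU, with BN applied before the concatenation so both halves share the same normalized statistics.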
abeardear commented