Activations in convolutional layer
MSusik opened this issue · 0 comments
MSusik commented
Hi!
First of all, thanks for open-sourcing the experiments.
I'm currently trying to reproduce the results in a different framework. I noticed that in the definition of the convolutional networks (the so-called "LeNet" in this repo), the convolutional layers have a linear activation function (no ReLU follows them). Is this intended?
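To make the question concrete, here is a minimal NumPy sketch of the difference I mean (the shapes, kernel, and helper name are mine, not from the repo): with a linear activation the conv output keeps its negative values, whereas a trailing ReLU would clip them to zero.

```python
import numpy as np

def conv2d(x, k):
    # Naive valid-mode 2D cross-correlation (illustration only).
    h, w = x.shape
    kh, kw = k.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
k = np.array([[1.0, 0.0],
              [0.0, -1.0]])

linear = conv2d(x, k)           # linear activation: negatives are kept
relu = np.maximum(linear, 0.0)  # ReLU activation: negatives clipped to 0

print(linear[0, 0])  # -5.0: the linear output is negative here
print(relu[0, 0])    # 0.0: ReLU would have zeroed it
```

In other words, the question is whether the negative pre-activations are meant to flow unchanged into the next layer.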
Thanks,
Mateusz