yaringal/DropoutUncertaintyCaffeModels

Activations in convolutional layers

MSusik opened this issue · 0 comments

Hi!

First of all, thanks for open-sourcing the experiments.

I'm currently trying to reproduce the results in a different framework. I noticed that in the definitions of the convolutional networks (the so-called "LeNet" in this repo), the convolutional layers have a linear activation: there is no ReLU layer after them. Is this intended?
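
For concreteness, here is a minimal prototxt sketch of the pattern I mean (layer names and parameters are illustrative, not copied from this repo). A typical LeNet-style definition places a ReLU layer directly after each convolution; in the definitions here, that ReLU seems to be absent, so the convolution output feeds the pooling layer directly:

```
# Typical LeNet-style block: ReLU immediately after the convolution.
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param { num_output: 20 kernel_size: 5 }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
# In this repo the ReLU layer above appears to be missing, i.e.
# conv1 goes straight into pooling with a linear activation.
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param { pool: MAX kernel_size: 2 stride: 2 }
}
```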

Thanks,
Mateusz