sbaldu/Neural_network_HEP

Allow different activation functions for different layers

Opened this issue · 0 comments

sbaldu commented

Right now the activation function is a template parameter of the network, which means that it's the same for all the hidden layers and, in particular, for the output layer.
It would be better and more general to allow each layer to use its own activation function, e.g. ReLU in the hidden layers and a sigmoid at the output.