snuspl/dolphin

Separate activation function from fully connected layer


Currently, the activation function is part of FullyConnectedLayer. However, in many neural network models such as AlexNet and GoogLeNet, the ReLU activation function is applied to the outputs of both convolutional layers and fully connected layers.
We could add the activation function feature to each of these layers as well. However, to reduce duplication and stay more general, it is better to introduce separate layers for activation functions, as Caffe and Apache SINGA do.
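For illustration, here is a minimal sketch of what a standalone activation layer could look like. The names (`ActivationLayer`, `feedForward`, `backPropagate`) are hypothetical and do not refer to Dolphin's actual API; the point is that such a layer holds no weights and can be placed after a fully connected or convolutional layer alike:

```java
import java.util.function.DoubleUnaryOperator;

/**
 * Hypothetical sketch of a layer that only applies an element-wise
 * activation function, in the spirit of Caffe's ReLU layer.
 * Not Dolphin's actual API.
 */
final class ActivationLayer {
  private final DoubleUnaryOperator activation;  // e.g. ReLU
  private final DoubleUnaryOperator derivative;  // used for backprop
  private double[] lastInput;                    // cached for the backward pass

  ActivationLayer(final DoubleUnaryOperator activation,
                  final DoubleUnaryOperator derivative) {
    this.activation = activation;
    this.derivative = derivative;
  }

  /** Forward pass: apply the activation element-wise. */
  double[] feedForward(final double[] input) {
    lastInput = input.clone();
    final double[] output = new double[input.length];
    for (int i = 0; i < input.length; i++) {
      output[i] = activation.applyAsDouble(input[i]);
    }
    return output;
  }

  /** Backward pass: scale the upstream gradient by the activation's derivative. */
  double[] backPropagate(final double[] upstreamGradient) {
    final double[] gradient = new double[upstreamGradient.length];
    for (int i = 0; i < gradient.length; i++) {
      gradient[i] = upstreamGradient[i] * derivative.applyAsDouble(lastInput[i]);
    }
    return gradient;
  }

  /** ReLU as an example activation, reusable after any layer type. */
  static ActivationLayer relu() {
    return new ActivationLayer(
        x -> Math.max(0.0, x),      // ReLU(x) = max(0, x)
        x -> x > 0.0 ? 1.0 : 0.0);  // ReLU'(x)
  }
}
```

Because the layer carries no parameters, the same implementation can be reused after fully connected and convolutional layers alike, which is exactly the duplication this issue aims to avoid.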