Eniac-Xie/PyConvNet

I am confused by your tutorial: do you consider the activation function?



The tutorial is from someone else's homepage, not mine. Could you describe your question in more detail?

I mean that an activation function, such as the sigmoid, is usually applied to the weighted sum $W*X$. However, in the PDF tutorial I find no activation function. Is it reasonable not to use one? Thanks; I am a beginner in CNNs.
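For context, here is a minimal NumPy sketch of what I mean by applying an activation to the weighted sum (shapes and names are illustrative, not from this repo):

```python
import numpy as np

def sigmoid(z):
    # squashes each element into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

W = np.random.randn(4, 3)  # weights (illustrative shape)
x = np.random.randn(3)     # input vector
z = W @ x                  # weighted sum W*X
a = sigmoid(z)             # activation applied to the weighted sum
```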

In CNN models we usually use ReLU (part 5 in the PDF tutorial) instead of sigmoid as the activation function. The sigmoid function has fallen out of favor in CNNs; please refer to http://cs231n.github.io/neural-networks-1/#actfun for more detail about sigmoid and ReLU.
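A minimal NumPy sketch of the two activations (illustrative only, not code from this repo); note that ReLU does not saturate for large positive inputs, while sigmoid squashes everything into (0, 1):

```python
import numpy as np

def relu(z):
    # ReLU: element-wise max(0, z); gradient is 1 for z > 0
    return np.maximum(0, z)

def sigmoid(z):
    # sigmoid saturates (gradient near 0) for large |z|
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(relu(z))     # [0.   0.   0.   0.5  3. ]
print(sigmoid(z))  # all values squashed into (0, 1)
```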

Thanks, I got it. In the tutorial, a Conv layer is followed by a ReLU layer, so the ReLU layer serves as the activation function. I am sorry I had not read the tutorial thoroughly.
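For other readers, a minimal sketch of that Conv -> ReLU pairing on a single channel, using NumPy/SciPy (illustrative, not this repo's implementation):

```python
import numpy as np
from scipy.signal import correlate2d

image  = np.random.randn(8, 8)   # input feature map (illustrative)
kernel = np.random.randn(3, 3)   # conv filter
bias   = 0.1

conv_out  = correlate2d(image, kernel, mode='valid') + bias  # Conv layer
activated = np.maximum(0, conv_out)                          # ReLU layer acts as the activation
```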