YujiaBao/ls

No activation function in MLP?

Wenlin-Chen opened this issue · 1 comment

Hi, thanks for making the code and data available for this interesting paper!

I have one question about your implementation of the MLP. The paper says that ReLU activation functions are applied to the hidden units of the MLP, but there seem to be no activation functions in your implementation?

Yes, the ReLU activation was missing. Thank you for identifying this bug! It has been fixed now :)
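For reference, a minimal PyTorch sketch of an MLP with ReLU applied to the hidden units, i.e., the shape the fix would take. The class and argument names here are illustrative, not the repository's actual code:

```python
import torch.nn as nn

class MLP(nn.Module):
    """Simple MLP with ReLU activations on the hidden units."""

    def __init__(self, in_dim, hidden_dims, out_dim):
        super().__init__()
        layers = []
        dims = [in_dim] + list(hidden_dims)
        for d_in, d_out in zip(dims[:-1], dims[1:]):
            layers.append(nn.Linear(d_in, d_out))
            layers.append(nn.ReLU())  # activation on each hidden layer (the missing piece)
        layers.append(nn.Linear(dims[-1], out_dim))  # no activation on the output layer
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)
```

Without the `nn.ReLU()` between the `nn.Linear` layers, the stack collapses into a single affine map, so the model loses its nonlinear capacity regardless of depth.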