Compatibility with normal multilayer perceptron
masakinakada opened this issue · 5 comments
masakinakada commented
Hi @pplonski, thank you so much for this amazing Keras converter.
I have an MLP trained with Keras and want to use it in C++.
Does this work with an MLP that uses ReLU and linear as activation functions?
I noticed you mentioned that you focused on CNNs.
Thank you very much in advance.
Masaki
pplonski commented
Hi! Right now, only the softmax and ReLU activations are implemented. Adding a linear activation should be easy. Would you add it? It goes in the function keras::LayerActivation::compute_output
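For reference, a linear activation is just the identity, so the change amounts to one more branch in the activation dispatch. Here is a minimal standalone sketch of what that dispatch could look like; the function signature, the `std::string` activation tag, and the use of `std::vector<float>` are illustrative assumptions, not the project's actual types.

```cpp
#include <cmath>
#include <string>
#include <vector>

// Hypothetical sketch of an activation dispatch similar in spirit to
// keras::LayerActivation::compute_output. All names besides the
// activation strings ("relu", "softmax", "linear") are assumptions.
std::vector<float> compute_output(const std::string& activation,
                                  std::vector<float> input) {
    if (activation == "relu") {
        // ReLU: max(0, x) applied elementwise
        for (float& v : input)
            if (v < 0.0f) v = 0.0f;
    } else if (activation == "softmax") {
        // softmax: exponentiate, then normalize to sum to 1
        float sum = 0.0f;
        for (float& v : input) { v = std::exp(v); sum += v; }
        for (float& v : input) v /= sum;
    } else if (activation == "linear") {
        // linear activation is the identity: output = input,
        // so there is nothing to compute
    }
    return input;
}
```

In the real codebase the linear case would likewise be a no-op branch (or simply returning the input unchanged), which is why the addition is so small.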
masakinakada commented
Yes, I added it! As you said, the linear activation was easy.
I just wanted to make sure the other parts are fine too!
Thank you @pplonski!
pplonski commented
Great! Would you like to pull your changes?
masakinakada commented
You mean push my changes?
pplonski commented
Sure, please make a pull request.