pplonski/keras2cpp

Model with functional API

Opened this issue · 2 comments

Hi, this is really great work!
I just want to share something that might be useful to others.
I found that in order to dump a model correctly, we need to build it as a Sequential model and add each Activation layer separately.
For example, the second method below gets dumped correctly, while the first method produces a dumped model with no Activation layers.

from keras.models import Sequential, Model
from keras.layers import Input, Dense, Dropout, Activation

def get_model_by_sequential():
    model = Sequential()
    model.add(Dense(64, input_dim=15, init='uniform', activation='relu'))
    model.add(Dense(128, init='uniform', activation='relu'))
    model.add(Dense(256, init='uniform', activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    return model

def get_model_by_sequential_with_separate_activation():
    model = Sequential()
    model.add(Dense(64, input_dim=15, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(128, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(256, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    return model

def get_model_by_functional_API():
    a = Input(shape=(15,))
    # input_dim is redundant here; the shape comes from the Input tensor
    b = Dense(64, init='uniform', activation='relu')(a)
    b = Dense(128, init='uniform', activation='relu')(b)
    b = Dense(256, init='uniform', activation='relu')(b)
    b = Dense(1, activation='sigmoid')(b)
    model = Model(input=a, output=b)
    return model
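The two Sequential variants above define the same network; they differ only in whether each activation is stored inline in the Dense config or as its own Activation layer. A minimal sketch of the rewrite the dumper would need to handle both forms, written as pure Python over Keras-style layer-config dicts (`split_inline_activations` is a hypothetical helper, not part of keras2cpp):

```python
def split_inline_activations(layer_configs):
    """Rewrite a list of Keras-style layer configs so every inline,
    non-linear activation becomes a separate Activation layer --
    the form that dumps correctly."""
    result = []
    for layer in layer_configs:
        config = dict(layer.get("config", {}))
        activation = config.get("activation")
        if layer["class_name"] != "Activation" and activation not in (None, "linear"):
            # Strip the inline activation from the layer itself...
            config["activation"] = "linear"
            result.append({"class_name": layer["class_name"], "config": config})
            # ...and append it as a standalone Activation layer.
            result.append({"class_name": "Activation",
                           "config": {"activation": activation}})
        else:
            result.append({"class_name": layer["class_name"], "config": config})
    return result

layers = [
    {"class_name": "Dense", "config": {"output_dim": 64, "activation": "relu"}},
    {"class_name": "Dense", "config": {"output_dim": 1, "activation": "sigmoid"}},
]
split = split_inline_activations(layers)
# Each Dense is now followed by its own Activation layer, matching
# get_model_by_sequential_with_separate_activation().
```

With this normalization applied to the layer list before dumping, both model-building styles would produce the same dumped structure.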

Thanks for that information! I think it should be easy to handle both situations. Would you like to prepare changes for it?

Hi, I will work on that. I have also added a 'sigmoid' Activation layer for binary classification.