keras-team/keras-tuner

How to use keras_tuner to improve a 1D-CNN model built without Sequential() or model.add()

Opened this issue · 2 comments

Describe the bug

This is my code:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import backend as K
from tensorflow.keras.layers import (Input, Reshape, Conv1D, MaxPooling1D,
                                     Dropout, Flatten, Dense)
from tensorflow.keras.models import Model

fea_cnt = 24  # number of input features
numb = 3      # number of classes

def build_model(fea_cnt, numb):
    K.clear_session()
    METRICS = [
        keras.metrics.TruePositives(name='tp'),
        keras.metrics.FalsePositives(name='fp'),
        keras.metrics.TrueNegatives(name='tn'),
        keras.metrics.FalseNegatives(name='fn'),
        keras.metrics.BinaryAccuracy(name='accuracy'),
        keras.metrics.Precision(name='precision'),
        keras.metrics.Recall(name='recall'),
        keras.metrics.AUC(name='auc'),
        keras.metrics.AUC(name='prc', curve='PR'),  # precision-recall curve
    ]

    # Functional API: reshape the flat feature vector into a 1-D sequence for Conv1D
    inputs2 = Input(shape=(fea_cnt,), dtype='float32')
    embds2 = Reshape((fea_cnt, 1))(inputs2)
    embds2 = Conv1D(64, 7, strides=2, padding='same', activation='relu')(embds2)
    embds2 = Conv1D(256, 7, strides=2, padding='same', activation='relu')(embds2)
    # embds2 = Normalization()(embds2)  # disabled
    embds2 = MaxPooling1D(pool_size=3, strides=None, padding='same')(embds2)
    embds2 = Dropout(0.2)(embds2)
    embds2 = Flatten()(embds2)
    embds2 = Dense(256, activation='relu')(embds2)
    embds2 = Dropout(0.2)(embds2)

    concat = Dense(512, activation='relu')(embds2)
    outputs = Dense(numb, activation='softmax')(concat)

    model = Model(inputs=[inputs2], outputs=outputs)
    optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)
    model.compile(loss='categorical_crossentropy', optimizer=optimizer, metrics=METRICS)
    model.summary()
    return model

To Reproduce

Expected behavior

Additional context

Would you like to help us fix it?

This is my first time using this site, so I'm sorry for the problems with the formatting. My question: my 1D-CNN code is somewhat unusual in that it does not use Sequential() or model.add(); I have included the code above in case anyone is interested. However, when I try to use keras_tuner to improve my model, I cannot figure out how to tell it which parameters I want to optimize. All the examples I found use Sequential() or model.add(), so they don't serve as a direct reference for me. Thank you very much for your help.
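For example, would the right approach be to give the build function an hp argument and replace the hard-coded values with hp.Int / hp.Choice / hp.Float calls, keeping the functional API otherwise unchanged? A rough, untested sketch of what I have in mind (the hyperparameter names, search ranges, and the x_train / y_train / x_val / y_val variables are only placeholders):

    import tensorflow as tf
    from tensorflow.keras.layers import (Input, Reshape, Conv1D, MaxPooling1D,
                                         Dropout, Flatten, Dense)
    from tensorflow.keras.models import Model
    import keras_tuner as kt

    fea_cnt = 24
    numb = 3

    def build_model(hp):
        inputs2 = Input(shape=(fea_cnt,), dtype='float32')
        embds2 = Reshape((fea_cnt, 1))(inputs2)
        # Tune the number of filters in each Conv1D layer
        embds2 = Conv1D(hp.Int('filters_1', 32, 128, step=32), 7,
                        strides=2, padding='same', activation='relu')(embds2)
        embds2 = Conv1D(hp.Int('filters_2', 128, 512, step=128), 7,
                        strides=2, padding='same', activation='relu')(embds2)
        embds2 = MaxPooling1D(pool_size=3, padding='same')(embds2)
        # Tune the dropout rate (same value reused for both Dropout layers)
        drop = hp.Float('dropout', 0.1, 0.5, step=0.1)
        embds2 = Dropout(drop)(embds2)
        embds2 = Flatten()(embds2)
        # Tune the width of the dense layers
        embds2 = Dense(hp.Int('units_dense_1', 128, 512, step=128),
                       activation='relu')(embds2)
        embds2 = Dropout(drop)(embds2)
        concat = Dense(hp.Int('units_dense_2', 256, 1024, step=256),
                       activation='relu')(embds2)
        outputs = Dense(numb, activation='softmax')(concat)

        model = Model(inputs=[inputs2], outputs=outputs)
        # Tune the learning rate on a log scale
        lr = hp.Float('learning_rate', 1e-4, 1e-2, sampling='log')
        model.compile(loss='categorical_crossentropy',
                      optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                      metrics=['accuracy'])
        return model

    tuner = kt.RandomSearch(
        build_model,
        objective='val_loss',
        max_trials=10,
        directory='tuner_dir',
        project_name='cnn_1d',
    )
    tuner.search(x_train, y_train, epochs=20, validation_data=(x_val, y_val))
    best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
    best_model = tuner.get_best_models(num_models=1)[0]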

You can define it in a custom keras.Model class, right?
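If the class-based route is preferred, keras_tuner also has a HyperModel base class you can subclass; an untested sketch, which simply delegates to the build_model(hp) function from the previous sketch (so that needs to be defined first):

    import keras_tuner as kt

    class CNN1DHyperModel(kt.HyperModel):
        def build(self, hp):
            # Reuse the hp-aware functional builder sketched above
            return build_model(hp)

    tuner = kt.RandomSearch(
        CNN1DHyperModel(),
        objective='val_loss',
        max_trials=10,
        directory='tuner_dir',
        project_name='cnn_1d_hypermodel',
    )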
Which parameters do you want to tune? When the kernel size and number of filters are changed, the next layer must use compatible kernel and filter settings, otherwise the tuning will fail (see the sketch below for one way to keep them consistent).
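For instance, an untested sketch that shares a single kernel-size hyperparameter between both Conv1D layers and ties the second layer's filter count to the first (mirroring the 64 to 256 ratio in the original code); padding='same' keeps the sequence length independent of the chosen kernel size. It would be called inside build_model(hp) as embds2 = conv_block(hp, embds2):

    from tensorflow.keras.layers import Conv1D

    def conv_block(hp, x):
        # One shared kernel size and one base filter count for both layers
        kernel_size = hp.Choice('kernel_size', [3, 5, 7])
        base_filters = hp.Int('base_filters', 32, 128, step=32)
        x = Conv1D(base_filters, kernel_size, strides=2,
                   padding='same', activation='relu')(x)
        x = Conv1D(base_filters * 4, kernel_size, strides=2,
                   padding='same', activation='relu')(x)
        return x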