SC-Conv Implementation on Keras
omerkolcak opened this issue · 0 comments
omerkolcak commented
Hello to you all, and thank you for the implementation. I don't have much experience in the deep learning area. I tried to implement the self-calibrated conv layer in Keras 2, but I am not sure whether it is correct. Also, I added a ReLU activation after the concatenation; I wonder if that is good practice, or whether I should add a ReLU after every Conv2D layer instead. Any feedback is appreciated, thanks.
```python
import keras.layers as KL
import keras.models as KM

def build_selfcalibrated_conv_graph(depth, layer_name, split_ratio=0.5, filters=64):
    input_layer = KL.Input(shape=[None, None, depth])
    # Split the input channels into the two branches.
    split_idx = int(depth * split_ratio)
    x1 = KL.Lambda(lambda x: x[:, :, :, :split_idx])(input_layer)
    x2 = KL.Lambda(lambda x: x[:, :, :, split_idx:])(input_layer)

    # x1 path (self-calibration branch)
    identity = x1
    output_x1_a = KL.Conv2D(filters, (3, 3), padding="same", name=f"{layer_name}_k3_conv")(x1)
    output_x1_a = KL.BatchNormalization(name=f"{layer_name}_k3_bn")(output_x1_a)
    # Calibration: downsample, convolve, then upsample back to the input resolution.
    # Note: the add() below requires split_idx == filters, and the spatial size
    # must be divisible by 4 so that pooling followed by upsampling restores it.
    output_x1_b = KL.AveragePooling2D(pool_size=(4, 4), strides=4, name=f"{layer_name}_k2_avg")(x1)
    output_x1_b = KL.Conv2D(filters, (3, 3), padding="same", name=f"{layer_name}_k2_conv")(output_x1_b)
    output_x1_b = KL.BatchNormalization(name=f"{layer_name}_k2_bn")(output_x1_b)
    output_x1_b = KL.UpSampling2D(size=(4, 4))(output_x1_b)
    output_x1_b = KL.Activation('sigmoid')(KL.add([output_x1_b, identity]))
    output_x1_ab = KL.Multiply()([output_x1_a, output_x1_b])
    output_x1_ab = KL.Conv2D(filters, (3, 3), padding="same", name=f"{layer_name}_k4_conv")(output_x1_ab)
    y1 = KL.BatchNormalization(name=f"{layer_name}_k4_bn")(output_x1_ab)

    # x2 path (plain convolution branch)
    y2 = KL.Conv2D(filters, (3, 3), padding="same", name=f"{layer_name}_k1_conv")(x2)
    y2 = KL.BatchNormalization(name=f"{layer_name}_k1_bn")(y2)

    # Concatenate y1 and y2, then apply the final activation.
    output = KL.Concatenate(axis=-1)([y1, y2])
    output = KL.Activation('relu')(output)
    return KM.Model(inputs=input_layer, outputs=output, name=layer_name)
```
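For what it's worth, here is the quick shape check I used to convince myself the graph builds. The numbers are just an assumption for the test: a 128-channel input split 50/50 so that each 64-channel half matches `filters=64` (otherwise the `add()` in the x1 path fails), and a spatial size divisible by 4 because of the pooling/upsampling pair:

```python
import numpy as np

# Hypothetical smoke test: depth=128 split 50/50 gives two 64-channel branches,
# matching filters=64 so the add() inside the x1 path is shape-compatible.
sc_conv = build_selfcalibrated_conv_graph(depth=128, layer_name="scconv1",
                                          split_ratio=0.5, filters=64)
dummy = np.random.rand(1, 64, 64, 128).astype("float32")
out = sc_conv.predict(dummy)
print(out.shape)  # (1, 64, 64, 128): filters + filters channels after concat
```

With these settings the output has `2 * filters` channels, so the block is channel-preserving only when `depth == 2 * filters`; I am not sure if that matches the reference design, which is part of what I would like feedback on.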