jacobgil/keras-grad-cam

question: why replace keras.activations.relu with tf.nn.relu

lujingqiao opened this issue · 0 comments

First, thanks for sharing the code, I learned a lot.
But I have a question: can you explain why keras.activations.relu is replaced with tf.nn.relu?
Thank you!

```python
import tensorflow as tf
import keras
from keras.applications.vgg16 import VGG16

def modify_backprop(model, name):
    g = tf.get_default_graph()
    with g.gradient_override_map({'Relu': name}):

        # get layers that have an activation
        layer_dict = [layer for layer in model.layers[1:]
                      if hasattr(layer, 'activation')]

        # replace the Keras relu activation with the raw TF op
        for layer in layer_dict:
            if layer.activation == keras.activations.relu:
                layer.activation = tf.nn.relu

        # re-instantiate a new model inside the override scope
        new_model = VGG16(weights='imagenet')
    return new_model
```
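
For context, here is a minimal sketch of how `modify_backprop` is typically invoked for guided backpropagation. The `register_gradient` helper and the `'GuidedBackProp'` gradient name follow the pattern used in this repo as I understand it; treat them as assumptions rather than a definitive reference.

```python
import tensorflow as tf
from tensorflow.python.framework import ops
from keras.applications.vgg16 import VGG16

def register_gradient():
    # Register a 'GuidedBackProp' gradient for ReLU ops: pass a gradient
    # through only where both the incoming gradient and the forward
    # activation input are positive.
    if "GuidedBackProp" not in ops._gradient_registry._registry:
        @ops.RegisterGradient("GuidedBackProp")
        def _GuidedBackProp(op, grad):
            dtype = op.inputs[0].dtype
            return grad * tf.cast(grad > 0., dtype) * \
                tf.cast(op.inputs[0] > 0., dtype)

# Usage sketch: register the custom gradient, then rebuild the model
# with ReLU gradients overridden by 'GuidedBackProp'.
model = VGG16(weights='imagenet')
register_gradient()
guided_model = modify_backprop(model, 'GuidedBackProp')
```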