philipperemy/keras-tcn

Activation: 'LeakyReLU'

Closed this issue · 1 comment

Is there a separate list of the activation functions that are available? I searched the model definition, but only [linear, sigmoid, and relu] are used, so I wonder whether only these three are supported.

I want to use LeakyReLU; is that not supported?

Best regards,
Jinwoo Jeon

@zinuok sorry for the late reply.

Yes, (almost) all of the activations listed here are supported:

[Screenshot: list of activation layers from https://keras.io/api/layers/]
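For reference, leaky ReLU itself is a one-line function: it passes positive inputs through unchanged and scales negative inputs by a small slope instead of zeroing them (which is what plain ReLU does). A minimal sketch in plain Python; the default slope of 0.3 matches Keras's `LeakyReLU` layer default, while `tf.nn.leaky_relu` uses 0.2:

```python
def leaky_relu(x: float, alpha: float = 0.3) -> float:
    """Leaky ReLU: identity for x > 0, small linear slope for x <= 0.

    alpha=0.3 mirrors the Keras LeakyReLU layer default;
    tf.nn.leaky_relu defaults to alpha=0.2.
    """
    return x if x > 0 else alpha * x


# Positive inputs pass through; negative inputs are scaled, not zeroed.
print(leaky_relu(2.0))   # 2.0
print(leaky_relu(-1.0))  # -0.3
```

The non-zero slope on the negative side keeps gradients flowing for inactive units, which is the usual reason to prefer it over plain ReLU.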

Example

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Dropout, Embedding

from tcn import TCN

model = Sequential()
# Map integer tokens (vocab size 2000) to 128-dim vectors; sequence length 10.
model.add(Embedding(2000, 128, input_shape=(10,)))
model.add(TCN(
    kernel_size=6,
    activation='LeakyReLU',          # <---------------- leaky relu, elu...
    dilations=[1, 2, 4, 8, 16, 32, 64]
))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))  # binary classification head

model.summary()