[3.3.1] Normalization broken
Luux opened this issue · 1 comment
Luux commented
Describe the bug
Setting use_batch_norm or use_weight_norm to True results in "ValueError: Only one normalization can be specified at once."
use_layer_norm=True works as expected. This occurs in keras-tcn==3.3.1; everything works as expected in keras-tcn==3.3.0.
Paste a snippet
import tensorflow as tf
from tcn import TCN

# define input shape
max_len = 100
max_features = 50

# make model
model = tf.keras.models.Sequential(
    layers=[
        tf.keras.layers.Embedding(max_features, 16, input_shape=(max_len,)),
        TCN(
            nb_filters=12,
            dropout_rate=0.5,
            kernel_size=6,
            dilations=[1, 2, 4],
            use_weight_norm=True,
        ),
        tf.keras.layers.Dense(units=1, activation="sigmoid"),
    ]
)
results in
Traceback (most recent call last):
  File "build_simple_tcn.py", line 13, in <module>
    TCN(
  File "[...]/miniconda3/envs/3.8/lib/python3.8/site-packages/tcn/tcn.py", line 235, in __init__
    raise ValueError('Only one normalization can be specified at once.')
ValueError: Only one normalization can be specified at once.
Dependencies
tensorflow==2.3.0
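For reference, the check at tcn/tcn.py line 235 appears to be a mutual-exclusion guard over the three normalization flags. A minimal sketch of what it presumably amounts to (assumed shape, not the verbatim keras-tcn source):

# Assumed sketch of the guard in TCN.__init__: at most one of the
# three normalization flags may be True at a time.
if int(use_batch_norm) + int(use_layer_norm) + int(use_weight_norm) > 1:
    raise ValueError('Only one normalization can be specified at once.')

So the error indicates that more than one flag ends up True when TCN is constructed, even though the snippet above only sets use_weight_norm.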
Luux commented
Oh. This is not really a bug. The exception works as expected. However, in 3.3.1, use_layer_norm defaulted to True for some reason, while both older versions and the current master have no normalization activated by default.
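Given that, a workaround for keras-tcn==3.3.1 is to pass use_layer_norm=False explicitly alongside the desired normalization. The sketch below mirrors the TCN call from the report with that one keyword argument added:

# Workaround for keras-tcn==3.3.1: explicitly override the
# use_layer_norm=True default so only one normalization is active.
TCN(
    nb_filters=12,
    dropout_rate=0.5,
    kernel_size=6,
    dilations=[1, 2, 4],
    use_weight_norm=True,
    use_layer_norm=False,
)

This should restore the 3.3.0 behavior until the default is corrected.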