philipperemy/keras-tcn

Support for Bidirectional Layer

daniel-v-e opened this issue · 3 comments

Could this TCN implementation hypothetically be modified so that it can be wrapped by tf.keras.layers.Bidirectional?

Currently this is not possible, as a layer needs a go_backwards attribute in order to be wrapped by the Bidirectional layer; see https://www.tensorflow.org/api_docs/python/tf/keras/layers/Bidirectional
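For concreteness, the desired usage would look roughly like this (a sketch; the layer arguments are only illustrative):

```python
import tensorflow as tf
from tcn import TCN  # keras-tcn

inputs = tf.keras.Input(shape=(100, 16))  # (timesteps, features)

# This works: LSTM exposes go_backwards, so Bidirectional can build a
# time-reversed copy of it for the backward pass.
lstm_branch = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(32, return_sequences=True))(inputs)

# This is the request: today it fails because TCN exposes no go_backwards
# attribute for Bidirectional to flip on the backward copy.
tcn_branch = tf.keras.layers.Bidirectional(
    TCN(nb_filters=32, return_sequences=True))(inputs)
```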

@daniel-v-e I am not sure how hard it is to implement. I've forgotten a bit how Bidirectional works. Is it concatenating the outputs at each step from two RNNs running forward and backward? Does it involve "merging" the states (cells) when doing that (I don't think so, but just asking)? Because TCNs don't have states, unlike GRU or LSTM.

If it's just concatenating outputs at each step, with each TCN independent, then it's definitely doable and would not be hard. I guess we can just do the following (see the sketch after the list):

  • We need to add go_backwards in the constructor.
  • If go_backwards=True, we need to flip the time dimension before the TCN layers are called.
  • return_state could just be an empty list?
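A minimal sketch of that plan, assuming a hypothetical wrapper around the existing TCN layer (this is not the actual keras-tcn code; the class name and structure are made up for illustration):

```python
import tensorflow as tf
from tcn import TCN  # keras-tcn

class TCNWithDirection(tf.keras.layers.Layer):
    """Hypothetical sketch: exposes the attributes Bidirectional looks for."""

    def __init__(self, go_backwards=False, return_sequences=True, **tcn_kwargs):
        super().__init__()
        self.go_backwards = go_backwards          # attribute Bidirectional expects
        self.return_sequences = return_sequences  # Bidirectional reads this too
        self.return_state = False                 # TCNs have no state to return
        self._tcn_kwargs = tcn_kwargs
        self.tcn = TCN(return_sequences=return_sequences, **tcn_kwargs)

    def call(self, inputs):
        x = inputs
        if self.go_backwards:
            # Flip the time dimension before the causal convolutions run.
            # Bidirectional itself re-reverses the backward copy's output when
            # return_sequences=True, so we do not flip the output back here.
            x = tf.reverse(x, axis=[1])
        return self.tcn(x)

    def get_config(self):
        # Bidirectional builds the backward copy from this config with
        # go_backwards flipped, so it has to be serialized here.
        return dict(go_backwards=self.go_backwards,
                    return_sequences=self.return_sequences,
                    **self._tcn_kwargs)
```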

@daniel-v-e Bidirectional support has been implemented.

You can check it here: a412190.

Ref: https://keras.io/examples/nlp/bidirectional_lstm_imdb/

Pushed in 3.5.0.
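For reference, with 3.5.0 the wrapping should look roughly like this (layer arguments are only illustrative):

```python
import tensorflow as tf
from tcn import TCN  # keras-tcn >= 3.5.0

inputs = tf.keras.Input(shape=(100, 16))  # (timesteps, features)
# Concatenation of the forward and backward TCN outputs.
x = tf.keras.layers.Bidirectional(TCN(nb_filters=32))(inputs)
outputs = tf.keras.layers.Dense(1, activation='sigmoid')(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()
```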