locuslab/TCN

Training on variable-length sequences

abedidev opened this issue · 1 comment

Is there any way to handle variable-length sequences and train TCN models on them? (The LSTM in PyTorch and the Keras implementation of TCN can both handle variable-length sequences.)
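For context, the variable-length handling referred to for PyTorch LSTMs is the usual pad-and-pack pattern. A minimal sketch (the sequence lengths and layer sizes below are made up for illustration):

```python
import torch
from torch import nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence

# Three 1-feature sequences of different lengths (illustrative values).
seqs = [torch.randn(L, 1) for L in (5, 3, 7)]
lengths = torch.tensor([s.size(0) for s in seqs])

# Pad to a common length, then pack so the LSTM skips the padded steps.
padded = pad_sequence(seqs, batch_first=True)   # (batch, max_len, 1)
packed = pack_padded_sequence(padded, lengths, batch_first=True,
                              enforce_sorted=False)

lstm = nn.LSTM(input_size=1, hidden_size=8, batch_first=True)
_, (h_n, _) = lstm(packed)                      # h_n: (1, batch, 8)
```

A TCN, being purely convolutional, has no built-in equivalent of packing, which is what motivates the question.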

Outside of TCN itself, you can try Dynamic Time Warping (DTW) to align the sequences to a common length before training.
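A minimal sketch of that idea, assuming 1-D NumPy sequences: classic dynamic-programming DTW produces a warping path between a query and a reference, and the path can then be used to resample the query onto the reference's time axis so both end up the same length. Averaging the query values aligned to each reference step is just one of several reasonable choices for that last step.

```python
import numpy as np

def dtw_path(query, reference):
    """Classic O(n*m) DTW between two 1-D sequences.
    Returns the warping path as (query_index, reference_index) pairs."""
    n, m = len(query), len(reference)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(query[i - 1] - reference[j - 1])
            cost[i, j] = d + min(cost[i - 1, j - 1],  # match
                                 cost[i - 1, j],      # insertion
                                 cost[i, j - 1])      # deletion
    # Backtrack from (n, m) to recover the alignment path.
    i, j, path = n, m, []
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def warp_to_reference(query, reference):
    """Warp `query` onto `reference`'s time axis so both have the same
    length, averaging the query values aligned to each reference step."""
    warped = np.zeros(len(reference))
    counts = np.zeros(len(reference))
    for qi, ri in dtw_path(query, reference):
        warped[ri] += query[qi]
        counts[ri] += 1
    return warped / counts

# Example: an 80-step sequence is warped to match a 50-step reference.
ref = np.sin(np.linspace(0, 2 * np.pi, 50))
q = np.sin(np.linspace(0, 2 * np.pi, 80)) + 0.05 * np.random.randn(80)
q_warped = warp_to_reference(q, ref)
print(q.shape, "->", q_warped.shape)  # (80,) -> (50,)
```

Once all sequences share the reference length, they can be stacked into ordinary fixed-size batches for the TCN. The alternative commonly used with convolutional models, padding plus masking the loss on padded steps, is not what this comment suggests but is worth knowing about.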