philipperemy/keras-tcn

Fully convolutional layer

HouKun-github opened this issue · 2 comments

Why is the one-dimensional fully convolutional layer gone in this version? The previous version had a one-dimensional fully convolutional layer, and the original paper used full convolutions so that the output keeps the full sequence length and no time steps are lost. Can you explain?
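
For context, here is a minimal sketch of the paper-style fully convolutional head the question refers to, written with plain tf.keras layers rather than this repo's actual code; the layer sizes, dilations, and shapes are illustrative assumptions.

```python
import tensorflow as tf

timesteps, input_dim, num_classes = 100, 8, 10  # arbitrary illustrative shapes

inputs = tf.keras.Input(shape=(timesteps, input_dim))
# Stand-in for the dilated causal convolution stack of a TCN.
x = tf.keras.layers.Conv1D(64, kernel_size=3, padding="causal",
                           dilation_rate=1, activation="relu")(inputs)
x = tf.keras.layers.Conv1D(64, kernel_size=3, padding="causal",
                           dilation_rate=2, activation="relu")(x)
# The 1-D fully convolutional head from the paper: a kernel-size-1
# Conv1D maps every time step to a prediction, so the output keeps
# the full sequence length: (batch, timesteps, num_classes).
outputs = tf.keras.layers.Conv1D(num_classes, kernel_size=1,
                                 activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.summary()
```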

@HouKun-github hey, could it be linked to #133?

Yes, you are right. The 1-D fully convolutional layer is gone, and I think it is because we found that removing it boosted performance (or at least did not hurt it; cf. Occam's razor). I agree that we no longer match the original paper 100% because of this change.
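
One reason removing the explicit head need not cost expressiveness: a kernel-size-1 Conv1D computes the same per-time-step linear map as a Keras Dense layer applied to a 3-D tensor. A small self-contained check of that equivalence (the shapes here are arbitrary assumptions, not the repo's defaults):

```python
import numpy as np
import tensorflow as tf

x = tf.random.normal((2, 5, 4))  # (batch, timesteps, features)

dense = tf.keras.layers.Dense(3, use_bias=False)
conv = tf.keras.layers.Conv1D(3, kernel_size=1, use_bias=False)

y_dense = dense(x)  # Dense acts independently on every time step

# Reuse the Dense kernel (shape (4, 3)) as the Conv1D kernel
# (shape (1, 4, 3)) so both layers compute the same linear map.
conv.build(x.shape)
conv.set_weights([dense.kernel.numpy()[np.newaxis, ...]])
y_conv = conv(x)

print(np.allclose(y_dense.numpy(), y_conv.numpy()))  # True
```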