eonu/sequentia

Allow torch.Tensor inputs to DeepGRU

eonu opened this issue · 1 comments

eonu commented

The _EncoderNetwork class should be modified to allow a B x T x D tensor input (when all sequence lengths are the same).

This will involve modifying forward(x, x_lengths) to make x_lengths optional.

  • If x_lengths is set to None, then x should be a B x T x D tensor (all sequences the same length).
  • If x_lengths is set, then x should be a padded B x T x D tensor, with x_lengths giving each sequence's true length.

This will also likely require adding conditionals for the following lines:

x_packed = pack_padded_sequence(x, x_lengths.cpu(), batch_first=True)

h_padded = pad_packed_sequence(h_packed, batch_first=True, padding_value=0.0, total_length=max(x_lengths))
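A minimal sketch of how the conditional might look (the class and parameter names here are illustrative, not sequentia's actual _EncoderNetwork API — it only shows the pack/pad branching around a GRU):

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

class Encoder(nn.Module):
    """Illustrative encoder with an optional x_lengths argument."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.gru = nn.GRU(input_size, hidden_size, batch_first=True)

    def forward(self, x: torch.Tensor, x_lengths: torch.Tensor = None):
        if x_lengths is None:
            # x is a plain B x T x D tensor: every sequence has length T,
            # so there is no padding to skip and packing is unnecessary.
            h_padded, _ = self.gru(x)
        else:
            # x is padded to the longest sequence: pack it so the GRU
            # ignores the padded timesteps, then re-pad the output.
            x_packed = pack_padded_sequence(x, x_lengths.cpu(), batch_first=True)
            h_packed, _ = self.gru(x_packed)
            h_padded, _ = pad_packed_sequence(
                h_packed, batch_first=True, padding_value=0.0,
                total_length=int(max(x_lengths)),
            )
        return h_padded
```

Note that pack_padded_sequence expects lengths sorted in descending order by default (or enforce_sorted=False), which the caller would need to guarantee either way.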

eonu commented

Closing, as the classifiers.rnn module was removed in #215.