jpuigcerver/pytorch-baidu-ctc

documentation


Thanks for the bindings!

I just wanted to point out that you could add the following to the documentation:

  • y, xs, and ys need to be on the CPU
  • x must contain the raw activations (logits), i.e. with no log-softmax applied
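To make the second bullet concrete, here is a minimal pure-Python sketch of the log-softmax transform (the function name and example values are just illustrative): this is what the binding computes internally from the raw activations in x, and what torch.nn.CTCLoss expects you to have applied yourself before calling it.

```python
import math

def log_softmax(logits):
    """Numerically stable log-softmax over one frame's class scores."""
    m = max(logits)
    lse = m + math.log(sum(math.exp(v - m) for v in logits))
    return [v - lse for v in logits]

# One time-frame of raw network activations (the "x" in the bullet above):
frame = [2.0, 0.5, -1.0]

# What torch.nn.CTCLoss would expect you to pass instead:
log_probs = log_softmax(frame)

# Log-probabilities exponentiate to a proper distribution (sums to 1):
assert abs(sum(math.exp(p) for p in log_probs) - 1.0) < 1e-9
```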

Can I use this with the same input as torch.nn.CTCLoss?

Hi @WenmuZhou,

Not exactly. You need to make sure that you place the tensors on the appropriate devices.

Take a look at this piece of code, where I use both implementations of the CTC loss:
https://github.com/jpuigcerver/PyLaia/blob/41d2cc41d742e7ab336393fde8f56585ff49ee52/laia/losses/ctc_loss.py#L350
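For comparison, here is a minimal self-contained sketch of the built-in loss (the shapes, label values, and blank index are just example choices, not anything prescribed by this repo): torch.nn.CTCLoss wants log-probabilities and integer length tensors, whereas per the bullets above this binding takes the raw activations and needs y, xs, and ys on the CPU.

```python
import torch
import torch.nn.functional as F

T, N, C = 6, 2, 5             # time steps, batch size, classes (0 = blank)
x = torch.randn(T, N, C)      # raw network activations

# torch.nn.CTCLoss expects log-probabilities, so normalize explicitly;
# the baidu-ctc binding would instead take x as-is.
log_probs = F.log_softmax(x, dim=2)

y = torch.randint(1, C, (N, 3), dtype=torch.int)   # target label sequences
xs = torch.full((N,), T, dtype=torch.int)          # input lengths
ys = torch.full((N,), 3, dtype=torch.int)          # target lengths

loss = torch.nn.CTCLoss(blank=0)(log_probs, y, xs, ys)
```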

Thanks, @jpuigcerver!

Hi @jpuigcerver, one question: does the input pred need to be the log_softmax output?