Calculation of "beam_scores" and gradient
Ivan-Fan opened this issue · 2 comments
Hi! It's really great work! I am now learning and working on ASR-related tasks and really appreciate your effort!
However, I still have some questions about the outputs of the functions. I noticed that "beam_scores" is a tensor but has no grad_fn. Is there any way to preserve the computation graph, so that I could use "beam_scores" directly to define a new loss for my task?
Actually, I found that TensorFlow has an implementation called tf.nn.ctc_beam_search_decoder, which returns beam scores with the gradient preserved. So I was wondering if there is any similar implementation in PyTorch?
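As a possible workaround (a minimal sketch, not this library's API): since the beam search itself is not differentiable here, one option is to use it only to pick a hypothesis and then re-score that hypothesis with torch.nn.functional.ctc_loss, which does build a computation graph. The tensor shapes, the example hypothesis, and the blank index below are all assumptions for illustration.

import torch
import torch.nn.functional as F

# log-softmax output of the acoustic model, shape (T, N, C); assumed values
T, N, C = 50, 1, 29
logits = torch.randn(T, N, C, requires_grad=True)
log_probs = F.log_softmax(logits, dim=-1)

# Hypothetical label sequence returned by the (non-differentiable) beam search
# for the single utterance in the batch, shape (N, S)
hypothesis = torch.tensor([[5, 12, 7, 3]], dtype=torch.long)

input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.tensor([hypothesis.size(1)], dtype=torch.long)

# Negative log-likelihood of the hypothesis under the CTC model.
# Unlike the approximate beam score, this is exact for the chosen hypothesis
# and differentiable w.r.t. the network output, so it can be used in a loss.
score = -F.ctc_loss(log_probs, hypothesis, input_lengths, target_lengths,
                    blank=0, reduction="sum")
score.backward()  # gradients flow back into `logits`

This only gives a differentiable score for the decoded hypothesis, not for the full beam, but that is often enough to define a sequence-level training objective.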
Also, could you explain this comment a little bit:
// compute aproximate ctc score as the return score, without affecting the
// return order of decoding result. To delete when decoder gets stable.
Have you solved this problem? I have the same question as you.
@codeking233 No, I have not :( Have you found any solutions?