soobinseo/Transformer-TTS

ValueError: optimizing a parameter that doesn't require gradients


Can I see which part of the code causes this error?

Using torch==0.4.1 seems to resolve this. Not sure yet if this would affect anything else later on.
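If you need to stay on an older torch version that still performs this check, the usual workaround is to pass only the trainable parameters to the optimizer instead of `model.parameters()`. A minimal sketch (the two-layer model here is just a stand-in, not the actual Transformer-TTS network):

```python
import torch
import torch.nn as nn

# Hypothetical model standing in for the Transformer-TTS network.
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# Freeze the first layer, e.g. a pretrained component.
for p in model[0].parameters():
    p.requires_grad = False

# On older torch versions, passing model.parameters() directly would raise
# "ValueError: optimizing a parameter that doesn't require gradients".
# Filtering out frozen parameters avoids the check on every version.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-3)
```

This way the frozen parameters are simply never handed to the optimizer, so the version-dependent check never fires.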