S_loss = 0 ... why?
Closed this issue · 2 comments
DrBrule commented
On line 352 of train_finetune_accelerate.py
- s_loss = 0
is hardcoded in. When doing fine-tuning (during the later SLM epochs) I see something like:
Step [90/144], Loss: 0.26600, Disc Loss: 3.64881, Dur Loss: 0.77317, CE Loss: 0.03942, Norm Loss: 0.54697, F0 Loss: 1.49505, LM Loss: 1.04758, Gen Loss: 6.65366, Sty Loss: 0.07343, Diff Loss: 0.29013, DiscLM Loss: 0.05875, GenLM Loss: 0.94493, SLoss: 0.00000, S2S Loss: 0.17429, Mono Loss: 0.04844
So, SLoss is coming back as 0 because it's hardcoded and never updated. Is there a reason it is even being displayed?
qa6300525 commented
Since s_loss is never used, it is just initialized to 0.
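To illustrate the behavior being described, here is a minimal sketch (hypothetical code, not the actual StyleTTS2 training loop): a loss variable that is assigned once and never recomputed will print as 0 in every log line.

```python
# Hypothetical sketch of a vestigial loss term in a training step.
def training_step(batch):
    s_loss = 0  # vestigial: assigned here, never recomputed below
    d_loss = sum(batch) / len(batch)  # stand-in for a real loss computation
    # The log string still includes s_loss, so it always reads 0.00000.
    return f"Disc Loss: {d_loss:.5f}, SLoss: {s_loss:.5f}"

print(training_step([0.2, 0.4]))
# → Disc Loss: 0.30000, SLoss: 0.00000
```

Removing the variable from both the assignment and the log string would silence the confusing `SLoss: 0.00000` entry without changing training behavior.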
DrBrule commented
Alright! So just some vestigial code then. Thanks for the clarification.