about stop token
Closed this issue · 1 comments
Hi, thank you for sharing this awesome project!
While reading your code I found a part I can't understand.
First, does gate_batch in the dataloader mark the active part of the mel?
If so, in dataloader.py, line 136:
"for i, x in enumerate(batch): gate_batch[i, len(x[1])-1:] = 1"
After this code executes, the zeros in each row of gate_batch cover the active part of the mel except for its last frame, which is set to 1.
Is there a purpose to this?
Hi, sorry for the delay!
"gate_batch" contains labels that tell the model when to stop: the model is trained to predict the stopping point by activating a sigmoid unit that outputs "0" for continue and "1" for stop.
In dataloader.py, line 135:
"gate_batch = torch.FloatTensor(len(batch), max_target_len).zero_()",
"gate_batch" is first initialized to zeros; I then fill in "1"s from the last real frame onward, through all successive frames (including padding).
The purpose of "gate_batch" is to provide training labels for the gate unit.
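Putting the two quoted lines together, here is a minimal runnable sketch of the gate-label construction, assuming each batch item is a (text, mel) pair where the mel is a tensor of shape (T, n_mel_channels) so that len(x[1]) gives its length T (the batch contents here are made-up placeholders):

```python
import torch

# Hypothetical batch of three utterances with mel lengths 50, 80, and 65;
# the text entries are irrelevant here and left as None.
batch = [(None, torch.randn(t, 80)) for t in (50, 80, 65)]
max_target_len = max(len(x[1]) for x in batch)

# Zero-initialize ("continue"), then mark "stop" from each utterance's
# last real frame through the padded end of the row.
gate_batch = torch.FloatTensor(len(batch), max_target_len).zero_()
for i, x in enumerate(batch):
    gate_batch[i, len(x[1]) - 1:] = 1
```

At training time these labels would typically be compared against the gate logits with a binary cross-entropy loss; at inference, decoding stops once the sigmoid output crosses a threshold.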