THUwangcy/ReChorus

In the S3Rec MIP task the "mask" token should not be 0

Closed this issue · 1 comment

BAWWH commented

In BERT, "mask" is a special token and is different from the padding token: the padding token receives no attention weight, while the "mask" token does. https://github.com/google-research/bert/blob/eedf5716ce1268e56f0a50264a88cafad334ac61/create_pretraining_data.py#L394
    if prob < self.model.mask_ratio:
        mask_seq[idx] = 0
        neg_item[idx] = self._neg_sample(seq)
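For illustration, a minimal sketch of how the MIP masking could use a dedicated mask id instead of reusing the padding id 0. The names `n_items`, `mask_ratio`, and `neg_sample` mirror the snippet above, but the exact integration into ReChorus is an assumption, not the project's actual fix:

    import numpy as np

    def mask_sequence(seq, n_items, mask_ratio, neg_sample, rng=np.random):
        """Mask items with a dedicated [mask] id; 0 stays reserved for padding."""
        mask_token = n_items + 1            # assumed: one id beyond the item range
        mask_seq, neg_item = list(seq), list(seq)
        for idx in range(len(seq)):
            if rng.rand() < mask_ratio:
                mask_seq[idx] = mask_token  # special token, still attended to
                neg_item[idx] = neg_sample(seq)
        return mask_seq, neg_item

With this scheme the item embedding table would need `n_items + 2` rows (0 for padding, 1..n_items for items, n_items + 1 for the mask token), so the mask position keeps its own embedding and is not dropped by the padding mask.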

Thanks for pointing this out! S3Rec is still under development and we will fix this bug in the next commit.