lucidrains/x-transformers

Question: masking in token shifting

Opened this issue · 1 comment

In token shifting, you explicitly zero out masked items:

if exists(mask):
    t = t.masked_fill(~mask[..., None], 0.)

Is this strictly necessary? Since we are shifting right, the shifted-in tokens should already be valid, right?
Or is this accounting for sequences padded on the left, in which case the shift could pull a padding token into a valid position and add it to a real token?
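To make the second case concrete, here is a minimal sketch (my own illustration, not the library's exact code; shift_right and the F.pad-based shift are assumptions) of a right shift on a left-padded batch, showing what the masked_fill prevents:

import torch
import torch.nn.functional as F

def shift_right(t, mask=None):
    if mask is not None:
        # zero out padded positions first, so a pad embedding never
        # leaks into the first valid position after the shift
        t = t.masked_fill(~mask[..., None], 0.)
    # drop the last time step, pad one zero step at the front
    return F.pad(t, (0, 0, 1, -1), value=0.)

t = torch.randn(1, 4, 8)                           # (batch, seq, dim)
mask = torch.tensor([[False, False, True, True]])  # left padding: positions 0 and 1 are invalid

shifted = shift_right(t, mask)
# without the masked_fill, shifted[:, 2] would be the pad embedding
# t[:, 1], and it would get mixed into a valid token downstream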

I noticed that RecurrentMemoryTransformer didn't do this:

https://github.com/lucidrains/recurrent-memory-transformer-pytorch/blob/d45ef72a40324c6224ffacb890d5593a69db73de/recurrent_memory_transformer_pytorch/recurrent_memory_transformer.py#L65-L70

Hence my question about whether it's strictly necessary.

@pfeatherstone i think i allow for bidirectional shifting, maybe that's why

i can check later
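For context, if shifting is allowed in both directions, the zeroing matters even with left-padding-free batches: a leftward shift pulls future positions, including any right-side padding, into valid positions. A hedged sketch of a signed-amount shift (an assumption, not necessarily x-transformers' actual implementation) makes this visible:

def shift(t, amount, mask=None):
    if amount == 0:
        return t
    if mask is not None:
        # zero padded positions so neither direction drags a pad
        # embedding into a valid position
        t = t.masked_fill(~mask[..., None], 0.)
    # positive amount shifts right (past into present),
    # negative amount shifts left (future into present)
    return F.pad(t, (0, 0, amount, -amount), value=0.)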