leaderj1001/LambdaNetworks

No relative position embedding?


Hi, @leaderj1001

Thanks for your code. However, I am a bit confused by this line:

```python
lambda_p = torch.einsum('ku,bvun->bkvn', self.embedding, values)
```

It seems that no relative position embedding is used when the strategy is not local context. What is the meaning of this line?
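
For context, here is how I read the shapes in that line, next to a sketch of the position lambdas from the paper, which use a relative position embedding E of shape (n, m, k). The sizes below are illustrative, and the `lambda_p_rel` einsum is my own reading of the paper's equation, not code from this repo:

```python
import torch

# Illustrative sizes: b = batch, n = context positions,
# k = key depth, u = intra-depth, v = value depth.
b, n = 2, 16
k, u, v = 8, 4, 8

values = torch.randn(b, v, u, n)
embedding = torch.randn(k, u)  # per the 'ku' subscript, self.embedding is (k, u)

# The quoted line: the same (k, u) embedding is contracted with the
# values at every position n, so lambda_p does not depend on the
# relative position between query and context.
lambda_p = torch.einsum('ku,bvun->bkvn', embedding, values)  # (b, k, v, n)

# The paper's position lambdas instead use relative position embeddings
# E with one k-vector per (query n, context m) pair:
#   lambda_p[n] = sum_m E[n, m] * v[m]^T
m = n                     # assume context size equals query size
E = torch.randn(n, m, k)  # relative position embeddings
V = torch.randn(b, m, v)  # values, flattened over intra-depth
lambda_p_rel = torch.einsum('nmk,bmv->bnkv', E, V)  # (b, n, k, v)
```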