lucidrains/x-transformers

Using Rotary Positional Encoding with Continuous Wrapper

Closed this issue · 1 comment

When using RoPE with the continuous wrapper, the helper class 'always' is referenced without being imported, so constructing the model raises a NameError on this branch:

        # `always` is not imported in this module, so this line raises a NameError
        if not (use_abs_pos_emb and not attn_layers.has_pos_emb):
            self.pos_emb = always(0)
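
For reference, a minimal sketch that should reproduce the crash, assuming the ContinuousTransformerWrapper and Encoder signatures from the project README (the specific dims and depth here are arbitrary):

    from x_transformers import ContinuousTransformerWrapper, Encoder

    # rotary_pos_emb = True marks the attention layers as supplying their own
    # positional information, which routes the wrapper into the always(0) branch
    model = ContinuousTransformerWrapper(
        dim_in = 32,
        dim_out = 32,
        max_seq_len = 1024,
        attn_layers = Encoder(
            dim = 512,
            depth = 2,
            heads = 8,
            rotary_pos_emb = True
        )
    )  # NameError: name 'always' is not defined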

Importing 'always' from x_transformers.py seems to fix it.
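
For anyone hitting this before it's patched, a sketch of the one-line fix, assuming 'always' is defined at module level in x_transformers/x_transformers.py as the report suggests (the continuous wrapper's exact file name may differ):

    # hypothetical patch at the top of the continuous wrapper's module
    from x_transformers.x_transformers import always

    # always(0) is a callable that ignores its arguments and returns 0,
    # i.e. a no-op absolute positional embedding
    zero_pos_emb = always(0)
    print(zero_pos_emb('any', kwarg = 'ignored'))  # -> 0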

oh yes, my bad

want to submit a PR? I'm in transit