hp-l33/AiM

Positional Encoding

Closed this issue · 1 comment

Thank you very much for your work!

Where can I find more details about your implementation of positional encoding in the model?

Hi, the positional encoding is implemented in the GPT2Embeddings class, located in models/stage2/mixer_seq_simple.py.
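
For readers who want the gist without opening the file: GPT2Embeddings-style modules combine token embeddings with learned absolute position embeddings (one trainable vector per position, as in GPT-2), rather than fixed sinusoidal encodings. Below is a minimal sketch of that pattern; the class and parameter names here are illustrative, not the repo's actual code, so refer to models/stage2/mixer_seq_simple.py for the real implementation.

```python
import torch
import torch.nn as nn

class GPT2StyleEmbeddings(nn.Module):
    """Illustrative sketch: token embeddings plus learned
    absolute position embeddings, added elementwise (GPT-2 style)."""

    def __init__(self, vocab_size: int, d_model: int, max_seq_len: int):
        super().__init__()
        self.word_embeddings = nn.Embedding(vocab_size, d_model)
        # Learned (not sinusoidal) positional encoding:
        # one trainable d_model-dim vector per position index.
        self.position_embeddings = nn.Embedding(max_seq_len, d_model)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # input_ids: (batch, seq_len) integer token ids
        seq_len = input_ids.shape[1]
        positions = torch.arange(seq_len, device=input_ids.device)
        # Position embeddings broadcast over the batch dimension.
        return self.word_embeddings(input_ids) + self.position_embeddings(positions)

emb = GPT2StyleEmbeddings(vocab_size=1000, d_model=64, max_seq_len=256)
out = emb(torch.randint(0, 1000, (2, 16)))
print(out.shape)  # torch.Size([2, 16, 64])
```

Because the position table is a fixed-size nn.Embedding, this scheme caps the usable sequence length at max_seq_len, which is set when the model is constructed.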