Positional Encoding
Bezdarnost commented
Thank you very much for your work!
Where can I find more details about your implementation of positional encoding in the model?
hp-l33 commented
Hi, the positional encoding can be found within GPT2Embeddings, located in models/stage2/mixer_seq_simple.py.
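For readers unfamiliar with that class, below is a minimal sketch of a GPT-2-style embedding module (token embedding plus a learned absolute position embedding). The class and argument names here are illustrative assumptions; the actual GPT2Embeddings implementation referenced in models/stage2/mixer_seq_simple.py may differ in details.

```python
# Illustrative sketch only; not the repository's actual GPT2Embeddings class.
import torch
import torch.nn as nn


class GPT2StyleEmbeddings(nn.Module):
    """Token embedding plus learned absolute position embedding, as in GPT-2."""

    def __init__(self, embed_dim: int, vocab_size: int, max_position_embeddings: int):
        super().__init__()
        self.word_embeddings = nn.Embedding(vocab_size, embed_dim)
        # Learned positional encoding: one trainable vector per position index.
        self.position_embeddings = nn.Embedding(max_position_embeddings, embed_dim)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # input_ids: (batch, seq_len)
        seq_len = input_ids.shape[1]
        position_ids = torch.arange(seq_len, device=input_ids.device)
        # Broadcast-add the position vectors onto the token vectors.
        return self.word_embeddings(input_ids) + self.position_embeddings(position_ids)


# Usage example (hypothetical sizes)
emb = GPT2StyleEmbeddings(embed_dim=768, vocab_size=16384, max_position_embeddings=256)
tokens = torch.randint(0, 16384, (2, 256))
hidden = emb(tokens)  # shape: (2, 256, 768)
```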