karim-ahmed/linear-attention-transformer
Transformer based on a variant of attention that has linear complexity with respect to sequence length
Python · MIT License
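The core idea behind linear attention is to replace the softmax similarity with a kernel feature map φ, so that attention can be computed as φ(Q)(φ(K)ᵀV) — associating the matrix product the other way round — which costs O(n·d²) instead of O(n²·d) in sequence length n. Below is a minimal non-causal sketch of this trick in NumPy, using the elu(x)+1 feature map from the linear-transformers literature; it is an illustration of the technique, not this repository's actual implementation.

```python
import numpy as np

def linear_attention(q, k, v, eps=1e-6):
    """Non-causal linear attention: phi(Q) @ (phi(K).T @ V), normalized.

    q, k: (n, d) queries/keys; v: (n, d_v) values.
    Cost is O(n * d * d_v) rather than the O(n^2) of softmax attention.
    """
    # Feature map elu(x) + 1 keeps entries positive so the normalizer is valid.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    q, k = phi(q), phi(k)
    kv = k.T @ v                      # (d, d_v): summarize keys/values once
    z = q @ k.sum(axis=0)             # (n,): per-query normalizer
    return (q @ kv) / (z[:, None] + eps)

n, d = 128, 16
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
out = linear_attention(q, k, v)       # shape (128, 16)
```

Because `kv` and the normalizer are fixed-size summaries of the whole sequence, memory and compute grow linearly in `n`; the causal (autoregressive) variant maintains these summaries as running prefix sums instead.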