super-attention

An implementation of super attention in PyTorch.

Primary Language: Python
License: MIT
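
As a rough illustration of the idea, below is a minimal single-head sketch in PyTorch of one common formulation of super attention, in which a learnable alignment matrix mixes the value vectors along the sequence dimension before standard scaled dot-product attention is applied. The class name `SuperAttention`, the arguments `embed_dim` and `context_len`, and the exact placement of the alignment matrix are illustrative assumptions, not this repository's API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SuperAttention(nn.Module):
    """Single-head super attention sketch (assumed formulation, not this repo's API).

    On top of the usual query/key/value projections, a learnable alignment
    matrix ``w_a`` of shape (context_len, context_len) linearly mixes the
    value vectors along the sequence (token) axis before the attention
    weights are applied.
    """

    def __init__(self, embed_dim: int, context_len: int):
        super().__init__()
        self.w_q = nn.Linear(embed_dim, embed_dim, bias=False)
        self.w_k = nn.Linear(embed_dim, embed_dim, bias=False)
        self.w_v = nn.Linear(embed_dim, embed_dim, bias=False)
        # Learnable alignment kernel acting on the sequence axis,
        # initialised to the identity so it starts as plain attention.
        self.w_a = nn.Parameter(torch.eye(context_len))
        self.scale = embed_dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, context_len, embed_dim)
        q, k, v = self.w_q(x), self.w_k(x), self.w_v(x)
        # Mix value rows with the alignment matrix: (L, L) x (B, L, D) -> (B, L, D).
        v = torch.einsum("st,btd->bsd", self.w_a, v)
        # Standard scaled dot-product attention over the mixed values.
        attn = F.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return attn @ v


if __name__ == "__main__":
    layer = SuperAttention(embed_dim=64, context_len=16)
    out = layer(torch.randn(2, 16, 64))
    print(out.shape)  # torch.Size([2, 16, 64])
```

Note that because the alignment matrix is sized (context_len, context_len), this sketch assumes a fixed sequence length; variable-length inputs would need padding or a different parameterisation.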
