torch-multi-head-attention

Multi-head attention in PyTorch

Primary language: Python. License: MIT.
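
As a quick orientation, the sketch below shows the shape conventions of multi-head attention using PyTorch's built-in `torch.nn.MultiheadAttention`. The dimensions (`embed_dim=64`, `num_heads=8`) are illustrative, and this package's own module may expose a different interface.

```python
import torch
from torch import nn

# Minimal usage sketch with PyTorch's built-in nn.MultiheadAttention;
# this repository's own module may differ in its API.
embed_dim, num_heads = 64, 8
attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

# Self-attention over a batch of 2 sequences of length 10.
x = torch.randn(2, 10, embed_dim)
output, weights = attention(query=x, key=x, value=x)

print(output.shape)   # torch.Size([2, 10, 64])
print(weights.shape)  # torch.Size([2, 10, 10]), averaged over heads by default
```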
