BiBloSA-pytorch

Re-implementation of Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling (T. Shen et al., ICLR 2018) in PyTorch.
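For orientation, the core idea of block self-attention is to split a length-n sequence into blocks, run self-attention inside each block, attend over per-block summaries, and fuse the two levels, which keeps attention memory well below the full O(n^2). The sketch below illustrates that idea only; it is not this repo's actual module or API, and the names (`BlockSelfAttention`, `d_model`, `block_size`) plus the mean-pooled block summary are illustrative assumptions.

```python
# Illustrative sketch of block self-attention (not this repo's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BlockSelfAttention(nn.Module):
    def __init__(self, d_model: int, block_size: int):
        super().__init__()
        self.block_size = block_size
        # token-level attention inside each block, and attention over block summaries
        self.intra_attn = nn.MultiheadAttention(d_model, num_heads=1, batch_first=True)
        self.inter_attn = nn.MultiheadAttention(d_model, num_heads=1, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); pad seq_len up to a multiple of block_size
        b, n, d = x.shape
        r = self.block_size
        pad = (r - n % r) % r
        x = F.pad(x, (0, 0, 0, pad))
        m = x.size(1) // r                                  # number of blocks
        blocks = x.reshape(b * m, r, d)

        # 1) self-attention restricted to each block: memory ~ O(m * r^2)
        intra, _ = self.intra_attn(blocks, blocks, blocks)

        # 2) one summary vector per block (mean pooling here, as an assumption),
        #    then self-attention over the m summaries: memory ~ O(m^2)
        summaries = intra.reshape(b, m, r, d).mean(dim=2)   # (b, m, d)
        context, _ = self.inter_attn(summaries, summaries, summaries)

        # 3) broadcast block-level context back to tokens and fuse both levels
        context_tok = context.unsqueeze(2).expand(b, m, r, d).reshape(b, m * r, d)
        out = self.fuse(torch.cat([intra.reshape(b, m * r, d), context_tok], dim=-1))
        return out[:, :n]                                   # drop the padding

# Example usage with arbitrary shapes:
# layer = BlockSelfAttention(d_model=64, block_size=16)
# y = layer(torch.randn(2, 100, 64))   # y.shape == (2, 100, 64)
```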

Primary language: Python
