HITESHLPATEL/Prime
A simple module that consistently outperforms self-attention and the Transformer model on major NMT datasets, achieving state-of-the-art performance.
Language: Python · License: NOASSERTION