Self-Attention-Factor-Tuning

Code for SAFT: Self-Attention Factor-Tuning, a method for fine-tuning neural networks up to 16x more efficiently than full fine-tuning.
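The efficiency gain comes from training only a small set of factors added to the frozen self-attention weights rather than updating every parameter. Below is a minimal illustrative sketch of that idea in PyTorch, expressed as a low-rank factor update on a frozen attention projection; the class name `FactorTunedLinear`, the rank value, and the factorization shown here are assumptions for illustration, not this repository's actual API.

```python
# Illustrative sketch only: factor-tuning a frozen projection with small
# trainable low-rank factors. Names and the rank value are assumptions,
# not this repository's actual implementation.
import torch
import torch.nn as nn


class FactorTunedLinear(nn.Module):
    """Freezes a pretrained linear layer and learns a low-rank update W + U @ V."""

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # base weights stay frozen
        # Only these two small factor matrices are trained.
        self.U = nn.Parameter(torch.zeros(base.out_features, rank))
        self.V = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen base output plus the learned low-rank correction.
        return self.base(x) + x @ (self.U @ self.V).T


# Usage: wrap a self-attention projection (e.g. the query projection).
proj = nn.Linear(768, 768)              # stand-in for a pretrained projection
tuned = FactorTunedLinear(proj, rank=8)
out = tuned(torch.randn(2, 16, 768))
print(out.shape)                        # torch.Size([2, 16, 768])
```

With rank 8 on a 768x768 projection, the trainable factors hold 2 * 768 * 8 parameters versus 768 * 768 for the full matrix, roughly a 48x reduction for that layer; the exact savings in SAFT depend on its own factorization, which this sketch does not reproduce.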

Primary language: Python · License: MIT
