Self-Attention-Factor-Tuning

Code for SAFT (Self-Attention Factor-Tuning), a 16x more parameter-efficient approach to fine-tuning neural networks.
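The efficiency gain comes from training only a small set of factors rather than the full network weights. As an illustration of the general factor-tuning pattern (a hypothetical sketch, not necessarily SAFT's exact factorization — the names `W`, `A`, `B`, and `project` are assumptions for this example), a frozen attention projection can be adapted by adding a trainable low-rank update:

```python
import numpy as np

# Hypothetical sketch of the factor-tuning idea: freeze a pretrained
# attention projection W and train only small factors A (d x r) and
# B (r x d), so the effective weight is W + A @ B. With r << d, far
# fewer parameters are updated than in full fine-tuning.

rng = np.random.default_rng(0)
d, r = 64, 4                      # hidden size, factor rank (r << d)

W = rng.standard_normal((d, d))   # frozen pretrained projection
A = np.zeros((d, r))              # trainable factor, zero-init so A @ B == 0
B = rng.standard_normal((r, d)) * 0.01  # trainable factor

def project(x):
    """Apply the adapted projection: frozen W plus the low-rank update."""
    return x @ (W + A @ B).T

x = rng.standard_normal((2, d))   # a toy batch of token embeddings
y = project(x)

frozen = W.size
trained = A.size + B.size
print(y.shape, trained / frozen)  # fraction of parameters actually tuned
```

Here only 512 of 4096 weights per projection are trainable; shrinking the rank `r` shrinks that fraction further, which is where multiplicative efficiency factors like 16x come from.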

Primary language: Python · License: MIT
