dbfl0/Self-Attention-Factor-Tuning
Code for SAFT: Self-Attention Factor-Tuning, a 16x more efficient solution for fine-tuning neural networks
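The repository's own code defines the actual SAFT method; as a rough illustration of the general idea behind factor-tuning, the sketch below shows a low-rank factorized update added to a frozen attention projection weight, where only the small factors are trainable. All names (`W`, `U`, `s`, `V`, the sizes, and the update form itself) are assumptions for illustration, not the repository's implementation, and the resulting parameter ratio here is not the 16x figure claimed above.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 64  # hidden size of a hypothetical attention projection
r = 4   # rank of the factorized update (assumed small)

# Frozen pretrained projection weight (e.g. a query projection); not trained.
W = rng.standard_normal((d, d))

# Factorized, trainable update: only U, s, V change during fine-tuning.
U = rng.standard_normal((d, r)) * 0.01
s = np.zeros(r)                       # per-factor scales, start at zero
V = rng.standard_normal((r, d)) * 0.01

def effective_weight(W, U, s, V):
    """Frozen weight plus the low-rank factorized update U diag(s) V."""
    return W + U @ np.diag(s) @ V

# With s initialized to zero, the layer starts identical to the pretrained one.
full = W.size                         # params touched by full fine-tuning
factored = U.size + s.size + V.size   # params touched by factor-tuning
print(f"trainable params: {factored} vs {full} (ratio {full / factored:.1f}x)")
```

Because `s` starts at zero, fine-tuning begins from the pretrained behavior and only gradually mixes in the learned factors, a common design choice in factorized adapters.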
Python · MIT License