dbfl0/Self-Attention-Factor-Tuning
Code for SAFT: Self-Attention Factor-Tuning, a 16x more efficient solution for fine-tuning neural networks
Python · MIT License
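The description does not spell out how factor-tuning achieves its efficiency, but the general idea behind factorized fine-tuning methods is to freeze the pretrained attention weights and train only a small factored update. The sketch below illustrates that parameter-count arithmetic with a generic low-rank update; the dimensions, names, and initialization are illustrative assumptions, not taken from this repository's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the repo): hidden width d, factor rank r.
d, r = 256, 8

# Frozen pretrained projection weight, e.g. one attention query projection.
W_frozen = rng.standard_normal((d, d)) / np.sqrt(d)

# Trainable factors; only these would receive gradients during fine-tuning.
# U starts at zero so the effective weight initially equals the frozen one.
U = np.zeros((d, r))
V = rng.standard_normal((r, d)) * 0.01

def effective_weight(W, U, V):
    """Frozen weight plus the factored update U @ V."""
    return W + U @ V

full_params = d * d            # parameters updated by full fine-tuning
factor_params = d * r + r * d  # parameters updated by factor-tuning
print(f"full fine-tuning params per matrix: {full_params}")
print(f"factor-tuning params per matrix:   {factor_params}")
print(f"reduction: {full_params / factor_params:.0f}x")
```

With these example sizes the factored update trains 4,096 parameters per weight matrix instead of 65,536, a 16x reduction; the actual factorization SAFT applies to the self-attention tensors may differ.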