SAFT is a highly efficient fine-tuning technique for large-scale neural networks (16x more parameter-efficient than regular fine-tuning).
Code for the paper "Self-Attention Factor-Tuning for Parameter-Efficient Fine-Tuning".
Easily install SAFT using pip and get started with a simple example:

```bash
pip install saft
```
```python
# Note: this example assumes get_classes_num is exported alongside saft;
# adjust the import if your installed version places it elsewhere.
from saft.saft import saft, get_classes_num

if __name__ == "__main__":
    saft_instance = saft(
        model='vit_base_patch16_224',
        num_classes=get_classes_num('oxford_flowers102'),
        validation_interval=1,
        rank=3,
        scale=10
    )

    # Replace with your PyTorch DataLoader objects
    # train_dl, test_dl = [your data in a pytorch dataloader]
    # saft_instance.upload_data(train_dl, test_dl)

    saft_instance.train(10)
    trained_model = saft_instance.model
```
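The commented-out `upload_data` call above expects standard PyTorch DataLoaders. A minimal sketch of building them is shown below; the random tensors and the 102-class label range mirror the Oxford Flowers example, but the data itself is purely illustrative:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy 224x224 RGB images with labels in [0, 102) -- illustrative only.
train_images = torch.randn(32, 3, 224, 224)
train_labels = torch.randint(0, 102, (32,))
test_images = torch.randn(8, 3, 224, 224)
test_labels = torch.randint(0, 102, (8,))

train_dl = DataLoader(TensorDataset(train_images, train_labels),
                      batch_size=8, shuffle=True)
test_dl = DataLoader(TensorDataset(test_images, test_labels), batch_size=8)

# These loaders can then be passed to saft_instance.upload_data(train_dl, test_dl).
```

In practice you would replace the dummy tensors with a real dataset (e.g. a `torchvision` dataset with the appropriate transforms) while keeping the same DataLoader interface.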
To run tests on the VTAB-1K dataset, follow these steps:
- Visit the SSF Data Preparation page to download the VTAB-1K dataset.
- Place the downloaded dataset folders in `<YOUR PATH>/SAFT/data/`.
For a quick start, download the pretrained ViT-B/16 model:
- Download ViT-B/16
- Place the downloaded model at `<YOUR PATH>/SAFT/ViT-B_16.npz`.
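A `.npz` file is a NumPy archive, so one quick way to sanity-check the downloaded checkpoint is to load it and list its arrays. The sketch below uses a tiny dummy archive with made-up key names and shapes, since the exact parameter names inside `ViT-B_16.npz` are not listed here; in practice you would load `<YOUR PATH>/SAFT/ViT-B_16.npz` instead:

```python
import numpy as np

# Create a tiny dummy checkpoint for illustration -- the real ViT-B/16
# archive contains the full set of pretrained weights under its own keys.
np.savez("demo_checkpoint.npz",
         embedding=np.zeros((768, 3, 16, 16), dtype=np.float32),
         cls_token=np.zeros((1, 1, 768), dtype=np.float32))

# Loading works the same way for the real file:
# checkpoint = np.load("<YOUR PATH>/SAFT/ViT-B_16.npz")
checkpoint = np.load("demo_checkpoint.npz")
for name in checkpoint.files:
    print(name, checkpoint[name].shape)
```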