memory-efficient-tuning

There are 5 repositories under the memory-efficient-tuning topic.

  • Paranioar/Awesome_Matching_Pretraining_Transfering

    A paper list covering large multi-modality models, parameter-efficient finetuning, vision-language pretraining, and conventional image-text matching, intended for preliminary insight.

  • Paranioar/UniPT

    [CVPR2024] The code of "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"

    Language: Python
  • BorealisAI/flora-opt

    The official repository for the ICML 2024 paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors".

    Language: Python
  • misonsky/HiFT

    Memory-efficient fine-tuning; supports fine-tuning a 7B model within 24 GB of GPU memory.

    Language: Python
  • Paranioar/SHERL

    [ECCV2024] The code of "SHERL: Synthesizing High Accuracy and Efficient Memory for Resource-Limited Transfer Learning"
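The common thread in these repositories is trading a little accuracy or extra computation for much lower memory. The core idea named in the Flora paper title, that a low-rank projection can act as a gradient compressor, can be illustrated with a minimal NumPy sketch. This is not the flora-opt repository's actual API; the names, shapes, and projection scheme below are illustrative assumptions only.

```python
import numpy as np

# Illustrative sketch (not flora-opt's API): compress a gradient matrix by
# projecting it onto a random low-rank subspace, keeping rank * d_out floats
# instead of d_out * d_in, then reconstruct an approximation when needed.
rng = np.random.default_rng(0)
d_out, d_in, rank = 64, 64, 8

grad = rng.standard_normal((d_out, d_in))                  # full gradient
proj = rng.standard_normal((d_in, rank)) / np.sqrt(rank)   # random projection

compressed = grad @ proj        # (d_out, rank): the stored, compressed form
approx = compressed @ proj.T    # low-rank approximation of the gradient

ratio = compressed.size / grad.size
print(f"compression ratio: {ratio:.3f}")
```

With these shapes the compressed gradient uses 12.5% of the original storage; the reconstruction is only an approximation, which is the trade-off such methods accept for the memory savings.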