memory-efficient-tuning

There are 5 repositories under the memory-efficient-tuning topic.

  • Paranioar/Awesome_Matching_Pretraining_Transfering

    A paper list on large multi-modality models (perception, generation, unification), parameter-efficient finetuning, vision-language pretraining, and conventional image-text matching, intended as a preliminary overview.

  • BorealisAI/flora-opt

    The official repository for the ICML 2024 paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors" (a conceptual sketch of the gradient-compression idea appears after this list).

    Language: Python
  • Paranioar/UniPT

    [CVPR 2024] Code for "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"

    Language: Python
  • misonsky/HiFT

    Memory-efficient fine-tuning; supports fine-tuning a 7B model within 24 GB of GPU memory (a rough memory estimate follows this list).

    Language: Python
  • Paranioar/SHERL

    [ECCV 2024] Code for "SHERL: Synthesizing High Accuracy and Efficient Memory for Resource-Limited Transfer Learning"

    Language: Python
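
The Flora entry above is based on the observation that low-rank adapter updates behave like random-projection compression of the gradient. The following is a minimal, illustrative sketch of that idea, not the flora-opt library's API; the class name `CompressedMomentum`, the rank, and the resampling interval are assumptions made for illustration.

```python
# Sketch of low-rank gradient compression for optimizer state (illustrative only).
import torch

class CompressedMomentum:
    """Keep the momentum of an (m x n) weight in a compressed (m x r) buffer by
    projecting gradients with a random (n x r) matrix, resampled periodically."""
    def __init__(self, shape, rank=8, beta=0.9, resample_every=100, seed=0):
        m, n = shape
        self.rank, self.beta, self.resample_every = rank, beta, resample_every
        self.step_count = 0
        self.seed = seed
        self.proj = self._sample_projection(n)   # (n, r) random projection
        self.buf = torch.zeros(m, rank)          # compressed momentum, (m, r)

    def _sample_projection(self, n):
        gen = torch.Generator().manual_seed(self.seed + self.step_count)
        return torch.randn(n, self.rank, generator=gen) / self.rank ** 0.5

    def step(self, grad):
        # Periodically resample the projection; decompress and recompress the
        # buffer so the accumulated momentum survives the change of basis.
        if self.step_count and self.step_count % self.resample_every == 0:
            full = self.buf @ self.proj.T
            self.proj = self._sample_projection(grad.shape[1])
            self.buf = full @ self.proj
        # Accumulate the compressed gradient, then decompress for the update.
        self.buf = self.beta * self.buf + (1 - self.beta) * (grad @ self.proj)
        self.step_count += 1
        return self.buf @ self.proj.T            # approximate momentum, (m, n)

# Usage: the optimizer state stores (m x r) instead of (m x n).
w = torch.randn(512, 1024)
state = CompressedMomentum(w.shape, rank=8)
grad = torch.randn_like(w)
w -= 1e-3 * state.step(grad)
```

The point of the sketch is the memory trade-off: the momentum buffer shrinks from m·n to m·r entries, at the cost of approximating the full-rank state.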
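For context on the HiFT entry, the 24 GB figure is notable because naive full-parameter fine-tuning of a 7B model does not fit in that budget. A rough back-of-the-envelope estimate, assuming fp16 weights and gradients, fp32 Adam moments, and ignoring activation memory:

```python
# Rough memory estimate for full fine-tuning a 7B-parameter model with Adam.
# Byte sizes are assumptions (fp16 weights/gradients, fp32 m and v moments);
# activation memory, which depends on batch size and sequence length, is ignored.
params = 7e9
weights_gb = params * 2 / 1e9        # fp16 weights           ~14 GB
grads_gb   = params * 2 / 1e9        # fp16 gradients         ~14 GB
adam_gb    = params * 4 * 2 / 1e9    # fp32 first/second moments ~56 GB
print(f"total (excluding activations): {weights_gb + grads_gb + adam_gb:.0f} GB")
# -> ~84 GB, well above 24 GB, which is why memory-efficient tuning matters.
```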