nairouz's Stars
UCDvision/NOLA
Code for NOLA, an implementation of "NOLA: Compressing LoRA using Linear Combination of Random Basis"
yxli2123/LoSparse
VijayLingam95/SVFT
iboing/CorDA
CorDA: Context-Oriented Decomposition Adaptation of Large Language Models (NeurIPS 2024)
Chaos96/fourierft
KohakuBlueleaf/LyCORIS
Lora beYond Conventional methods, Other Rank adaptation Implementations for Stable diffusion.
Zeju1997/oft
Official implementation of "Controlling Text-to-Image Diffusion by Orthogonal Finetuning".
DaShenZi721/HRA
GraphPKU/PiSSA
PiSSA: Principal Singular Values and Singular Vectors Adaptation of Large Language Models (NeurIPS 2024 Spotlight)
BorealisAI/flora-opt
This is the official repository for the paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors" in ICML 2024.
McGill-NLP/polytropon
zhaoxin94/awesome-domain-adaptation
A collection of AWESOME things about domain adaptation
calpt/awesome-adapter-resources
Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning
NVlabs/DoRA
[ICML 2024 (Oral)] Official PyTorch implementation of DoRA: Weight-Decomposed Low-Rank Adaptation
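DoRA's central move — decomposing a pretrained weight into a per-column magnitude and a unit-norm direction, then adapting the direction with LoRA — can be sketched as follows (a minimal NumPy illustration of the decomposition step only, not the official implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16))  # stand-in for a pretrained weight matrix

# Decompose W into a trainable magnitude vector m (one scalar per column)
# and a column-normalized direction matrix V; DoRA trains m directly and
# updates V with a low-rank adapter.
m = np.linalg.norm(W, axis=0, keepdims=True)  # shape (1, 16)
V = W / m                                     # columns have unit L2 norm

# Before any update, recomposing magnitude * direction recovers W exactly.
assert np.allclose(m * V, W)
assert np.allclose(np.linalg.norm(V, axis=0), 1.0)
```

The point of the decomposition is that magnitude and direction can then be updated with separate capacity, which the paper argues closes much of the gap between LoRA and full fine-tuning.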
marcellodebernardi/loss-landscapes
Approximating neural network loss landscapes in low-dimensional parameter subspaces for PyTorch
JieShibo/PETL-ViT
[ICCV 2023] Binary Adapters, [AAAI 2023] FacT, [Tech report] Convpass
ZhangYuanhan-AI/NOAH
[TPAMI] Searching prompt modules for parameter-efficient transfer learning.
huggingface/peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
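The LoRA idea that loralib implements — freeze the pretrained weight and learn a low-rank additive update — fits in a few lines (a self-contained NumPy sketch; `lora_forward` is a made-up name here, not part of the loralib API):

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r, alpha = 8, 16, 2, 4        # toy sizes; rank r << min(d_out, d_in)
W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = 0.01 * rng.standard_normal((r, d_in))   # trainable down-projection
B = np.zeros((d_out, r))                    # zero-init so the adapter starts as a no-op

def lora_forward(x):
    # y = W x + (alpha / r) * B A x ; only A and B receive gradients in training
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B = 0, the adapted model matches the frozen base model exactly.
assert np.allclose(lora_forward(x), W @ x)
```

Because only A and B (r * (d_in + d_out) parameters) are trained, the update costs a small fraction of full fine-tuning, and B A can be merged back into W at inference time.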
huawei-noah/Efficient-NLP
stan-anony/derivative_free_lora_rank
QingruZhang/AdaLoRA
AdaLoRA: Adaptive Budget Allocation for Parameter-Efficient Fine-Tuning (ICLR 2023).
ecnu-sea/SEA
SEA is an automated paper-review framework that generates comprehensive, high-quality, and highly consistent review feedback, helping researchers improve the quality of their work.
abdouskamel/HYPER-MGE
Code for the paper "A Geometric Perspective for High-Dimensional Multiplex Graphs", with Python and PyTorch
MaxZanella/CLIP-LoRA
An easy way to apply LoRA to CLIP. Implementation of the paper "Low-Rank Few-Shot Adaptation of Vision-Language Models" (CLIP-LoRA) [CVPRW 2024].
openai/CLIP
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
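At inference time, "predict the most relevant text snippet" reduces to a cosine-similarity lookup between one image embedding and a set of text embeddings (a schematic NumPy version, with random vectors standing in for CLIP's actual image and text encoders):

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    # Project embeddings onto the unit sphere so dot products are cosine similarities.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Stand-ins for encoder outputs; real CLIP produces these from an image and captions.
image_emb = normalize(rng.standard_normal(512))
text_embs = normalize(rng.standard_normal((3, 512)))  # e.g. 3 candidate captions

# The most relevant snippet is the caption with the highest cosine similarity.
scores = text_embs @ image_emb
best = int(np.argmax(scores))
```

In the real model, `scores` would typically be scaled by a learned temperature and softmaxed to give per-caption probabilities.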
uestclbh/PC-Conv
ChenJY-Count/PolyGCL
PyTorch implementation of "PolyGCL: Graph Contrastive Learning via Learnable Spectral Polynomial Filters"
yueliu1999/Awesome-Deep-Group-Recommendation
Awesome Deep Group Recommendation is a collection of SOTA, novel deep group recommendation methods (papers, codes, and datasets).
nairouz/DynAE