sunnyxuejj
Ph.D. student at the Networking Technology Research Centre, Institute of Computing Technology of the Chinese Academy of Sciences
sunnyxuejj's Stars
bytedance/ABQ-LLM
An acceleration library that supports arbitrary bit-width combinatorial quantization operations
yuhuixu1993/qa-lora
Official PyTorch implementation of QA-LoRA
haotian-liu/LLaVA
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
BradyFU/Awesome-Multimodal-Large-Language-Models
Latest Advances on Multimodal Large Language Models
liguodongiot/llm-action
This project shares the technical principles behind large language models along with hands-on experience (LLM engineering and LLM application deployment).
hahnyuan/RPTQ4LLM
Reorder-based post-training quantization for large language models
HuangOwen/Awesome-LLM-Compression
Awesome LLM compression research papers and tools.
IST-DASLab/sparsegpt
Code for the ICML 2023 paper "SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot".
SMILELab-FL/FedPETuning
huggingface/peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
alibaba/FederatedScope
An easy-to-use federated learning platform
huggingface/pytorch-image-models
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (ViT), MobileNetV4, MobileNet-V3 & V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more
Hhhhhhao/Noisy-Model-Learning
LAION-AI/CLIP_benchmark
CLIP-like model evaluation
qingsongedu/Awesome-TimeSeries-SpatioTemporal-LM-LLM
A professional list on Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.
FLAIR-THU/VFLAIR
A general, extensible, and lightweight vertical federated learning framework from THU-AIR
wanglun1996/secure-robust-federated-learning
imperial-qore/TranAD
[VLDB'22] Anomaly Detection using Transformers, self-conditioning and adversarial training.
omarfoq/knn-per
Official code for "Personalized Federated Learning through Local Memorization" (ICML'22)
AIoT-MLSys-Lab/FedRolex
[NeurIPS 2022] "FedRolex: Model-Heterogeneous Federated Learning with Rolling Sub-Model Extraction" by Samiul Alam, Luyang Liu, Ming Yan, and Mi Zhang
shaoxiongji/federated-learning
A PyTorch Implementation of Federated Learning http://doi.org/10.5281/zenodo.4321561
NUAA-SmartSensing/FedModule
A modular, universal federated learning framework supporting various FL methods, with free switching between thread and process modes
akhilmathurs/orchestra
Source code for the ICML 2022 paper: "Orchestra: Unsupervised Federated Learning via Globally Consistent Clustering"
mit-han-lab/amc
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
vaseline555/SuPerFed
(SIGKDD 2022) Connected Low-Loss Subspace Learning for a Personalization in Federated Learning (https://arxiv.org/abs/2109.07628)
youngfish42/Awesome-FL
Comprehensive and timely academic information on federated learning (papers, frameworks, datasets, tutorials, workshops)
zhuangdizhu/FedGen
Code and data accompanying the FedGen paper
bibikar/feddst
Federated Dynamic Sparse Training
Shiweiliuiiiiiii/In-Time-Over-Parameterization
[ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, Mykola Pechenizkiy
RAIVNLab/STR
Soft Threshold Weight Reparameterization for Learnable Sparsity