Pinned Repositories
adaptive_svd
chase
Dynamic Sparsity Is Channel-Level Sparsity Learner [NeurIPS 2023]
Chase_cifar
data_prune_NKD
Generating-the-simple-shape-dataset
Generate a simple shape dataset with different colors, shapes, thicknesses, and heights.
html-resume
A single-page resume template completely typeset with HTML & CSS.
Knowledge-Elicitation-using-Deep-Metric-Learning-and-Psychometric-Testing
Code for "Knowledge Elicitation using Deep Metric Learning and Psychometric Testing" (ECML 2020)
Lottery-pools
OWL
Official PyTorch implementation of "Outlier Weighed Layerwise Sparsity (OWL): A Missing Secret Sauce for Pruning LLMs to High Sparsity"
Sup-tickets
luuyin's Repositories
luuyin/OWL
Official PyTorch implementation of "Outlier Weighed Layerwise Sparsity (OWL): A Missing Secret Sauce for Pruning LLMs to High Sparsity"
luuyin/Lottery-pools
luuyin/Sup-tickets
luuyin/Knowledge-Elicitation-using-Deep-Metric-Learning-and-Psychometric-Testing
Code for "Knowledge Elicitation using Deep Metric Learning and Psychometric Testing" (ECML 2020)
luuyin/chase
Dynamic Sparsity Is Channel-Level Sparsity Learner [NeurIPS 2023]
luuyin/Generating-the-simple-shape-dataset
Generate a simple shape dataset with different colors, shapes, thicknesses, and heights.
luuyin/luuyin.github.io
luuyin/adaptive_svd
luuyin/Chase_cifar
luuyin/data_prune_NKD
luuyin/html-resume
A single-page resume template completely typeset with HTML & CSS.
luuyin/ILM-VP
[CVPR23] "Understanding and Improving Visual Prompting: A Label-Mapping Perspective" by Aochuan Chen, Yuguang Yao, Pin-Yu Chen, Yihua Zhang, and Sijia Liu
luuyin/TU-e-deeplearning-2020
Course materials for 2IMM10 (2019-GS4) Deep Learning, TU/e
luuyin/TU-e-deeplearning-2021
Repository for Tutorials and Practicals part of the Deep Learning course at TU/e
luuyin/OwLore
Official PyTorch implementation of "OwLore: Outlier-weighed Layerwise Sampled Low-Rank Projection for Memory-Efficient LLM Fine-tuning" by Pengxiang Li, Lu Yin, Xiaowei Gao, Shiwei Liu
luuyin/prune_opt
luuyin/SMC-Bench
[ICLR 2023] "Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!" Shiwei Liu, Tianlong Chen, Zhenyu Zhang, Xuxi Chen, Tianjin Huang, Ajay Kumar Jaiswal, Zhangyang Wang
luuyin/TU-e-deeplearning-2022
Repository for Tutorials and Practicals part of the Deep Learning course at TU/e
luuyin/wanda
A simple and effective LLM pruning approach.