Pinned Repositories
DLDR
[TPAMI 2023] Low Dimensional Landscape Hypothesis is True: DNNs can be Trained in Tiny Subspaces
DLDR_imagenet
F-SAM
[CVPR 2024] Friendly Sharpness-Aware Minimization
Flat-LoRA
Flat-LoRA: Low-Rank Adaptation over a Flat Loss Landscape
improved-sift-RANSAC-detection
Real-time localization of motherboards in production-line surveillance video using an improved SIFT + RANSAC algorithm
mARWP
[TMLR 2024] Revisiting Random Weight Perturbation for Efficiently Improving Generalization
MEHL-Soup
[ECCV 2024] Learning Scalable Model Soup on a Single GPU: An Efficient Subspace Training Strategy
RWP
Sub-AT
[CVPR 2022 oral] Subspace Adversarial Training
TWA
[ICLR 2023] Trainable Weight Averaging: Efficient Training by Optimizing Historical Solutions
nblt's Repositories
nblt/DLDR
[TPAMI 2023] Low Dimensional Landscape Hypothesis is True: DNNs can be Trained in Tiny Subspaces
nblt/Sub-AT
[CVPR 2022 oral] Subspace Adversarial Training
nblt/TWA
[ICLR 2023] Trainable Weight Averaging: Efficient Training by Optimizing Historical Solutions
nblt/F-SAM
[CVPR 2024] Friendly Sharpness-Aware Minimization
nblt/RWP
nblt/mARWP
[TMLR 2024] Revisiting Random Weight Perturbation for Efficiently Improving Generalization
nblt/MEHL-Soup
[ECCV 2024] Learning Scalable Model Soup on a Single GPU: An Efficient Subspace Training Strategy
nblt/improved-sift-RANSAC-detection
Real-time localization of motherboards in production-line surveillance video using an improved SIFT + RANSAC algorithm
nblt/DLDR_imagenet
nblt/Flat-LoRA
Flat-LoRA: Low-Rank Adaptation over a Flat Loss Landscape
nblt/Graduation-project
• Study the relationship between neural network parameter distributions and model properties • Study the relationship between model robustness, privacy, and randomness • Use generative models to learn parameter distributions
nblt/nblt.github.io
AcadHomepage: A Modern and Responsive Academic Personal Homepage
nblt/Orthogonal-Multi-Path
Code for the paper "Learn Robust Features via Orthogonal Multi-Path" (https://arxiv.org/abs/2010.12190)
nblt/pytorch_resnet_cifar10
A proper implementation of ResNets for CIFAR-10/100 in PyTorch that matches the description in the original paper.
nblt/Tetris-AI
A Tetris AI that considers the next step
nblt/nblt_test.github.io
nblt/WeTS
A benchmark for the task of translation suggestion