loss-landscape
There are 15 repositories under the loss-landscape topic.
xxxnell/how-do-vits-work
(ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
logancyang/loss-landscape-anim
Create animations for the optimization trajectory of neural nets
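A typical way to animate a trajectory through a weight space with millions of dimensions is to flatten one weight snapshot per training step and project the whole path onto its top-2 PCA directions. A NumPy sketch of that projection step (the repo's actual pipeline may differ; `snapshots` is an assumed list of flattened weight vectors collected during training):

```python
import numpy as np

def project_trajectory(snapshots):
    """Project a list of 1-D weight snapshots onto their top-2 PCA plane."""
    W = np.stack(snapshots)                # [steps, n_params]
    W_centered = W - W.mean(axis=0)
    # Top-2 right singular vectors span the plane of maximum variance
    # along the trajectory.
    _, _, vt = np.linalg.svd(W_centered, full_matrices=False)
    return W_centered @ vt[:2].T           # [steps, 2] plot coordinates
```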
ayulockin/LossLandscape
Explores the ideas presented in "Deep Ensembles: A Loss Landscape Perspective" (https://arxiv.org/abs/1912.02757) by Stanislav Fort, Huiyi Hu, and Balaji Lakshminarayanan.
sayakpaul/Sharpness-Aware-Minimization-TensorFlow
Implements sharpness-aware minimization (https://arxiv.org/abs/2010.01412) in TensorFlow 2.
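For orientation, the core of SAM is a two-step update: ascend to the approximate worst-case weights within a rho-ball, then apply the gradient measured there to the original weights. A minimal TensorFlow 2 sketch of one training step (not this repo's API; `model`, `optimizer`, `loss_fn`, and the batch `(x, y)` are assumed):

```python
import tensorflow as tf

rho = 0.05  # neighborhood radius; the paper's default, not a tuned value

def sam_train_step(model, optimizer, loss_fn, x, y):
    # 1) Gradient of the loss at the current weights w.
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x, training=True))
    grads = tape.gradient(loss, model.trainable_variables)

    # 2) Climb to the approximate worst point in the rho-ball:
    #    eps = rho * g / ||g||, then w <- w + eps.
    grad_norm = tf.linalg.global_norm(grads) + 1e-12
    eps = [g * (rho / grad_norm) for g in grads]
    for v, e in zip(model.trainable_variables, eps):
        v.assign_add(e)

    # 3) Gradient at the perturbed weights w + eps.
    with tf.GradientTape() as tape:
        loss_adv = loss_fn(y, model(x, training=True))
    sam_grads = tape.gradient(loss_adv, model.trainable_variables)

    # 4) Undo the perturbation and update w with the perturbed gradient.
    for v, e in zip(model.trainable_variables, eps):
        v.assign_sub(e)
    optimizer.apply_gradients(zip(sam_grads, model.trainable_variables))
    return loss
```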
shamsbasir/investigating_mitigating_failure_modes_in_pinns
This repository contains the code and models for the paper "Investigating and Mitigating Failure Modes in Physics-informed Neural Networks (PINNs)".
VITA-Group/LTH-Pass
[TMLR] "Can You Win Everything with Lottery Ticket?" by Tianlong Chen, Zhenyu Zhang, Jun Wu, Randy Huang, Sijia Liu, Shiyu Chang, Zhangyang Wang
gg-dema/Git_merge
Analytic solution to the git-merge algorithm for neural networks, derived from "Git Re-Basin: Merging Models modulo Permutation Symmetries".
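The core idea: hidden units can be permuted without changing a network's function, so before averaging two independently trained models you first search for the permutation that best aligns their units. A toy weight-matching sketch for a single MLP hidden layer (illustrative names, not this repo's code; SciPy solves the assignment problem):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def permute_and_merge(wa_in, wa_out, wb_in, wb_out):
    """Align model B's hidden units to model A's, then average.

    wa_in/wb_in: [hidden, in] incoming weights; wa_out/wb_out: [out, hidden].
    """
    # Similarity between hidden unit i of A and unit j of B, using both
    # the incoming and outgoing weights that touch each unit.
    sim = wa_in @ wb_in.T + wa_out.T @ wb_out
    # Permutation of B's units maximizing total similarity.
    _, col = linear_sum_assignment(-sim)
    # Apply it to B (rows of w_in, columns of w_out) ...
    wb_in_p, wb_out_p = wb_in[col], wb_out[:, col]
    # ... then average: after alignment the midpoint tends to stay in a
    # low-loss basin, which is the point of the Re-Basin paper.
    return 0.5 * (wa_in + wb_in_p), 0.5 * (wa_out + wb_out_p)
```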
isadrtdinov/understanding-large-lrs
Source code for the NeurIPS 2024 paper "Where Do Large Learning Rates Lead Us?"
sungyoon-lee/LossLandscapeMatters
[NeurIPS 2021] Towards Better Understanding of Training Certifiably Robust Models against Adversarial Examples | ⛰️⚠️
fanghenshaometeor/ood-mode-ensemble
[Int. J. Comput. Vis. 2024] Revisiting Deep Ensemble for Out-of-Distribution Detection: A Loss Landscape Perspective
HJHGJGHHG/Optimizer-papers
Worth-reading papers and related resources on deep learning optimization algorithms.
mortfer/keras-gsam
Surrogate Gap Guided Sharpness-Aware Minimization (GSAM) implementation for Keras/TensorFlow 2.
pxl-th/Yama.jl
Visualize the loss landscape of neural networks in Julia.
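Most such visualizers share the same recipe from "Visualizing the Loss Landscape of Neural Nets" (https://arxiv.org/abs/1712.09913): perturb the trained weights along one or two random directions, rescaled so each layer's perturbation is proportional to its weight norm, and evaluate the loss on a grid. A PyTorch sketch of the 1-D case (assumed names throughout; Yama.jl itself is Julia, and the paper uses finer-grained filter-wise normalization):

```python
import copy
import torch

def loss_slice(model, loss_fn, x, y, steps=41, span=1.0):
    """Loss along one random, norm-matched direction in weight space."""
    base = copy.deepcopy(model.state_dict())
    direction = {}
    for k, v in base.items():
        if v.dtype.is_floating_point:
            d = torch.randn_like(v)
            # Rescale so the direction's norm matches each tensor's norm
            # (a layer-wise stand-in for filter normalization).
            direction[k] = d * (v.norm() / (d.norm() + 1e-10))
    losses = []
    for alpha in torch.linspace(-span, span, steps):
        perturbed = {k: v + alpha * direction[k] if k in direction else v
                     for k, v in base.items()}
        model.load_state_dict(perturbed)
        with torch.no_grad():
            losses.append(loss_fn(model(x), y).item())
    model.load_state_dict(base)  # restore the trained weights
    return losses
```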
francesco-innocenti/pc-saddles
Code for NeurIPS 2024 paper "Only Strict Saddles in the Energy Landscape of Predictive Coding Networks?"
HuanranLi/Grokking-in-Transformer
This project builds on recent research into the grokking phenomenon. The goal is to investigate when, why, and how grokking occurs, focusing on transformers under various batch sizes.