f-dangel
Postdoc at the Vector Institute, Toronto. Interested in computing and using anything beyond the gradient for ML.
@ProbabilisticNumerics @VectorInstituteToronto
Pinned Repositories
backpack
BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient.
cockpit
Cockpit: A Practical Debugging Tool for Training Deep Neural Networks
curvlinops
SciPy linear operators for the Hessian, Fisher/GGN, and more in PyTorch
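The idea behind curvature linear operators is that a matrix like the Hessian never needs to be stored; only matrix-vector products are required. A minimal pure-Python sketch of this concept (not the curvlinops API, which builds exact autodiff operators for PyTorch models; `hvp` and `grad` are hypothetical names):

```python
def hvp(grad, x, v, h=1e-5):
    """Matrix-free Hessian-vector product via central differences of the
    gradient: H(x) v ~ (g(x + h*v) - g(x - h*v)) / (2*h).

    `grad` maps a point (list of floats) to the gradient at that point.
    This illustrates the "linear operator" view: H is only ever accessed
    through its action on vectors, never materialized as a matrix.
    """
    g_plus = grad([xi + h * vi for xi, vi in zip(x, v)])
    g_minus = grad([xi - h * vi for xi, vi in zip(x, v)])
    return [(a - b) / (2 * h) for a, b in zip(g_plus, g_minus)]
```

For f(x) = x0^2 + 3*x0*x1, the gradient is [2*x0 + 3*x1, 3*x0] and the Hessian is [[2, 3], [3, 0]], so multiplying by v = [1, 0] recovers the first Hessian column [2, 3].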
einconv
Convolutions and more as einsum for PyTorch
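Writing convolution as einsum means expressing it as an index contraction over an im2col-unfolded input. A conceptual pure-Python sketch of that view for the 1D case (not the einconv API, which operates on PyTorch tensors; `conv1d_as_contraction` is a hypothetical name):

```python
def conv1d_as_contraction(x, w):
    """1D cross-correlation as an explicit index contraction, mirroring
    the einsum formulation y = einsum("ik,k->i", unfolded(x), w).
    """
    out_len = len(x) - len(w) + 1
    # unfolded[i][k] = x[i + k]  (the im2col matrix of sliding windows)
    unfolded = [[x[i + k] for k in range(len(w))] for i in range(out_len)]
    # contract the kernel index k: y[i] = sum_k unfolded[i][k] * w[k]
    return [sum(u * wk for u, wk in zip(row, w)) for row in unfolded]
```

For example, `conv1d_as_contraction([1, 2, 3, 4], [1, 1])` yields `[3, 5, 7]`, the sums of adjacent pairs.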
hbp
Hessian backpropagation (HBP): PyTorch extension of backpropagation for block-diagonal curvature matrix approximations
phd-thesis
Source code for my PhD thesis: Backpropagation Beyond the Gradient
phd-thesis-template
LaTeX template for my PhD thesis at the University of Tuebingen
singd
[ICML 2024] SINGD: KFAC-like Structured Inverse-Free Natural Gradient Descent (http://arxiv.org/abs/2312.05705)
unfoldNd
(N=1,2,3)-dimensional unfold (im2col) and fold (col2im) in PyTorch
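Unfold (im2col) extracts sliding windows from the input; fold (col2im) sums overlapping windows back. A conceptual pure-Python sketch for the 1D case (not the unfoldNd API, which handles batched N-dimensional PyTorch tensors; `unfold1d`/`fold1d` are hypothetical names):

```python
def unfold1d(x, kernel_size, stride=1):
    """im2col for a 1D sequence: list all sliding windows."""
    num_windows = (len(x) - kernel_size) // stride + 1
    return [x[i * stride : i * stride + kernel_size] for i in range(num_windows)]

def fold1d(windows, output_size, kernel_size, stride=1):
    """col2im for a 1D sequence: sum overlapping windows back in place."""
    out = [0] * output_size
    for i, window in enumerate(windows):
        for k in range(kernel_size):
            out[i * stride + k] += window[k]
    return out
```

Note that fold is not the inverse of unfold: positions covered by several windows accumulate, e.g. `fold1d(unfold1d([1, 2, 3, 4], 2), 4, 2)` gives `[1, 4, 6, 4]`.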
vivit
[TMLR 2022] Curvature access through the generalized Gauss-Newton's low-rank structure: Eigenvalues, eigenvectors, directional derivatives & Newton steps
f-dangel's Repositories
f-dangel/backpack
BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient.
f-dangel/cockpit
Cockpit: A Practical Debugging Tool for Training Deep Neural Networks
f-dangel/unfoldNd
(N=1,2,3)-dimensional unfold (im2col) and fold (col2im) in PyTorch
f-dangel/phd-thesis
Source code for my PhD thesis: Backpropagation Beyond the Gradient
f-dangel/hbp
Hessian backpropagation (HBP): PyTorch extension of backpropagation for block-diagonal curvature matrix approximations
f-dangel/singd
[ICML 2024] SINGD: KFAC-like Structured Inverse-Free Natural Gradient Descent (http://arxiv.org/abs/2312.05705)
f-dangel/vivit
[TMLR 2022] Curvature access through the generalized Gauss-Newton's low-rank structure: Eigenvalues, eigenvectors, directional derivatives & Newton steps
f-dangel/curvlinops
SciPy linear operators for the Hessian, Fisher/GGN, and more in PyTorch
f-dangel/einconv
Convolutions and more as einsum for PyTorch
f-dangel/phd-thesis-template
LaTeX template for my PhD thesis at the University of Tuebingen
f-dangel/sirfshampoo
[ICML 2024] SIRFShampoo: Structured inverse- and root-free Shampoo in PyTorch (https://arxiv.org/abs/2402.03496, WIP)
f-dangel/backobs
Use DeepOBS with BackPACK
f-dangel/org-export-setup
My org-export settings
f-dangel/python-utilities
Python utility functions I often use
f-dangel/cockpit-experiments
Experiments for the NeurIPS 2021 paper "Cockpit: A Practical Debugging Tool for the Training of Deep Neural Networks"
f-dangel/DeepOBS
DeepOBS: A Deep Learning Optimizer Benchmark Suite
f-dangel/org-html-themes
How to export Org mode files into awesome HTML in 2 minutes
f-dangel/vivit-experiments
Experiments for the TMLR 2022 paper "ViViT: Curvature Access Through the Generalized Gauss-Newton's Low-rank Structure"
f-dangel/backpack-experiments
Experiment code for "BackPACK: Packing more into Backprop" [ICLR 2020]
f-dangel/bibliography-file
My global `.bib` file to store bibliographic records
f-dangel/PyHessian
PyHessian is a PyTorch library for second-order analysis and training of neural networks
f-dangel/PyTorchHessianFree
PyTorch implementation of the Hessian-free optimizer
f-dangel/StructuredNGD-DL
Matrix-multiplication-only KFAC; code for the ICML 2023 paper "Simplifying Momentum-based Positive-definite Submanifold Optimization with Applications to Deep Learning"