second-order-optimization
There are 42 repositories under the second-order-optimization topic.
amirgholami/PyHessian
PyHessian is a PyTorch library for second-order-based analysis and training of neural networks.
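For orientation, a minimal sketch of the analysis workflow following the pattern in the project's README; the tiny model and random batch are stand-ins, and the exact signatures should be checked against your installed version.

```python
import torch
import torch.nn as nn
from pyhessian import hessian  # pip install pyhessian

# Tiny stand-in model and batch, only to keep the sketch self-contained.
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
criterion = nn.CrossEntropyLoss()
inputs, targets = torch.randn(32, 10), torch.randint(0, 2, (32,))

# PyHessian wraps (model, loss, batch) and exposes Hessian analyses.
hessian_comp = hessian(model, criterion, data=(inputs, targets), cuda=False)
top_eigenvalues, top_eigenvectors = hessian_comp.eigenvalues(top_n=2)
trace_estimates = hessian_comp.trace()  # Hutchinson estimates of tr(H)
print(top_eigenvalues, sum(trace_estimates) / len(trace_estimates))
```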
amirgholami/adahessian
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
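A hedged usage sketch: the module and class names below follow the repo's image-classification example (an assumption; adjust the import to your checkout). The one detail worth highlighting is `create_graph=True`, which AdaHessian needs in order to form Hutchinson estimates of the Hessian diagonal.

```python
import torch
import torch.nn as nn
from optim_adahessian import Adahessian  # module/class as in the repo's examples (assumption)

model = nn.Linear(10, 1)
optimizer = Adahessian(model.parameters(), lr=0.15)

x, y = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(10):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    # create_graph=True is required: the Hutchinson Hessian-diagonal
    # estimate differentiates through the gradient.
    loss.backward(create_graph=True)
    optimizer.step()
```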
EmbersArc/Epigraph
A C++ interface to formulate and solve linear, quadratic, and second-order cone problems.
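For reference, the standard form of the second-order cone programs such an interface targets, with linear and convex quadratic programs recoverable as special cases:

```latex
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & c^\top x \\
\text{s.t.} \quad & \lVert A_i x + b_i \rVert_2 \le c_i^\top x + d_i, \qquad i = 1, \dots, m.
\end{aligned}
```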
lixilinx/psgd_torch
PyTorch implementation of preconditioned stochastic gradient descent (affine group preconditioner, low-rank approximation preconditioner, and more).
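For context, PSGD fits its preconditioner P online from pairs of parameter perturbations δθ and the resulting gradient changes δg; as described in the accompanying papers (my reading, stated with hedging), the fitting criterion and its optimality condition are

```latex
c(P) = \mathbb{E}\left[\, \delta g^\top P\, \delta g + \delta\theta^\top P^{-1} \delta\theta \,\right],
\qquad
P\,\mathbb{E}[\delta g\, \delta g^\top]\, P = \mathbb{E}[\delta\theta\, \delta\theta^\top]
\ \text{at the minimum,}
```

a secant-equation-like condition in expectation.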
gpauloski/kfac-pytorch
Distributed K-FAC Preconditioner for PyTorch
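A hedged sketch of how the preconditioner slots into an ordinary training loop, following the pattern in the project's README (the `KFACPreconditioner` entry point is taken from there; verify against your installed version):

```python
import torch
import torch.nn as nn
from kfac.preconditioner import KFACPreconditioner  # entry point per the README (assumption)

model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
preconditioner = KFACPreconditioner(model)

x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
for _ in range(10):
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    preconditioner.step()  # rescale gradients with the K-FAC approximation
    optimizer.step()       # then take the usual first-order step
```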
CharlieDinh/FEDL
FEDL: a federated learning algorithm implemented in TensorFlow (Transactions on Networking 2021).
CharlieDinh/FEDL_pytorch
This repository implements FEDL using PyTorch.
lixilinx/psgd_tf
TensorFlow implementation of preconditioned stochastic gradient descent.
ltatzel/PyTorchHessianFree
PyTorch implementation of the Hessian-free optimizer
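The primitive underneath Hessian-free optimization is the Hessian-vector product, computed by double backprop (Pearlmutter's trick) and fed to conjugate gradient so the Hessian is never materialized. A minimal, self-contained PyTorch illustration of that primitive (generic autograd code, not this repository's API):

```python
import torch

theta = torch.randn(5, requires_grad=True)
loss = (theta ** 2).sum() + theta.prod()

# First backward pass keeps the graph so g is itself differentiable.
(g,) = torch.autograd.grad(loss, theta, create_graph=True)

# H @ v = d(g . v)/d(theta): a second backward pass, no explicit Hessian.
v = torch.randn(5)
(hv,) = torch.autograd.grad(g @ v, theta)
print(hv)
```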
tomoleary/hessianlearn
Hessian-based stochastic optimization in TensorFlow and Keras.
kenshi84/compatible-intrinsic-triangulations
Compatible Intrinsic Triangulations (SIGGRAPH 2022)
OPTAMI/OPTAMI
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
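Given that claim, usage should look like any closure-style PyTorch optimizer; in the sketch below, `SomeOptamiMethod` is a hypothetical stand-in for one of the package's optimizers, not a real class name:

```python
import torch
import torch.nn as nn
from OPTAMI import SomeOptamiMethod  # hypothetical stand-in class name

model = nn.Linear(10, 1)
optimizer = SomeOptamiMethod(model.parameters())
x, y = torch.randn(32, 10), torch.randn(32, 1)

for _ in range(10):
    def closure():
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        return loss
    # Higher-order methods typically re-evaluate the loss inside the step,
    # hence the standard PyTorch closure convention.
    optimizer.step(closure)
```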
CharlieDinh/DONE
DONE: a second-order method for federated learning, implemented in PyTorch (IEEE Transactions on Parallel and Distributed Systems 2022).
evanatyourservice/kron_torch
An implementation of the PSGD Kron second-order optimizer for PyTorch.
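A usage sketch under the assumption that the package exports a `Kron` optimizer class with the standard PyTorch interface (name inferred from the repo; check its README):

```python
import torch
import torch.nn as nn
from kron_torch import Kron  # assumed export; verify against the README

model = nn.Linear(10, 1)
optimizer = Kron(model.parameters(), lr=3e-4)
x, y = torch.randn(32, 10), torch.randn(32, 1)

for _ in range(10):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()  # gradients preconditioned with Kronecker-factored P
```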
AntoinePassemiers/Beyond-Gradient-Descent
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes.
evanatyourservice/psgd_jax
Implementation of the PSGD optimizer in JAX.
jmdvinodjmd/LIBS2ML
LIBS2ML: A Library for Scalable Second Order Machine Learning Algorithms
hiroyuki-kasai/Subsampled-RTR
Subsampled Riemannian trust-region (RTR) algorithms
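For context, at each iterate x_k a Riemannian trust-region method solves a subproblem on the tangent space and maps the step back with a retraction; subsampled variants replace the gradient and Hessian with minibatch estimates:

```latex
\min_{\eta \in T_{x_k}\mathcal{M},\ \lVert\eta\rVert_{x_k} \le \Delta_k}
\; f(x_k)
+ \langle \operatorname{grad} f(x_k),\, \eta \rangle_{x_k}
+ \tfrac{1}{2} \langle \operatorname{Hess} f(x_k)[\eta],\, \eta \rangle_{x_k},
\qquad x_{k+1} = R_{x_k}(\eta^\star).
```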
tomoleary/MatrixFreeNewton.jl
Prototyping of matrix-free Newton methods in Julia.
gmatilde/SGN
An efficient and easy-to-use Theano implementation of the stochastic Gauss-Newton method for training deep neural networks.
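As a reminder of the curvature object involved: the Gauss-Newton matrix drops the second-derivative term of the network and stays positive semidefinite whenever the loss is convex in the network output, unlike the full Hessian. One common way to write the minibatch version:

```latex
G(\theta) \;=\; \frac{1}{|B|} \sum_{(x,\, y) \in B}
J_\theta f(x;\theta)^\top \,
\left. \nabla_z^2\, \ell(z, y) \right|_{z = f(x;\theta)} \,
J_\theta f(x;\theta).
```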
hsivan/fosi
FOSI: a library for improving first-order optimizers with second-order information.
dahyun-kang/newoptpy
Newton's second-order optimization methods in Python.
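For reference, the textbook iteration such packages implement, as a self-contained NumPy sketch (not this repository's API); it minimizes f(t) = t_0^4 + t_0^2 + t_1^2:

```python
import numpy as np

def grad(t):
    return np.array([4 * t[0] ** 3 + 2 * t[0], 2 * t[1]])

def hess(t):
    return np.array([[12 * t[0] ** 2 + 2, 0.0], [0.0, 2.0]])

t = np.array([1.0, -2.0])
for _ in range(10):
    # Newton step: solve H d = g rather than inverting H explicitly.
    t = t - np.linalg.solve(hess(t), grad(t))
print(t)  # approaches the minimizer at the origin
```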
IST-DASLab/EFCP
The repository contains code to reproduce the experiments from the paper "Error Feedback Can Accurately Compress Preconditioners".
qiuweili/altmin
Second-Order Convergence of Alternating Minimizations
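For reference, the scheme whose second-order convergence the title refers to: starting from (x_0, y_0), alternate exact minimization over each block,

```latex
x_{k+1} = \arg\min_{x} f(x,\, y_k), \qquad
y_{k+1} = \arg\min_{y} f(x_{k+1},\, y).
```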
yorkerlin/StructuredNGD-DL
Matrix-multiplication-only KFAC; code for the ICML 2023 paper "Simplifying Momentum-based Positive-definite Submanifold Optimization with Applications to Deep Learning".
yangorwell/NGPlus
NG+: A new second-order optimizer for deep learning
cor3bit/somax
Stochastic Second-Order Methods in JAX
evanatyourservice/flat-sophia
The Sophia optimizer, further projected toward flat areas of the loss landscape.
evanatyourservice/ordax
A collection of second-order optimizers and experiments in JAX
sayarghoshroy/Optimization_and_Learning
Concepts and algorithms in core learning theory
aniruddhavpatil/second-order-model-selection
Regularization, Bayesian model selection, and model selection via k-fold cross-validation.
cor3bit/awesome-soms
A curated list of resources for second-order stochastic optimization
riccardocadei/adahessian
Discussion of the advantages and disadvantages of AdaHessian, a state-of-the-art second-order method, relative to first-order methods on a non-convex optimization problem (digit classification on MNIST using ResNet18). @ EPFL
vinnik-dmitry07/full-batch
Super-Convergence on CIFAR10