NaHenn's Stars
jettify/pytorch-optimizer
torch-optimizer -- collection of optimizers for PyTorch
goodfeli/dlbook_notation
LaTeX files for the Deep Learning book notation
hiroyuki-kasai/SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
google/spectral-density
Hessian spectral density estimation in TF and Jax
lixilinx/psgd_torch
PyTorch implementation of preconditioned stochastic gradient descent (affine group preconditioner, low-rank approximation preconditioner, and more)
LeoYu/neural-tangent-kernel-UCI
Testing the Neural Tangent Kernel (NTK) on small UCI datasets
BallisticLA/RandLAPACK
A high-performance C++ library for randomized numerical linear algebra
BallisticLA/parla
Python Algorithms for Randomized Linear Algebra
keskarnitish/minSQN
Optimization using stochastic quasi-Newton methods
timlautk/BCD-for-DNNs-PyTorch
Code for "Global Convergence of Block Coordinate Descent in Deep Learning" (ICML 2019)
OptMLGroup/SQN
Sampled Quasi-Newton Methods for Deep Learning
gowerrobert/StochOpt.jl
A suite of stochastic optimization methods for solving the empirical risk minimization problem.
dalab/subsampled_cubic_regularization
Source code for "Sub-sampled Cubic Regularization for Non-convex Optimization", JM Kohler, A Lucchi, https://arxiv.org/abs/1705.05933
BallisticLA/marla
Matlab Algorithms for Randomized Linear Algebra
git-xp/Non-Convex-Newton
leventsagun/hessian-for-basicDL
Calculates the Hessian matrix and/or its spectrum for simple neural nets
vojha-code/Multi-Output-Neural-Tree
A multi-output neural tree (MONT) algorithm for classification: an evolutionary learning algorithm trained with the non-dominated sorting genetic algorithm NSGA-III.
gowerrobert/StochOptMatlab
A suite of stochastic optimization methods, including the stochastic block BFGS method, for minimizing an average of functions (empirical risk minimization)
nathansiae/Stochastic-Average-Newton
zouzias/REK-C
Randomized Extended Kaczmarz (C)
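Several entries here (REK-C, preckacz) implement Kaczmarz-type row-action solvers for linear systems. For orientation, the basic randomized Kaczmarz iteration (the plain variant, not the extended method in REK-C) samples a row of A with probability proportional to its squared norm and projects the current iterate onto that row's hyperplane. A minimal NumPy sketch under those assumptions; the function name and defaults are my own:

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=3000, seed=0):
    """Approximately solve a consistent system Ax = b.

    Each step picks row i with probability ||a_i||^2 / ||A||_F^2
    and projects x onto the hyperplane {x : a_i @ x = b_i}.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.sum(A ** 2, axis=1)
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Orthogonal projection onto the i-th hyperplane.
        x += (b[i] - A[i] @ x) / row_norms_sq[i] * A[i]
    return x
```

For a consistent, well-conditioned system the iterates converge geometrically in expectation; the extended (REK) and preconditioned variants in the repos above address inconsistent systems and poor conditioning, respectively.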
amkatrutsa/preckacz
Implementation of the preconditioned Kaczmarz method https://arxiv.org/abs/1903.01806
sgoldt/nn2pp
Utilities to train two-layer neural networks
hfassold/omni_optimizer
A wrapper for several state-of-the-art adaptive-gradient optimizers, including my novel 'AdaFamily' algorithm
hfassold/nlp_finetuning_adafamily
Demonstrates finetuning an NLP model with the novel 'AdaFamily' optimizer and 'mini-batch trimming'
LIONS-EPFL/Subquadratic-Overparameterization
Code for "Subquadratic Overparameterization for Shallow Neural Networks" (NeurIPS 2021)
OxfordML/MUBs_TraceEstimators
A Python implementation of the MUBs stochastic trace estimator.
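For context, the standard baseline that stochastic trace estimators are compared against is Hutchinson's estimator, which averages quadratic forms z.T @ A @ z over random Rademacher (+/-1) probe vectors z, needing only matrix-vector products with A. A minimal NumPy sketch of that baseline (not the MUBs estimator itself; function name and defaults are my own):

```python
import numpy as np

def hutchinson_trace(matvec, n, num_samples=100, seed=0):
    """Estimate trace(A) given only a matvec z -> A @ z.

    E[z.T @ A @ z] = trace(A) when z has i.i.d. Rademacher entries.
    """
    rng = np.random.default_rng(seed)
    est = 0.0
    for _ in range(num_samples):
        z = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe
        est += z @ matvec(z)
    return est / num_samples
```

Note that for a diagonal A the estimator is exact for every probe (since z_i**2 = 1); the variance comes entirely from the off-diagonal entries, which is what structured probe sets like MUBs aim to reduce.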
stats285/Alpha
Alpha pattern for XYZW-based science.
vojha-code/Neural-Tree-Software
Software: The Adaptive Approximation Software Toolbox (Neural Tree Algorithm) is a function approximation and feature selection tool that uses genetic programming to construct a tree-like structure, an adaptive multi-layer perceptron. This standalone toolbox solves prediction problems; the algorithm is multiobjective and adaptively builds simple models with high generalization ability.
kiranchhatre/BEAM-Bayes-Opt
[LOD 2022] Parallel Bayesian Optimization of Multi-agent Systems
yutongLi1997/Similarity-Fusion-GRMF