inikishev's Stars
pytorch/pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
numpy/numpy
The fundamental package for scientific computing with Python.
scipy/scipy
SciPy library main repository
optuna/optuna
A hyperparameter optimization framework
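A minimal usage sketch of Optuna's study/objective loop; the objective, search space, and trial count below are illustrative placeholders, not anything prescribed by the library:

```python
import optuna

def objective(trial):
    # Suggest a learning rate on a log scale and a layer width.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    width = trial.suggest_int("width", 16, 256)
    # Stand-in for a real training run: any float to minimize works here.
    return (lr - 1e-3) ** 2 + (width - 128) ** 2 / 1e4

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```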
facebookresearch/nevergrad
A Python toolbox for performing gradient-free optimization
CPJKU/madmom
Python audio and music signal processing library
facebookresearch/optimizers
For optimization algorithm research and development.
brain-research/guided-evolutionary-strategies
Guided Evolutionary Strategies
TorchJD/torchjd
Library for Jacobian descent with PyTorch. It enables optimization of neural networks with multiple losses (e.g. multi-task learning).
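A generic illustration of the core idea in plain PyTorch autograd (not torchjd's actual API): compute one gradient per loss and aggregate the rows of the Jacobian, rather than backpropagating a single summed loss:

```python
import torch

# Toy two-task setup: one gradient per loss instead of one summed loss.
model = torch.nn.Linear(4, 2)
x, y = torch.randn(8, 4), torch.randn(8, 2)
out = model(x)
losses = [torch.nn.functional.mse_loss(out[:, 0], y[:, 0]),
          torch.nn.functional.mse_loss(out[:, 1], y[:, 1])]

params = list(model.parameters())
per_loss_grads = [torch.autograd.grad(l, params, retain_graph=True) for l in losses]

with torch.no_grad():
    for p, *gs in zip(params, *per_loss_grads):
        # Mean aggregation; torchjd's point is to offer smarter aggregators
        # that handle conflicting gradient directions between losses.
        p -= 1e-2 * torch.stack(gs).mean(dim=0)
```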
lixilinx/psgd_torch
PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioners, low-rank approximation preconditioner, and more)
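A toy sketch of the general preconditioning idea, assuming only a diagonal preconditioner (nothing like this repo's Kron/affine machinery): rescale each gradient coordinate by a running estimate of its second moment:

```python
import torch

def precond_sgd_step(param, grad, state, lr=1e-2, beta=0.999, eps=1e-8):
    # state tracks E[g^2]; P ~ diag(1 / sqrt(E[g^2])) preconditions the step.
    state.mul_(beta).addcmul_(grad, grad, value=1 - beta)
    param.sub_(lr * grad / (state.sqrt() + eps))

w = torch.zeros(3)
state = torch.zeros_like(w)
for _ in range(100):
    g = 2 * (w - torch.tensor([1.0, -2.0, 0.5]))  # gradient of ||w - target||^2
    precond_sgd_step(w, g, state)
```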
ClashLuke/HeavyBall
Efficient optimizers
CPJKU/beat_this
Accurate and general beat tracker
tuero/perturbations-differential-pytorch
Differentiable Optimizers with Perturbations in PyTorch
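A compact sketch of the perturbed-argmax trick (Berthet et al., 2020) underlying this kind of library; the class below is a hypothetical minimal version, not the repo's API. The forward pass is a Monte Carlo average of hard argmaxes under Gaussian noise; the backward pass uses the score-function estimate of the Jacobian, E[onehot ⊗ Z] / σ:

```python
import torch

class PerturbedArgmax(torch.autograd.Function):
    @staticmethod
    def forward(ctx, theta, sigma=0.1, n_samples=100):
        # theta: (batch, d). Sample noise, take hard argmax per sample.
        Z = torch.randn(n_samples, *theta.shape, device=theta.device)
        perturbed = theta.unsqueeze(0) + sigma * Z
        onehots = torch.nn.functional.one_hot(
            perturbed.argmax(dim=-1), theta.shape[-1]
        ).to(theta.dtype)                      # (n, batch, d)
        ctx.save_for_backward(onehots, Z)
        ctx.sigma = sigma
        return onehots.mean(dim=0)             # smoothed argmax, (batch, d)

    @staticmethod
    def backward(ctx, grad_output):
        onehots, Z = ctx.saved_tensors
        n = Z.shape[0]
        # grad_theta = J^T v with J ~ E[onehot Z^T] / sigma, estimated
        # sample-by-sample: contract each one-hot with the upstream grad.
        g = torch.einsum('nbd,bd->nb', onehots, grad_output)
        grad_theta = torch.einsum('nb,nbd->bd', g, Z) / (n * ctx.sigma)
        return grad_theta, None, None

theta = torch.randn(2, 5, requires_grad=True)
y = PerturbedArgmax.apply(theta)
y.sum().backward()  # theta.grad now carries a smoothed Jacobian-vector product
```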
PiotrDabkowski/torchpwl
Piecewise Linear Functions (PWL) implementation in PyTorch
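A minimal sketch of one way to build a learnable piecewise linear function, as a sum of shifted ReLUs (not torchpwl's implementation; breakpoint placement and initialization here are arbitrary):

```python
import torch

class SimplePWL(torch.nn.Module):
    """A continuous PWL map of one input: f(x) = b + sum_k s_k * relu(x - c_k).
    Each breakpoint c_k contributes one kink with slope change s_k."""
    def __init__(self, n_breakpoints=8, x_min=-3.0, x_max=3.0):
        super().__init__()
        self.breaks = torch.nn.Parameter(torch.linspace(x_min, x_max, n_breakpoints))
        self.slopes = torch.nn.Parameter(torch.zeros(n_breakpoints))
        self.bias = torch.nn.Parameter(torch.zeros(1))

    def forward(self, x):
        # x: (N,) -> broadcast against breakpoints -> (N,)
        return self.bias + (self.slopes * torch.relu(x.unsqueeze(-1) - self.breaks)).sum(-1)
```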
lessw2020/FAdam_PyTorch
An implementation of FAdam (Fisher Adam) in PyTorch
konstmish/opt_methods
Benchmarking optimization methods on convex problems.
TheMody/No-learning-rates-needed-Introducing-SALSA-Stable-Armijo-Line-Search-Adaptation
SALSA optimizer implementation (no learning rates needed)
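A sketch of the classic backtracking Armijo rule that SALSA builds on; the paper's contribution is a stable, stochastic-friendly adaptation, whereas this is just the textbook version with illustrative constants:

```python
import torch

def armijo_step(f, x, grad, alpha0=1.0, c=1e-4, shrink=0.5, max_iter=20):
    # Shrink the step size until the sufficient-decrease condition
    # f(x - a*g) <= f(x) - c * a * ||g||^2 holds.
    fx = f(x)
    g_norm_sq = grad.pow(2).sum()
    alpha = alpha0
    for _ in range(max_iter):
        if f(x - alpha * grad) <= fx - c * alpha * g_norm_sq:
            break
        alpha *= shrink
    return x - alpha * grad

# Example: one step on f(x) = ||x||^2.
f = lambda x: x.pow(2).sum()
x = torch.tensor([3.0, -4.0])
x_new = armijo_step(f, x, grad=2 * x)
```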
UbiquitousLearning/Backpropagation_Free_Training_Survey
f-dangel/sirfshampoo
[ICML 2024] SIRFShampoo: Structured inverse- and root-free Shampoo in PyTorch (https://arxiv.org/abs/2402.03496)
MathIsAll/ZO-AdaMU
A PyTorch implementation of ZO-AdaMU optimization: adapting perturbation with momentum and uncertainty in zeroth-order optimization.
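The building block shared by zeroth-order methods like this one is the gradient estimator built from function evaluations alone; below is the generic two-point Gaussian estimator, not the paper's momentum/uncertainty-adapted sampler:

```python
import torch

def zo_gradient(f, x, mu=1e-3, n_samples=16):
    # Average directional-derivative estimates over random directions:
    # g ~ E[ (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u ], no autograd needed.
    g = torch.zeros_like(x)
    for _ in range(n_samples):
        u = torch.randn_like(x)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n_samples
```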
BaoWangMath/DP-LSSGD
kad99kev/PyBoxCar
BoxCar2D implementation in Python.
EliaFantini/ZO-AdaMM-vs-FO-AdaMM-convergence-and-minima-shape-comparison
Implementation and comparison of zeroth-order vs. first-order methods on the AdaMM (a.k.a. AMSGrad) optimizer: analysis of convergence rates and minima shape
Jvictormata/adasub
Ivonne320/ZO-SGD_vs_FO-SGD
0x4249/NNAIF
Python code for running the numerical experiments in the paper "Neural Network Accelerated Implicit Filtering: Integrating Neural Network Surrogates With Provably Convergent Derivative Free Optimization Methods" by Brian Irwin, Eldad Haber, Raviv Gal, and Avi Ziv.
sukiboo/smoothing_based_optimization
Implementation of smoothing-based optimization algorithms
andreakiro/zeroptim
Zeroth-order optimization in deep learning
bumsu-kim/CARS_Refactored
Curvature-Aware Random Search
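A sketch of one CARS-style step: estimate first and second directional derivatives along a random direction from three function evaluations, then take the 1D Newton step along that direction. Safeguards (e.g., the acceptance threshold) are simplified relative to the paper:

```python
import torch

def cars_step(f, x, h=1e-2):
    u = torch.randn_like(x)
    u /= u.norm()
    fp, fx, fm = f(x + h * u), f(x), f(x - h * u)
    d1 = (fp - fm) / (2 * h)           # directional derivative along u
    d2 = (fp - 2 * fx + fm) / h**2     # directional curvature along u
    if d2 > 0:
        candidate = x - (d1 / d2) * u  # 1D Newton step along u
        if f(candidate) < fx:          # keep it only if it actually helps
            return candidate
    return x
```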
CorgiTaco/Feature-Recycler
Fixes the "Feature order cycle" error that occurs when various mods add placed features in different orders between their biomes.